Q: Question about a differentiable function at point $a$. Let $f$ be differentiable at the point $a$. Prove that if $\lim\limits_{n \to \infty}x_n = a^{+}$ and $\lim\limits_{n \to \infty}y_n = a^{-}$ then $$\lim \limits_{n \to \infty} \frac{ f(x_n) - f(y_n)}{x_n - y_n} = f'(a)$$ I thought about proving this using Lagrange's theorem, but I didn't know where to go with it, because the question involves a limit at a single point, not an interval. Are there any ideas how to solve this question? A: Since $f$ is differentiable at $a$, $$f(x_n)=f(a)+(x_n-a)f'(a)+(x_n-a)\epsilon_1(x_n)$$ where $\epsilon_1(x_n)\xrightarrow{n\to\infty}0$, and similarly $$f(y_n)=f(a)+(y_n-a)f'(a)+(y_n-a)\epsilon_2(y_n)$$ where $\epsilon_2(y_n)\xrightarrow{n\to\infty}0$. Subtracting the two equalities, we get $$f(x_n)-f(y_n)=(x_n-y_n)f'(a)+\underbrace{(x_n-a)\epsilon_1(x_n)-(y_n-a)\epsilon_2(y_n)}_{=R_n}$$ $$R_n=(x_n-y_n)\epsilon_1(x_n)+(y_n-a)(\epsilon_1(x_n)-\epsilon_2(y_n))$$ and notice that $$0\le a-y_n=\underbrace{(a-x_n)}_{\le0}+(x_n-y_n)\le x_n-y_n$$ Can you take it from here? A: Using algebraic manipulation we may write $$\begin{align}\frac{f(x_n) - f(y_n)}{x_n - y_n} &= \frac{[f(x_n) - f(a)] - [f(y_n) - f(a)]}{x_n - y_n} = \frac{f(x_n) - f(a)}{x_n-y_n} - \frac{f(y_n) - f(a)}{x_n - y_n}\\&=\underbrace{\Bigg(\frac{x_n - a}{x_n - y_n}\Bigg)}_{t_n}\frac{f(x_n) - f(a)}{x_n-a} + \Bigg(\frac{(x_n - y_n) -(x_n - a)}{x_n - y_n}\Bigg)\frac{f(y_n) - f(a)}{y_n-a}\\&=t_n\underbrace{\frac{f(x_n) - f(a)}{x_n-a}}_{\to f'(a)} + \Big(1 - t_n\Big)\underbrace{\frac{f(y_n) - f(a)}{y_n-a}}_{\to f'(a)} \end{align}$$ Notice that $$y_n < a \Rightarrow x_n - a < x_n - y_n \Rightarrow \frac{x_n - a}{x_n - y_n} < 1$$ and, since $x_n > a$ gives $t_n > 0$, the sequence $\{t_n\}$ is bounded with $0 < t_n < 1$. 
Then $$\lim_{n \to \infty}\frac{f(x_n) - f(y_n)}{x_n - y_n} = \lim_{n \to \infty}t_n\underbrace{\frac{f(x_n) - f(a)}{x_n-a}}_{\to f'(a)} + \Big(1 - t_n\Big)\underbrace{\frac{f(y_n) - f(a)}{y_n-a}}_{\to f'(a)} = f'(a)$$
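Aside: a quick numerical sanity check of the result (not part of either proof). The two-sided difference quotient should approach $f'(a)$ even when $x_n$ and $y_n$ converge to $a$ at different rates. Taking $f = \exp$ and $a = 1$, so $f'(a) = e$, with $x_n = a + 1/n$ and $y_n = a - 1/n^2$:

```python
import math

def two_sided_quotient(f, a, n):
    # x_n -> a from above and y_n -> a from below, at different rates
    x = a + 1.0 / n
    y = a - 1.0 / n**2
    return (f(x) - f(y)) / (x - y)

a = 1.0
for n in (10, 100, 10_000):
    q = two_sided_quotient(math.exp, a, n)
    print(n, q)
# The quotient approaches f'(1) = e ≈ 2.71828 as n grows.
```

The same check works for any differentiable $f$ and any pair of sequences approaching $a$ from opposite sides.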
Facebook mobile advertising will be a game changer
Christopher Reynolds | March 18, 2012
Last month saw social network giant Facebook finally announce its entry into mobile advertising, causing a mixture of intrigue, relief and anger. Intrigue because the last thing the mobile ad industry seems to need is more inventory, relief because according to Facebook's IPO filing mobile represents an investment strategy risk, and anger because, well, that's just the default reaction of most Facebook users to any new feature on the social network. Facebook's mobile ads tie into its new Premium ads solution, which allows brands to promote themselves directly in the user's news feed, as opposed to using traditional ads that display on the right-hand side of the page. This solves Facebook's main problem of ads clogging up real estate on smaller mobile screens, and also gives brands a much more visible and engaging channel. But will Facebook be able to replicate its online success in the mobile space? It didn't take long for Zuckerberg and Co to seriously shake up the traditional online ad space, going from virtually nothing to one of the biggest ad networks on the internet in the space of just a few years. According to eMarketer, the company is set to grab the largest share of the online display market in 2012 for the second year running. So it's no surprise that the mobile ad industry is watching Facebook very closely. Will Facebook mobile ads be a game changer?
More inventory
As we mentioned above, the mobile ad industry is facing something of a demand-side crisis. Apps and mobile sites are being pumped out at a tremendous rate, but advertisers are still somewhat coy on mobile spend. So in the short term, Facebook's entry into the mobile ad world is sure to drive down prices for advertisers across other mobile ad networks. Hopefully this will provide the impetus for more advertisers to get on board with mobile, and for other ad networks and mobile ad tech companies to up their game. 
Location, Location
Needless to say, Facebook offers an incredible level of demographic targeting when it comes to mobile, and this will surely mean that other mobile ad platforms will need to innovate in order to keep up. But Facebook's most exciting targeting features will revolve around hyper-local ads. Borrell Associates is predicting local mobile ad spend to increase to $3.1 billion in 2013, and it doesn't take long to imagine the possibilities a platform like Facebook opens up. Brands could offer incentivised check-ins at local branches ('check in with four friends at a local restaurant and get a discount'), send hyper-local offers once a user checks in to a shopping centre, and push local traffic to under-performing branches by offering branch-specific deals or coupons. Facebook has the critical mass to make hyper-local mobile promotions work a treat. In such a fragmented space, Facebook's mobile ecosystem offers a neatly unified solution for advertisers to drive engagement. It has everything from apps to Places, as well as Events, mobile profiles, virtual currency and push notifications. Not to mention the fact that users could seamlessly continue an ad experience from their mobile to their desktop and vice versa, with each experience able to leverage its respective strengths. The biggest threat to Facebook's mobile offering in the short term will be user antipathy toward ads in their news feed. But according to some sources, Facebook's Premium ads are already seeing click-through rates 5 to 10 times higher. In the long term, Facebook may find it needs to strike a careful balance as ads become more location-based and integrated with other aspects of the social network. Like Google, it needs to watch the "creepiness level". But either way, it's hard to see Facebook's entry into mobile ads being anything other than a success, and indeed a game changer. 
Ukraine's 2012 corn output may fall due to unfavourable weather Due to unfavourable weather, Ukraine's 2012 corn yield is likely to fall by 1.0-1.2 tonnes per hectare against 2011, a senior weather forecaster said on Monday (July 16). Ukrainian farms sowed a record 4.6 million hectares for the 2012 corn harvest but the output is unlikely to reach last year's 22.7 million tonnes due to a severe drought which hit the leading corn-producing areas. "After last week's torrential rains in eastern, southern and central regions we can say that the soil drought has stopped," Mykola Kulbida, head of the state weather forecasting centre, told a news conference. "But a complex of negative factors has already affected the yield of corn which could be by 1.0-1.2 tonnes per hectare lower." Kulbida gave no forecast for the 2012 corn harvest while Tetyana Adamenko, head of the centre's agriculture department, told Reuters this month that the harvest could total 20-21 million tonnes. She also said the centre had lowered its outlook for the 2012 grain harvest to 42 million tonnes from a previous estimate of 44 million because of a smaller-than-expected corn crop. Forecasters said that poor weather could reduce the Ukrainian wheat harvest to about 12-14 million tonnes from 22.3 million in 2011. The Ukrainian government has said the 2012 harvest could fall to 45.3 million tonnes from a record 56.7 million tonnes in 2011 due to a drought during the sowing and severe frosts during the wintering. The Agriculture Ministry last week cut its outlook for the former Soviet republic's grain exports in the current season to 20 million tonnes from 22-23 million tonnes. Ukraine, which consumes 26 million tonnes of grain per season, exported 21.8 million tonnes of grain in 2011-12. 
is a retired Japanese female volleyball player. She was part of the Japan women's national volleyball team at the 1998 FIVB Volleyball Women's World Championship in Japan.
External links: http://www.alamy.com/stock-photo-chinas-wang-lina-15-spikes-the-ball-to-score-a-point-past-blocks-by-118604875.html
Categories: 1976 births; Living people; Japanese women's volleyball players; Place of birth missing (living people); Asian Games medalists in volleyball; Volleyball players at the 1998 Asian Games; Medalists at the 1998 Asian Games; Asian Games bronze medalists for Japan
Raw materials are the feedstock for finished products, so it is crucial to source these inputs only from certified vendors, and as a reliable name, we do exactly that. As a trusted name, we conduct quality tests at all stages to make sure that our offered range is defect-free. We make available the Radiolucent Operating Table with Stand, which is known for its durability. Packaging details: 59 x 29 x 33 = 155 kg; 32 x 24 x 10 = 20 kg; packed in a high-quality corrugated box and placed on a wooden tray. TYCO 8006 (electro-hydraulic O.T. table with longitudinal top slide and manual override). Top slide, both head and leg side: 30 cm, manually operated. Minimum height: 75 cm. Maximum height: 110 cm. Length of table top: 215 cm. Head and leg section: +90°/-90°, manually operated. Kidney elevator: 15 cm, manually operated. Extra-low height of 25" compared to the standard height of 32". Steep reverse Trendelenburg tilt for a near sit-up position. Specialized attachments for easy patient positioning. Table top slide with remote control. Zero auto-levelling. Dual control console. Non-hydraulic, leak-proof, maintenance-free construction. Affordable Indian prices.
This high quality beach wrap is the perfect accessory for your summer days! Take it to the beach or your favorite swimming hole and use it as a cover-up when you get out of the water, or even before! This darling wrap features the Saint Louis Cardinals emblem embroidered on the bottom left corner of the wrap!
Para-cyclists light up Bo Peep ​Eighteen of the world's best Para-cycling athletes were crowned national champions on Friday as the 2018 FedUni Road National Championships continued in Ballarat. On a warm day with little wind, Paralympic and world champions raced around the rural town of Cardigan, with Rio gold medallists Carol Cooke AM (VIC) and David Nicholas (QLD) doing the time trial and road race double. "I'm happy with it, it was a pretty good course," said Cooke. "I went off with the guys and the first half of the lap they were just going so fast and I just thought 'this is crazy' so I backed it off a bit. After that I just time trialled it into the finish. I'm happy it was a good ride." Cooke, who said the championships have been fabulous, took out the WT2 24 kilometre race in 48 minutes, a little over 9 minutes ahead of silver medallist Gabrielle Vassallo (NSW). Paralympic gold medallist Nicholas showed his class in the MC3 event, breaking away on lap two of five to win the 60km road race nearly three minutes ahead of Justin Godfrey (VIC). Mitchell Bails (SA) claimed the bronze medal. "It feels awesome. It's always good to have it in the bag," he said. Nicholas was also complimentary of the program change which has seen the Para-cycling nationals raced in Ballarat with the Road National Championships. "It's very good, I think it's very good for Para-cycling so that other people can see us racing. I've enjoyed it." In the men's cycle (MC2), dual world championship silver medallist Darren Hicks (SA) took out his maiden road race national title. "I'm over the moon. It's something I've been wanting for a few years. It's been on the top of my list," said Hicks, who took out the road and time trial silver medals at the world championships in Pietermaritzburg last year. "After joining the national team last year, I felt like I had done a lot of the ground work but I didn't really utilise it, so I'm really happy to come out with a win this year." 
Hicks finished almost five minutes ahead of second-placed Gordon Allen and local Ballarat rider Ryan Spiteri. Meg Lemon (SA) made amends for a missed opportunity in the time trial to take out the WC4 event, edging out Hannah MacDougall in a sprint to the line. "It feels pretty good after yesterday, I was a bit disappointed losing [the time trial] by 1 second after a couple of mechanicals [but] coming back today feels pretty good," said the 2017 world championship dual bronze medallist. In the men's H5 event, Paralympic time trial silver medallist Stuart Tripp (VIC) added the road race green and gold jersey to his time trial national title. In other events, Simone Kennedy (NSW) sprinted to gold in the WC3 event, just pipping Page Greco (VIC) to the line. Tasmanian Patrick Best out-sprinted World Championship medallist Kyle Bridgwood (QLD) for the MC4 national title, while Alistair Donohoe claimed the green and gold jersey in the MC5 race.
National champions:
Women Handcycle WH1: Emilie MILLER (Bathurst)
Men Handcycle MH3: Alexander WELSH (Leongatha)
Men Handcycle MH4: Grant ALLEN (Port Adelaide)
Men Handcycle MH5: Stuart TRIPP (St Kilda)
Men Tricycle MT1: Garry ROBINSON (Camden)
Women Tricycle WT2: Carol COOKE (St Kilda/VIS)
Men Tricycle MT2: Stuart JONES (Newcastle)
Women Cycle WC1: Kaitlyn Dawn SCHURMANN (Geelong)
Men Cycle MC1: Darcy THOMPSON (Port Adelaide)
Men Cycle MC2: Darren HICKS (Kilkenny)
Women Cycle WC3: Simone KENNEDY (Parramatta)
Men Cycle MC3: David NICHOLAS (Mackay)
Women Cycle WC4: Meg LEMON (Port Adelaide)
Men Cycle MC4: Patrick BEST (Mersey Valley Devonport)
Women Cycle WC5: Fatema TAJBHAI (St Kilda)
Men Cycle MC5: Alistair DONOHOE (Blackburn)
Women WB: Lindy HOU (Vikings ACT)
Men MB: Kieran MURPHY (Norwood)
(Ogden, Utah) – The Defense Department's Enterprise Computing Center (DECC) is a facility with complex environmental control system requirements. When it came time to upgrade and integrate old systems including chillers, boilers, AHUs, CRACs, generators, UPS, and power distribution monitors, building management chose Systems Performance Engineering Corporation (SPEC), a KMC authorized installing contractor. DECC provides data availability 24 hours a day, 365 days a year. Therefore, in addition to planning and installing new systems, SPEC had to ensure a smooth, incident-free transition during the entire construction and renovation process. Using KMC controllers along with the KMC OPC Server provided the Department of Defense with a single-source, cost-effective control solution. These integrated systems could now be operated and maintained under one common front end. The Department of Defense must ensure that things are running smoothly, and KMC has been able to meet this need. According to consultants with North Glacier HVAC, an outside mechanical contractor hired to run the system, the best features of KMC's solution are the alarms and reports. When things begin to go wrong, an alarm sounds at the central computer, a report is generated, and the problem is fixed immediately. KMC systems continue monitoring the building's systems. DECC continues to expand its control system to the administrative side of the building. There are already over 2800 points, and the facility is still expanding. Systems Performance Engineering Corporation of Connecticut continues to work with DECC on this expansion, as well as providing top-notch service for the KMC products.
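The alarm-and-report pattern described above is simple to sketch in code. The following is a minimal illustration only, not KMC's actual software: `read_point` is a hypothetical stand-in for whatever client call supplies live point values over the control network, and the point names and limit values are invented.

```python
# Minimal sketch of threshold-based alarming over monitored points.
# `read_point` is a hypothetical stand-in for a real control-network call;
# point names and limits below are invented for illustration.

LIMITS = {
    "chiller1.supply_temp_c": (4.0, 9.0),    # (low, high) alarm limits
    "crac3.return_temp_c":    (18.0, 27.0),
    "ups1.load_pct":          (0.0, 80.0),
}

def read_point(name):
    # Stub: in a real system this would query the building's controllers.
    fake_values = {
        "chiller1.supply_temp_c": 6.2,
        "crac3.return_temp_c": 29.5,   # out of range -> should raise an alarm
        "ups1.load_pct": 41.0,
    }
    return fake_values[name]

def scan(limits, reader):
    """Check every point against its limits; return alarm report lines."""
    alarms = []
    for name, (lo, hi) in limits.items():
        value = reader(name)
        if not (lo <= value <= hi):
            alarms.append(f"ALARM {name}: {value} outside [{lo}, {hi}]")
    return alarms

for line in scan(LIMITS, read_point):
    print(line)
```

A production system would run this scan continuously, log the report, and notify an operator, which is essentially the alarm-then-report workflow the consultants describe.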
At the camping restaurant in a friendly atmosphere you can enjoy the best of Austrian and Carinthian cuisine. We recommend the Wiener Schnitzel (breaded cutlets ‐ pork, turkey or chicken) or the traditional Hirschbraten (sliced roasted venison in sauce). Local beer on tap is excellent, as the city of Villach prides itself on its very refreshing and very reasonably priced brews. Of course no meal is ever complete without a taste of the local desserts ‐ Strudel on top of all! ‐ and a shot (or more) of fine fruit spirits. At the restaurant you can also find our small market, where you can stock up your cupboard with all the basics. Fresh bread and croissants are available every day from 7am.
He and his group of buddies are really cool in this game. They helped me out a lot and now I can play Gears pretty well. They also go out of their way to make sure the teams are fair and even, matching skills that is. I had more fun chilling with them on Gears than any other time. It would be cool if we could all go join up with them. Yeah, Hero, I have a blast just watching him play. Next time you play, chainsaw triple as much as you can, he really hates it lol. I can't believe I'm finally having fun on here.
Man charged with slashing throat of 10-year-old playing video game in Lewis County LEWIS COUNTY, Tenn. (WKRN) — A 29-year-old man accused of slashing the throat of a ten-year-old boy over the weekend in Lewis County has been charged with attempted criminal homicide. The Lewis County Sheriff's Office said Sammy Sadler, Jr. used a knife to cut the child's throat while the boy was playing a video game Sunday evening at a residence on Howard Switch Road. The child was transported to Vanderbilt University Medical Center and is expected to survive his injuries, deputies explained. Court records show Sadler was booked into the Lewis County jail Monday afternoon on charges including attempted criminal homicide, aggravated assault, aggravated child abuse and resisting arrest. No additional information was immediately released and the incident remains under investigation.
Delegation of Command and Staff College Quetta briefed about water, power sectors Lahore – April 3, 2017: A delegation of the Command and Staff College, Quetta visited WAPDA House today and attended a briefing on the water and power sectors during their study tour. The delegation was led by senior instructor Col Hussain. Member Power Badr-ul-Munir Murtiza and senior officers of WAPDA and PEPCO were also present on the occasion. Additional Chief Engineer (Dams), WAPDA Shahid Hamid gave a comprehensive briefing regarding the water and hydropower sectors. He stressed that the present situation of water and power in the country calls for the development of new water and hydropower projects to cope with the ever-increasing demand for water and energy. He further stated that per capita water availability in the country has decreased due to rapid growth in population and the depleting water storage capacity of the reservoirs because of the natural phenomenon of sedimentation. He said that Pakistan can store only 10 percent of its annual river flows against the world average of 40 percent. He said that WAPDA is committed to the optimum development of water and hydropower resources in the country and is working on a comprehensive strategy. He emphasized that the Indus Cascade is Pakistan's most precious asset, which will help add a large quantum of cheap hydel generation and water storage. He also added that an additional area of 20 MAF land can be brought under irrigation owing to the construction of new reservoirs. He briefed the delegation that 2485 MW of hydel electricity will be added to the National Grid with the completion of three under-construction hydropower projects from end 2017 to mid 2018, to help eliminate electricity loadshedding from the country as per the Federal Government's resolve. These projects include the 969 MW Neelum Jhelum Hydropower Project, the 1410 MW Tarbela 4th Extension Project and the 106 MW Golen Gol Hydropower Project. 
The delegation was also briefed about the impact of climate change on water resources, water management challenges, water conservation and the hydropower potential of the country. In another briefing, PEPCO General Manager (Revenue and Commercial Operation) Muhammad Saleem informed the delegation about power sector reforms, electricity loadshedding, the energy mix, distribution and transmission losses, billing and the percentage of recovery and receivables. The delegation was also apprised of the Government's efforts to overcome the electricity shortages in the country. They were also informed about new initiatives regarding the generation expansion plan, system improvement, meter reading through mobile phones, automatic metering infrastructure and net metering. The briefing was followed by a question-and-answer session. Later, Member (Power) WAPDA and the delegation head exchanged souvenirs.
Works filtered by creator: Davidson, Ronald
Book review: City of Rhetoric. A review of City of Rhetoric: Revitalizing the Public Sphere in Metropolitan America by David Fleming.
Before "Surfurbia": the development of the South Bay beach cities through the 1930s. Few landscapes have been more trivialized for global consumption than the southern California beach. "Baywatch," "Beach Blanket Bingo," and Reyner Banham's coinage of the term "Surfurbia" are among the myriad examples of culture products that depict t . . .
Book review: The Lost City of Z. A review of The Lost City of Z: A Tale of Deadly Obsession in the Amazon by David Grann.
The beach versus "Blade Runner": recasting Los Angeles' relationship to modernity. This paper seeks to interpret and elaborate on the mural, as follows: In the first section I expand on the mural's narrative about Los Angeles. According to this (familiar) narrative, the city exemplifies modernist patterns of development, often to th . . .
Parks, malls, and the art of war. In the post-war years, Americans migrated en masse into suburbs punctuated by shopping centers that served as social and recreational hubs. Concerned about the civic wellbeing of shopping-centered suburbanites, a group called the Agora Coalition forme . . .
The Los Angeles coast as a public place. In the public-space discourse Los Angeles is usually portrayed as more "anti-city" than city. Its landscape is overrun by houses, "private-public" squares and plazas, theme parks, shopping malls, and so on and lacks inclusive public places. Yet this d . . .
The search for authentic practice across the disciplinary divide (Swain, John D.). This article describes the self-reflexive investigation of four first-year faculty at a comprehensive state university in Southern California. In professional development efforts to identify best practice, each of us explored and evaluated our teachin . . .
Where is the space for education? Many competing factors are now affecting how students think about higher education. One primary factor is the use of a business model for education – highlighting profit, patents, commercial investments, and the use of market competition, for example . . .
Book review: What Americans Build and Why. A review of What Americans Build and Why: Psychological Perspectives by Ann Sloan Devlin.
Do Deer Move In The Rain? The Facts You Need To Check Out For A Great Hunt! Sure, a lot of you have already hunted under the blazing heat, or even targeted your game from tree stands and transported guns on your boat, but have you ever attempted hunting during inclement weather? It seems difficult, especially when it comes to predicting the way your game moves during the rainy season. It has people wondering: do deer move in the rain? Instead of just guessing and missing out on a good hunt because of a bit of wet weather, read on to learn more about how deer move during the rain and how you can successfully have a good hunt despite these "limitations!" So, have you ever heard of deer moving in other types of weather besides the usual sunny day? Whitetail deer (or any other deer) do not have the luxury of sleeping under the roof of a cozy home with warmth and blankets. They live outdoors and are better adapted to outdoor life than us humans. They have a metabolism that can withstand cold temperatures that would usually freeze us, as well as fur coats that keep them warm. In windy weather, deer do not mind a light breeze. But once the wind becomes stronger, that is when they start to hide or avoid moving. This is because strong wind makes predators hard to detect: it scatters scent and makes it difficult to hear a predator and tell which direction it's coming from. The same goes for rainy weather. A light drizzle is fine, or even a bit stronger than that. But in a torrential downpour, none of those senses work properly. Just like us humans, deer would have difficulty smelling, seeing, or even hearing predators approach. Because of this, they avoid moving during extremely rainy weather until it begins to subside. 
Deer have the ability to sense approaching storms, feeding lightly before and after minor storms and feeding heavily before and after major storms. So, is it recommended to hunt deer during this type of weather? If it's slightly windy or simply drizzling, then that's totally fine. You'll still be able to find a few bucks or does roaming around the fields. But don't bother hoping for a sight of your game during truly foul weather, as deer are seeking shelter or not moving at these times. Before you begin the hunt, make sure that you check the weather first. Knowing when a storm is approaching will let you determine whether deer will be out or hiding during the day. To avoid getting sick or being uncomfortable during a rainy hunt, I recommend that you wear quality clothing, such as a hat, jacket, boots, and gloves, on top of your usual hunting gear, of course. You won't need to change your tactics during light rain or drizzle. But once moderate rain begins, try still-hunting or working the pastures, as spot-and-stalk is effective. Use your eyes more than your legs! Still-hunting is a great way to hunt during moderate rain. When doing so, always move very slowly, taking one step and stopping each time you do so. Spend time scanning your surroundings with hunting binoculars. If there is nothing, take another step and begin glassing again. Keep your eye on the back trail as well, as some bucks will come in behind you. Stop against a tree or any type of cover (like a hunting blind). This makes it easier to blend in and gives you something to lean on. When it begins to rain heavily, I suggest that you wait it out rather than risk your safety for the hunt. The drying-out period is great, as this is when you find animals resuming their normal activities. Are there some of you who wish to achieve the feat of hunting in just about any weather? 
Hunting during the rainy season may seem daunting at first, but once you realize that a bit of rain won't keep deer out of the fields, you'll probably enjoy it more (and even like the breeze that goes with it!). I hope that this article answers your question: "Do deer move in the rain?" Now that you know how your game moves around the fields as you hunt, begin planning your next expedition today! If you have any questions or would like to share your tips and experiences with hunting in the rain, then comment down below. I would love to hear what you think.
I always recommend allowing at least an hour for transportation from anywhere in Walt Disney World. I always prefer to be a few minutes early for my reservations, and allowing an hour typically gives us enough time to get to our destination. Leaving Disney's Hollywood Studios theme park around 4:00pm should give you enough time to make it to your reservation at 'Ohana at 5:05. I would make sure you're at the bus area at 4:00, just in case you need to wait for a few minutes for your bus to arrive. Your dinner at 'Ohana should last around 60-90 minutes. For the week of September 24th, DHS closes at 8pm. You may have time to return to DHS for an hour or so, depending on your meal length as well as the bus timing back to the park. Please come back and see us again real soon, Pam.
"There is no complaint department, Robert." "Find out which is cheaper, Hoyt - improving our products or expanding our complaint department." "You think that I'm just here to keep the chair warm? Sir, nothing could be further from the truth!" "Who do I complain to about your complaint department?" "...and another thing, I want to complain that you're a mannequin!" "No, madam. You're right. This is the complaints desk."
In a previous report here on OscarMini we told you that Xiaomi was warming up to launch its Xiaomi Redmi 3; little did we know that it was coming even sooner than we expected. Well, the latest is that Xiaomi has officially launched its Redmi 3 smartphone at a price of CNY 699 (approximately N21,000). The major highlight of the new Xiaomi Redmi 3 is its 4100mAh non-removable battery. With a battery density of 685Wh/L, which reportedly extends the talk time of its predecessor by 80 percent, the Redmi 3 comes with support for 5V/2A fast charging as well. In terms of design, the new Xiaomi Redmi 3 boasts a metal body along with a textured rear panel. It will be available in gold, grey and silver colour variants. The Xiaomi Redmi 3 packs plenty of power and speed to meet your needs. The smartphone features a 5-inch display with a 720 x 1280 pixel resolution. It is powered by a 1.2GHz octa-core Qualcomm Snapdragon 616 processor, paired with 2GB RAM. It offers 16GB of internal storage, which can be further expanded up to 128GB via microSD card. The Xiaomi Redmi 3 runs an unspecified Android version with MIUI 7. In addition to this, it also sports a 13MP rear camera with LED flash along with a 5MP front-facing camera. The selfie camera also supports full-HD recording. The dual-SIM (GSM and GSM) smartphone, which accepts Micro-SIM and Nano-SIM, also includes connectivity features such as Bluetooth, GPS, A-GPS, GLONASS, GPRS/EDGE, Micro-USB, and Wi-Fi 802.11 b/g/n. The Xiaomi Redmi 3 is not yet available in Nigeria. However, the device is available on Mi.com and Tmall and will go on sale in China starting tomorrow. When available in Nigeria it will sell for around N21,000.
This short film recreates the experience of Sylvie, a battered woman who seeks shelter in a Montréal transition house. Faced with the threat of violence, loneliness, the lack of financial resources or information about services, the victim is often understandably reluctant to seek help. Emphasizing the importance for women of speaking out, the film also points out the role of the transition house in putting victims of abuse in touch with appropriate legal and social services. Sylvie's Story is part of The Next Step, a 3-film series about the services needed by and available to battered women.
Roadside safety messages increase crashes by distracting drivers (The Conversation, January 9, 2023). Behavioural interventions involve gently suggesting that people reconsider or change specific undesirable behaviours. They are a low-cost, easy-to-implement and increasingly common tool used by policymakers to encourage socially desirable behaviours. Examples of behavioural interventions include telling people how their electricity usage compares to their neighbours' or sending text messages reminding people to pay fines. Many of these interventions are expressly designed to "seize people's attention" at a time when they can take the desired action. Unfortunately, seizing people's attention can crowd out other, more important considerations, and cause even a simple intervention to backfire with costly individual and social consequences. One such behavioural intervention struck us as odd: several U.S. states display year-to-date fatality statistics (number of deaths) on roadside dynamic message signs (DMSs). The hope is that these sobering messages will reduce traffic crashes, a leading cause of death of five- to 29-year-olds worldwide. Perhaps because of its low cost and ease of implementation, at least 28 U.S. states have displayed fatality statistics at least once since 2012. We estimate that approximately 90 million drivers have been exposed to such messages. Startling results: As academic researchers with backgrounds in information disclosure and transportation policy, we teamed up to investigate and quantify the effects of these messages. What we found startled us. Contrary to policymakers' expectations (and ours), we found that displaying fatality messages increases the number of crashes. We studied the use of these fatality messages in Texas.
The state provides a useful laboratory to study such messages, as it has 880 DMSs, 17 million drivers and, unfortunately, typically over 3,000 road-related fatalities per year. The most advantageous aspect of this sample, however, is that from August 2012 until the end of our sample in 2017, the Texas Department of Transportation only showed these fatality messages for one week each month — the week before the Texas Transportation Commission's monthly meeting. This institutional feature allowed us to compare, for instance, the hourly number of crashes occurring around a DMS during the week when fatality messages are being shown, relative to crashes on the same road segment during the other three weeks of the same month. Also, we were able to control for time of day, day of week, weather conditions and holidays. We found that there were two to three per cent more crashes within one to 10 kilometres downstream of each DMS during the week fatality messages were shown. This suggests that this specific behavioural intervention backfired in Texas. Warning distractions: We conducted two tests to rule out whether this finding was simply because these weeks happen to be inherently more dangerous. First, we looked upstream of each DMS. In doing so, we limited our sample to those DMSs without another DMS within 10 kilometres upstream. We found no increase in accidents upstream of these DMSs, but still found an effect downstream. Second, we investigated whether the weeks before the monthly meetings of the Texas Transportation Commission had more crashes in the months before Texas began showing these fatality messages. Looking at data between January 2010 and July 2012, we found no evidence of a change in crashes during the week prior to the Texas Transportation Commission meeting. Based on our findings, we hypothesized that these fatality messages cause more crashes because they make drivers anxious and distract them.
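The core week-on/week-off comparison described above can be sketched in a few lines. The data, segments, and numbers here are hypothetical, not the study's; the actual analysis also controlled for time of day, day of week, weather and holidays.

```python
# Illustrative sketch of the week-on/week-off crash comparison.
# Data and numbers are hypothetical, not the study's.
from statistics import mean

# (road segment, fatality message shown this week?, crashes observed downstream)
weeks = [
    ("A", True, 11), ("A", False, 10), ("A", False, 9), ("A", False, 10),
    ("B", True, 22), ("B", False, 20), ("B", False, 21), ("B", False, 19),
]

# Mean crashes per segment-week, with vs. without fatality messages shown
with_msg = mean(c for _, shown, c in weeks if shown)
without_msg = mean(c for _, shown, c in weeks if not shown)
pct_change = 100 * (with_msg - without_msg) / without_msg
print(f"{pct_change:.1f}% more crashes during message weeks")
```

The study's two-to-three per cent estimate comes from a much richer regression; this sketch only shows the shape of the comparison.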
Our research found several pieces of evidence that supported this hypothesis. First, we found that the larger the displayed number of fatalities (a plausibly more shocking and distracting message), the greater the increase in crashes. Higher fatality counts are associated with significantly more crashes, whereas lower fatality counts are associated with fewer crashes. Relatedly, fatality messages cause the largest increase in crashes in January, when the display shows the prior year's total in Texas. Conversely, there are marginally fewer crashes in February, when the fatality count resets and is at its lowest. Second, the increase in crashes is concentrated in more complex road segments, where focusing on the road is likely more important and the cost of a distraction more severe. We also found that crashes increased statewide during the weeks when messages were displayed, inconsistent with improved driving farther away from DMSs; that the days after a campaign ends are no safer than other days; and that these messages continue to affect drivers after more than five years of showing fatality statistics. Counterproductive to safety: Our research shows that displaying fatality messages does not result in safer driving and fewer crashes. Besides the more obvious takeaway that displaying fatality messages may be counterproductive, our findings highlight two broader issues. First, while behavioural interventions should grab attention, this can be taken too far and have costly consequences. Second, it is vital to evaluate policies and their outcomes over time, as even good intentions may not necessarily lead to the desired outcomes.
By: Jonathan Hall, Assistant Professor, Economics, University of Toronto. By: Joshua Madsen, Assistant Professor, Accounting, University of Minnesota. Jonathan Hall has received funding from the Social Sciences and Humanities Research Council of Canada, the European Union's Horizon 2020 research and innovation programme, the NET Institute, and via an interagency agreement between the US Department of Transportation and the National Science Foundation. Joshua Madsen does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment. This post was originally published at The Conversation.
The yellow pages of blockchain has arrived: Networks are now visible to the world. In the 1830s the first Yellow Pages emerged as an organized directory of businesses. This had a profound effect on commerce and is an often-overlooked key contribution to the Industrial Revolution. Simply put, the Yellow Pages put businesses on the map and made them known to the world. Without it, a business was hidden and only visible to those who knew about it via invitation. Similarly, in the late 1990s ICANN became the coordinator of several directories, including the Domain Name System (DNS) that is central to launching the internet and the world wide web, by uniquely listing business names and addresses, thereby transforming them into publicly addressable and digitally interactable entities. We think blockchain is the third major wave of network technology (phone, internet, blockchain). But we were missing something, up until now. Today, I am proud and excited to announce that we have joined the HACERA Unbounded Registry as a founding member. As the number of blockchain consortiums, networks and applications continues to grow, we need a means to list them and make them known to the world, in order to unleash the power of blockchain. The Unbounded Registry, built on blockchain technology, provides a decentralized means to register, look up, join and transact across a variety of blockchain solutions, built to interoperate with all of today's popular distributed ledger technologies, including The Linux Foundation's Hyperledger Fabric, Hyperledger Sawtooth, EEA Quorum, Stellar Network and more. What makes a private network private, or a public network public? Well, most private networks are unlisted. Unbounded gives these network solutions a chance to be in the "yellow pages of blockchain," and be publicly visible.
As a founding member of several consortiums and a technology provider for others, we are frequently asked questions such as, "As a supplier, how can I join the TradeLens trade digitization blockchain?" and "What capabilities does LedgerConnect provide?" Also, "Is the True Tickets network running, and how do I list a new event?" — by the way… the answer is yes, it's running. With Unbounded, a solution user is one step closer to answering their questions themselves. Since IBM began our blockchain journey, we have been committed to the development and use of open technologies. We realized from the start that you cannot do blockchain on your own; you need a vibrant community and ecosystem of like-minded innovators who share the vision of helping to transform the way companies conduct business in the global economy. We initially joined Hyperledger, and contributed the initial code for Fabric, to collaborate on blockchain for business, then the Sovrin Foundation for decentralized identity, and we have also participated in the Stellar network for global payments. IBM and HACERA have been working closely on blockchain technology for a while. HACERA was one of the first members of the IBM Blockchain ecosystem — going back to 2016 — and HACERA and IBM jointly release-managed Hyperledger Fabric 1.0 a year ago. We see HACERA's Unbounded Registry providing several key capabilities that address these common problems: reserved naming for networks, applications and consortiums; the discoverability of blockchain networks and applications; a catalogue of domain-specific functions and services; and an independent, open and shared blockchain-backed platform to help us all with bootstrapping, launching and growing our communities.
By working with HACERA and other members of Unbounded Registry, we see a future where consumers and providers of blockchain services will be able to discover each other and begin transacting in a more secure way and where technology providers and consumers can innovate and integrate to create limitless and unbounded possibilities.
This Thursday – Make the Call for Home and Community-Based Long-Term Care! Health reform will begin moving through two key House committees this week. Your Representative serves on one of these committees. We need your help this Thursday the 16th to advocate for including home and community-based long-term care in the bill. Thousands of seniors are impoverished and forced into nursing homes prematurely because they lack coverage for these services. If Congress does not address this problem now, it may be decades before this opportunity arises again. Urge your Representative to include the Community Living Assistance Services and Supports (CLASS) Act, H.R. 1721, in the House health reform bill. I'm a constituent calling to urge Representative X to support including the CLASS Act in the House health reform bill. The CLASS Act would create a voluntary insurance program to help those in need receive home and community-based long-term care. Too many seniors and their families are impoverished and forced into nursing homes prematurely because they lack coverage for these services. The CLASS Act will help address this major unmet care need. The CLASS Act is supported by President Obama and over 100 national groups, and would reduce the deficit by $58 billion, including over $4 billion in state and federal Medicaid savings. Thank you for adding your voice to the health reform debate!
Fairfield University's Master of Science in Software Engineering (MSSE) focuses on the skills in design, analysis, implementation, testing, and validation you need to develop sophisticated and successful software projects. Intended for working professionals, such as web designers and developers, and database administrators, the coursework is directly applicable to workplace tasks. Conversely, the professional experiences and perspectives students bring to the classroom enrich the course content, providing an effective learning environment for all involved. Software engineering is a disciplined approach to developing, operating, and maintaining software systems. Software engineering has been ranked first among the "Top 10 Best Jobs" by Money magazine and tagged as the fastest-growing occupation in the upcoming decades by the U.S. Department of Labor. Fairfield's M.S. in Software Engineering program provides an up-to-date curriculum that matches current job markets and industry needs. As a student enrolled in the program, you'll build expertise in your special interest area as well as gain a general understanding of various software engineering fields. You will develop excellent technical skills, strong analytical skills, and an appreciation of true quality in the work of software engineers. Our dedicated faculty with strong academic and industrial experience will spend extra time with you in and out of the classroom to support you in your academic and personal growth as a professional software engineer. They will guide you to hone technical skills, as well as communication and leadership skills, through hands-on projects and capstone courses. In addition, you'll have ample opportunities for industry experience through internships and industrial projects. Interested?
Please feel free to explore the possibilities of advanced study in software engineering at Fairfield University by e-mailing me (rusu@fairfield.edu) to either set up an appointment or to clarify any questions you may have about the Master of Science in Software Engineering. See the Software Engineering course catalog for more information. A five-year program is offered in Software Engineering at Fairfield's School of Engineering, leading to a Bachelor of Science and Master of Science dual degree. This program embraces the educational objectives of the undergraduate program in Computer Science (accreditation track), as well as those of the graduate program in Software Engineering. It emphasizes experiential learning in terms of industrial internships following the sophomore year, and a final capstone project that guides students through a process of design and innovation at the level of a professional engineer. Graduates of the program master the knowledge and tools they need to create the next generation of software solutions to ever more complex technological and societal problems. Applicants for the M.S. in Software Engineering must hold a bachelor's degree from a regionally accredited college or university (or the international equivalent), demonstrate adequate experience as a professional software developer or programmer, and have an academic and professional record that suggests the likelihood of success in a demanding graduate program. Applicants with an undergraduate degree in an area other than software engineering, computer science, or the equivalent, may need to take bridge courses to develop the required background for the program.
B-Low The Belt Milla Belt in Black and Gold. Style Number: BT1640. [Size charts: women's, men's, and kids' measurement tables omitted.] How to measure: Waist: measure around the narrowest part of your torso. Hips: with your feet together, measure around the fullest part of your hips/rear. Thigh: with legs slightly apart, measure around the fullest part of your thigh. Items returned within 30 days of delivery are eligible for a refund. Returns must be unwashed, unworn, unused with original tags attached. A free prepaid return label will be included in all shipments within the contiguous United States. Holiday Extended Return Policy: Items purchased on or after 11/01/2020 are eligible for return until 1/31/2021. We really appreciate your feedback here at 7 For All Mankind! We value your input. It may take 24 hours for your review to appear.
Hurricane to Hell. Posted by ImNoSaint on August 9, 2014. One of the many great things technology has done for mankind is diminish the frequency of the turn of phrase, there's only one way to find out, the very words uttered (or so we imagine) when Lorrel "Sixty" McInnely put his bulldozer in gear towing a compressor across two felled lodgepole pines that spanned the gap of Box-Death Hollow fifteen hundred feet below. I can also imagine that "Sixty" got his name when it was discovered that one of his testes weighed thirty pounds. But, I'll come back to the rest of the story in a minute. We've gone all Summer without a true breaking out into the backcountry save for a day trip in late July, so on the last weekend of the season we set out to drive and camp along the road that goes to Hell's Backbone in the Grand Staircase-Escalante National Monument. We packed up Friday evening and left Hurricane early Saturday morning, getting into Zion Canyon when the sun was up above the east rim. If you've never made the Utah Highway 9 traverse of the canyon, you should know that it is by no means a shortcut to Highway 89, but it is always well worth the time it takes. One of the slowing influences is the obligatory traffic control turning the two-way flow through the Zion tunnel into one-way whenever an RV over a certain size presents itself, much like the one above. The tunnel is a mile long and was engineered at a time when caravans this size were unimaginable, at least in making their way up to the tunnel to begin with. Highway 9 opens up on the east side and the eyes are always treated to the awe that is the geology of this area. We stopped for breakfast fare at a new attraction along Highway 9 just north of Orderville at Forscher German Bakery. We went in to find a beautiful assortment of authentic pastries, but oddly absent were the odors one would associate with early morning baking. I had a currant strudel while the wiser Mindy had a warm roll.
The strudel was good, but was not fresh as one might expect from a bakery. But, this is a backcountry report, so who cares. We decided to retrace our earlier route along the Skutumpah Road, this time going all the way to where the road intersects the paved road that connects Henrieville and Kodachrome Basin State Park. Just as Skutumpah drops down into the Kodachrome Basin, travelers are treated to this view, one of dozens along the way. Fathers' Day a year ago my kids and I did a camping trip through here, but Mindy had never been so we made the little tour. Chimney Rock is a must-see in Kodachrome Basin, a formation that gets everyone scratching their head. Another attraction in the campground, not pictured here, is a rock formation that gets everyone shaking their head; think Lorrel "Sixty" McInnely. And yet, it appears to be this state park's eight-thousand-ton gorilla with no mention of the formation's name, though the park is known among locals as Phallus Park. From Kodachrome, coined by the National Geographic Society in 1949 after the type of film it used (a name conferred only after permission from Rochester, New York), we joined the All American Highway that is 12 at Henrieville and drove pavement to Escalante where we had calzones at Escalante Outfitters, the best folded pizzas west of the Mississippi. We picked up 300 East on the north side of Escalante (if you've hit the Hole-in-the-Rock trail you've gone too far) and headed north along the paved road. Three point three miles in, the road turns to graded gravel with much washboarding. Airing down enough to smooth things out a bit is recommended. Hell's Backbone Road starts at an elevation of 5,700 feet and summits at 9,200. The road is passable for passenger cars and appears to be well maintained, having passed a grader along the way. There are trails rating in the sixes such as McGath Lake Trail that feed from this road that provide greater challenges.
Given our passengers, Ginger and Maryann, we spared them the jostle and stomach turning, having learned this the hard way on the Barracks Trail. About half way through Hell's Backbone we turned off the trail and headed up to the Blue Spruce Campground, seven sites, six of which are bordered by the clear Pine Creek. It was mid-afternoon, and the mossy green forest floor and the small falls of the creek bade us abandon any further driving in favor of relaxing instead. Not sure if it was karma, the odds or just dumb luck, or all three, but we had the entire campground all to ourselves. We wondered a bit if all the beware-of-bear warnings were the reason for this ghost campground, but that didn't deter us, bear spray handy. If you follow the exploits of Ginger and Maryann, you know they were right at home here. Plus squirrels. Dog heaven. Makes me wonder why we as humans are so hard to please. As the shadows grew longer I attempted a fire to create some coals for our foil-wrapped veggies, but the area fuel was too wet, so we cooked them on the Volcano alongside chicken breasts. Couscous rounded out the fare of a delicious evening meal. By the way, if you're not familiar with the Moroccan pasta, couscous is a great camp meal starch in its ease of preparation and mild taste that blends wonderfully with herbs and fungi. Just don't eat it raw and drink a lot of beer. The Belgians tortured Algerians this way. We were serenaded to sleep by the babble and flow of Pine Creek. Sunday morning we broke camp after instant oatmeal – a great and easy hot meal to prepare, enjoy and clean up – and coffee and grapefruit juice, and joined FR 153 to Hell's Backbone Bridge. About four miles from Blue Spruce the road's view opens up to the Box-Death Hollow. With a little Geology 101 we can guess at the effects of wind and erosion on the Grand Staircase, but whatever forces created this tree's gnarl escape the imagination. Another 1.8 miles and we crossed the great Hell's Backbone Bridge 3.0.
Yes, this is the third bridge. Version 1.0 was the one traversed by Sixty on his bulldozer pulling behind him a trailered compressor in 1933. The Civilian Conservation Corps, the Great Depression's relief program, was responsible for the double-pine-trunk crossing under the tracks of Sixty's dozer. He inched across, having tied a rope around his waist in the event that the logs should fail, trusting in whatever method was used to secure its other end. He didn't need it, though, since he and his heavy gear made it across without incident, and he went on to plow and build the road we traveled. In the photo above in the shade of the bridge are what appear to be version 1.1, a number of long pines that seem to span the gap. A second bridge of more concrete engineering was erected in 1940 and was then replaced by the current version. Box-Death Hollow from Hell's Backbone. Box-Death? What's that about, anyway? It's about cattle plunging to their death trying to cross Hell's Backbone. That's why we kept the dogs in the H3. On the Boulder Mountain side of the backbone the trail descends back to Highway 12 just south of Boulder. Back on pavement we were treated to our favorite part of our favorite roads in Utah, Hogsback Ridge through Calf Creek Canyon and up the slick rock of Escalante Basin. This area is hemmed by the Colorado Plateau of which the basin is a part, along with the Aquarius Plateau to the north, the Circle Cliffs to the east and the Kaiparowits Plateau to the west. We stopped for lunch on the quarter-mile stretch of the Hogsback Ridge. The afternoon began driving back on Highway 12 to Henrieville where we doubled back just short of Kodachrome State Park and mounted the Cottonwood Canyon Road, a little over 37 miles long, that terminates south at Highway 89 between Page and Kanab. The shot above is from a vista reached shortly after the Butler Valley Draw, looking north. Below is panned east and looks back upon Powell Point and Kodachrome Basin.
Makes you think all the world's a sunny day, whoa yeah. Looking south. Eleven miles into the Cottonwood Canyon Road is the geological anomaly (what isn't) of Grosvenor Arch. This is a rare double arch named after Gilbert Hovey Grosvenor, a one-time president of the National Geographic Society. The Cottonwood Canyon Road continues along the Cottonwood Wash filled with, you guessed it, cottonwoods, making several shady and tucked-away primitive camp areas. While this is a graded road, it is not maintained as well as the Hell's Backbone Trail. We've had recent monsoons that have wreaked a bit of havoc in the washes, but otherwise the road is pretty passable. Every vehicle we passed was a rental car of crossover DNA. Further south is the amazing Cockscomb, a long geological feature of a serrated ridge that with enough imagination might have one think a dragon is buried alongside the Paria river. It's a boundary of two distinct geological worlds smashed up against each other and receded. The road crosses the Cockscomb twice. Once through the Cockscomb we entered the Rock House Cove where the H3 felt as if it turned into a Mars Rover. And then, into our parting shot of the badlands. What a great trip, and all within a couple of hours' drive from our home. Sixty is my new hero, and even if he may have never uttered the words, there's only one way to find out, this has always been the impetus behind putting it into gear, opening the throttle and going. Previous Post Skutumpah Next Post October RiSE
Vol. 10, Issue 278 - Wednesday, Oct. 5, 2005. UH FOOTBALL: Hawaii's defense grows a nose. It took a few weeks, but UH has found a couple of big guys it can count on. By Dave Reardon, dreardon@starbulletin.com. Ask Jerry Glanville about the importance of the nose tackle position and he gives you the name of a cornerback. As it often does in his case, what at first doesn't seem to make sense does with just a little more information. "Cris Dishman," the Hawaii defensive coordinator said, naming one of his old Houston Oilers. Then, after a pause: "A great cornerback. He always wanted to know who the nose was. 'You got a nose?' " The Warriors have about five or six. Four games into the season, it looks like they've narrowed it down to two with a clear starter in sophomore Mike Lafaele and a solid No. 2 in freshman Keala Watson after the defense played well again in Saturday's loss to Boise State. Back to Glanville's point. Dishman wanted to know who the nose tackle was because if he was a good one, he could help put pressure on the quarterback and Dishman wouldn't have to try to cover his man for too long. Every player depends on every other player in a defense, but in the 3-4, none is more important than the nose tackle, and it's a chain reaction all the way to the cornerbacks. "It's always been important in this scheme. You gotta have a nose that is a team player and powerful and tough," UH coach June Jones said. "You just gotta have that guy in there and we've got two pretty good ones now." Lafaele, the converted offensive lineman from Farrington, has likely earned a second consecutive start. Tony Akpan, Reagan Mauia and Watson started the first three games, and the preseason favorite to do so, Renolds Fruean, is injured. The auditions are over for now as the Warriors (1-3, 1-1 WAC) get ready for a game at Louisiana Tech (1-2, 1-0) on Saturday. "I would say he's playing nose very well," Jones said of Lafaele.
"He'll start probably again this week." If the nose tackle and the other defensive linemen to either side of him (Melila Purcell and Ikaika Alama-Francis) can occupy a large number of offensive linemen, the other defenders have a better chance of outnumbering the offense at the point of attack. "They free up the linebackers," Alama-Francis said. "The center is the key. He has to play the center really hard, really good, and that opens it up for the middle linebackers to make plays inside. We did a really good job against Boise State. Michael did an awesome job, Keala did an awesome job. I'm excited about playing next to them." Lafaele said he was triple teamed often Saturday. And he loved it, because it meant he was doing his job. Here's how he described his tasks: "Hold the line of scrimmage, play that center, don't let him get out to the linebackers, blow up the line, cancel out gaps. Try to make plays," Lafaele said. The 6-foot-1, 297-pound Lafaele said his year with the UH offensive linemen serves him well now, because he can anticipate their moves based on what he used to do. "I know for O-linemen, they pick up blitz first. There's a lot of stuff that carries over to the other side. A lot of technique and alignment and stuff like that. That really helps me a lot, knowing the different kind of blocking they like to run." Lafaele has been credited with just two tackles, both assists. But everyone associated with the Warriors defense knows that stat means next to nothing for the nose tackle. "Everybody has a job. It just makes everybody's job much easier with a nose who can do the right things in this scheme," Alama-Francis said. "They play hard. And that's what Coach Glanville is looking for. On his defense you need a nose. Mike and Keala are outstanding there." Keomaka might play: Reserve cornerback Ryan Keomaka, listed as questionable for Saturday's game, might play because his shoulder injury suffered against Boise State isn't as bad as it might have been.
"It's subluxed and he'll have to play with pain and get it fixed at the end of the year," Jones said. Kenny Patton and Keao Monteilh are the starting corners, with Turmarian Moreland and Keomaka listed as the backups. By way of Texas: The team flies to Houston tonight and practices there tomorrow and Friday before heading to Monroe, La., Friday afternoon.
MoviePass, the almost all-you-can-watch buffet of movies on the big screen, is a pretty sweet deal at $9.95 a month. But what if I told you that deal could be even sweeter? We're talking scoring free popcorn, guaranteeing tickets to brand new releases, and easily avoiding the most common problems. All you need is the right tricks. For most showings, MoviePass requires you to physically be at the theater in order for you to purchase a ticket (it checks your location via GPS through the app). Showing up to a primetime showing of that super popular movie in hopes of buying a ticket right then and there is not a good idea. If you want to use MoviePass for stuff like that, go to the theater the morning of the showing and buy your ticket then. If you can't get to the theater earlier in the day to lock down your ticket, Peter Allen Clark at Mashable suggests you give the "Fandango swap" a try (my silly name, not his). For movies with reserved seating, buy a ticket in advance with the Fandango app. Once you get to the theater, cancel your ticket in the app (this needs to be done before the movie starts), then quickly try to buy that now-open seat with MoviePass. Clark admits this is a huge risk since so many things could go wrong, but if you're crazy busy and desperate to see that super hero thing opening night, it might be your only option. Just tell them what's going on and they'll quickly check you in and let you buy your ticket. Since we're talking about data feeds and how they can go wrong, it's a good idea to always check showtimes outside of the MoviePass app. What you see in the MoviePass app may not always be accurate, so see if the listings are the same on the theater's website before you head out the door. You get 100 points for every dollar you spend, so I get roughly 1500 points (tickets are about $15 where I live) every time I see a movie, but you also get bonus points based on how many movies you see in a year. 
If you see 6 movies in a year (which I've done in less than a month w/ MP), you get an extra 250 points every visit. 10 movies gets you "Ruby" status, and 500 bonus points. 20 visits gets you 1000 bonus points. At that point I'm getting 2500 points every visit, which means I can get free popcorn/soda every third visit, or a free ticket (for your MoviePass-less companion) every 6 visits. That's a huge bonus to an already great moviegoing deal! If you've forgotten your MoviePass card, see if you can buy an e-ticket through the MoviePass app. Not all theaters support e-ticket purchases—MJR, Studio Movie Grill, B&B and Goodrich Quality Theaters do for sure—but it's worth checking. If that's not going to work, reach out to customer service through the app (same instructions as above in the "manual check in" section). They might tell you to purchase your ticket out of pocket and reimburse you later. Just be sure to take a photo of your movie stub and receipt to send to them later. Important: do not expect reimbursement unless you were instructed to purchase a ticket out of pocket by a MoviePass associate. Always reach out to them first. In the event your MoviePass card is denied by a theater employee, declined at a kiosk, or there's some other error processing the payment, reach out to customer service through the app. They'll find a way to get you in—assuming you're at a supporting theater—even if it means reimbursing you after you pay for the ticket out of pocket. Previously, MoviePass didn't allow some users to see the same movie over and over again. Its terms of service have since changed to allow multiple viewings of the same movie, but just in case you ever encounter a road block, there's a way around it worth knowing. YouTube channel The Reel Talk suggests you simply check in for a different movie with a similar showtime and ticket price, then go ahead and buy a ticket for the movie you're trying to see again.
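The rewards math above is easy to sanity-check in a few lines. Here's a minimal sketch, assuming only the per-dollar rate and tier thresholds quoted in this section (the function name is my own illustration, not anything from the actual MoviePass app):

```python
# Sketch of the rewards arithmetic described above: 100 points per
# dollar of ticket price, plus a flat per-visit bonus once you cross
# 6, 10 ("Ruby"), or 20 visits in a year. Hypothetical helper, not a
# real MoviePass API.
def points_per_visit(ticket_price, visits_this_year):
    base = int(ticket_price * 100)  # 100 points per dollar
    if visits_this_year >= 20:
        bonus = 1000
    elif visits_this_year >= 10:    # "Ruby" status
        bonus = 500
    elif visits_this_year >= 6:
        bonus = 250
    else:
        bonus = 0
    return base + bonus

# A $15 ticket at 20+ visits: 1500 base + 1000 bonus = 2500 points,
# matching the per-visit figure quoted above.
print(points_per_visit(15, 20))  # 2500
```

At 2500 points per visit, a free popcorn/soda "every third visit" would suggest that reward costs somewhere around 7500 points, though the actual redemption thresholds aren't stated here.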
I recently tested this myself and was able to see Coco just fine. Again, it was MoviePass that made it so their service does not work at these locations, not AMC. This situation does not mean that AMC is going to block MoviePass completely, nor can they without breaking terms with MasterCard. That said, some theaters are going to frown upon you using MoviePass to visit their theaters more frequently and have more money for snacks for some reason. When you can, use an automated kiosk instead of an in-person box office to sidestep any issues. Buying multiple budget/matinee movie tickets on the same day: The theater might only charge $6 a ticket on Tuesday afternoons (which means you might be able to buy two tickets at once), but it's against the rules even if it is a steal. Buying food or snacks with your MoviePass: Some people have had success checking in to a movie theater and purchasing $15 worth of snacks then leaving (so a medium popcorn), but you'll get charged a fee if they catch you doing this! Just enjoy the fact you can see a movie for cheap and buy some snacks from the concession stand. Those sales are what keeps theaters in business. Scalping movie tickets you bought with MoviePass: You shouldn't scalp tickets, whether you use MoviePass to buy the tickets or not, so just... just don't. Come on. If you're caught doing any of these things, you might be charged a fee, but MoviePass might straight up cancel your account and ban you from signing up again. MoviePass is an awesome power—don't abuse it.
Patrick Nixdorf is a System Engineer for Garland Technology in the Frankfurt, Germany office, providing pre and post-sales support, as well as solution design for EMEA and APAC. He has over 10 years of experience in the networking industry, primarily in the TAP business. For all the concerns over Industrial Ethernet and IIoT security in recent years, there aren't many identified malware families poised to take down critical infrastructure. In fact, since the 2010 Stuxnet attack in Iran, there have only been a handful of unique malware kits in the industrial sector. However, security researchers recently uncovered the latest malware targeting critical infrastructure—TRITON. In our recent webinar with Palo Alto's Fuel User Group on network connectivity, we went over two main topics, one on Cabling and Connectors, and the other on 100M Copper networks in Industrial Ethernet environments. Businesses are in a constant battle to balance the benefits of new technology with the risks of increasingly sophisticated data breaches. But things are a bit different for those responsible for critical infrastructure. The pressure to make everything "smart" keeps heating up. However, adopting the Industrial Internet of Things (IIoT) won't do you much good if "smarter" control machinery doesn't prove reliable.
L.A. begins crackdown on Hollywood Hills 'party houses' The L.A. city attorney's office is going after two homes reportedly known for raucous parties. (Sept. 21, 2017) By Joseph Serna, Staff Writer It's one of the worst-kept secrets of living in the Hollywood Hills. Many of those raucous parties hosted above the glittering lights in those midcentury boxes of glass and beams are not thrown by people who actually live in the upscale canyons and hillsides. Rather, homeowners and property managers rent out their homes to short-term guests who throw events flowing with booze, loud music and hundreds of attendees. Residents have been complaining about the problem for years. But critics say the parties have gotten worse in recent years — especially with the rise of short-term rentals. There have been angry public hearings, neighbors collecting evidence against out-of-control partiers and calls for tougher laws. Now City Hall is getting involved. The Los Angeles city attorney's office is now looking to turn down the volume on two Hollywood Hills homes reportedly known for loud, late-night parties. A motorist drives along La Cuesta Drive in the Hollywood Hills, past a home that was cited by the L.A. City Attorney for holding loud, raucous parties. (Mel Melcon / Los Angeles Times) In a pair of criminal complaints filed Tuesday, City Atty. Mike Feuer has charged the owner of a single-family home in the 7800 block of Electra Drive and the property manager of a home in the 2600 block of La Cuesta Drive with multiple misdemeanors related to maintaining a public nuisance. "It's completely unacceptable for residential homes to be rented out every few weeks for massive parties that attract hundreds of guests, blast music throughout the night and block streets, disrupting peace and quiet in our neighborhoods," Feuer said in a prepared statement.
Perhaps the most high-profile party-palace landlord was Danny Fitzgerald, the owner of four properties on Weidlake Drive in the Dell neighborhood near the Hollywood Reservoir. The address hosted numerous bacchanals that were publicized on social media from 2012 to 2015. Neighbors reported seeing a caged lion at one party and an elephant at another. The biggest of the homes was used to film a Playboy TV show about swingers. The partying at those properties eventually quieted down after both the city attorney and building and safety division sent Fitzgerald notices in the summer of 2015, officials said. The Los Angeles city attorney has filed charges against the owner and manager of two Hollywood Hills properties known for being locales for partying. Pictured above is a home on Weidlake Drive that drew similar scrutiny in 2015. The latest offenders, however, have ignored similar notices, officials say. In May 2016, Feuer's office sent the owner of the home on Electra Drive, Kamran Younai, 46, a cease-and-desist letter requesting that he stop allowing short-term rentals that hosted large parties and music. Officials even met with Younai that September, Feuer's office said. Despite the meeting, at least a dozen other large parties have occurred there, officials said. "All we want to do is have a peaceful existence," said Felicia Present, who lives across the street from Younai's property. Neighbors there say they've worked to gather evidence that the home remains a nuisance. They've photographed hundreds of guests flowing into the home and the many cars that clog the neighborhood's narrow streets. They've also recorded the thundering bass that causes neighboring homes to vibrate. Younai did not immediately return a request for comment Wednesday, but the property manager of the La Cuesta Drive estate, Rose Garcia, 43, said neighbors were too sensitive. "It's like if anyone talks in the pool area or sneezes, complaints are filed that have no validity," Garcia said. 
The property had a history of complaints before it was sold in November (the city attorney's office said the current owner is a "Belgian princess"). Garcia said she could recall only one visit from police since the property changed ownership and said there was no party going on at the time. "It's sort of an inherited issue," she said. "It's really disheartening and strange." Both Younai and Garcia are scheduled to be arraigned on Oct. 6. Younai faces 16 misdemeanors, including 10 counts of maintaining a public nuisance after receiving a written notice, and six counts of engaging in illegal short-term rentals. If convicted, he faces up to eight years in jail and $16,000 in fines. Garcia is charged with 10 misdemeanors including seven counts of maintaining a public nuisance after receiving a written notice, two counts of permitting the emission of loud and raucous noise through an amplifying device and one count of excessive noise. If convicted, she faces up to three years in jail and $7,000 in fines. Of course, complaints about loud parties in the Hollywood Hills are not new. In 1924, The Times published a front page story about a revolt by Laurel Canyon residents after a wild birthday party got out of hand. Members of the Laurel Canyon Improvement Association, the paper reported, demanded the city prosecutor file charges against the homeowner and party guests. The prosecutor vowed to take action to "discourage such affairs in the future." 5 p.m.: This article was updated with additional reporting on a 1924 incident. Joseph Serna is a Metro reporter who has been with the Los Angeles Times since 2012.
Here's a selection of classic time related issues that teachers face. If you're interested, scroll down for my response. 1. The biggest challenges to teachers in terms of time management are grading/evaluations, academic commitments beyond the classroom including serving on committees, planning bodies etc., and planning for courses. 2. Common causes of distraction for me include conversations with colleagues that don't reduce problems or workload but do provide temporary relief or stress reduction. 3. I know teachers can do their jobs without experiencing excessive stress but I am still waiting to find out how! I've spoken and worked with others who are not completely stressed out. Generally their time management skills are superb and they have been teaching enough years to have a lot of time savers in the classroom. They also ask for help when needed and don't reinvent the wheel. 5. I have two planning periods in the day so in the first period I generally have meetings, make phone calls/check emails, make copies or visit the office for administrative issues. During the second planning time I map out lessons, do internet research and complete grading. These are typical challenges that teachers have to face - how can you get everything done? You can't. Or, more accurately, you shouldn't. Most of the time, it is counterproductive to strive for perfection in your work. That mindset, whilst sounding laudable, is a recipe for stress and burnout. Obviously there are times when you have to achieve 100%. But the trick is to know when those times are - and they are far less frequent than we think. Instead, decide how well you want to get something done. For example, how much research will you do online? How thoroughly do you need to grade each paper? Which emails deserve your attention and which can be deleted? How long will you allow that conversation to last?
The principle to live and work by is to aim for what Swedish people refer to as "lagom" meaning "just the right amount". Sure, it takes practice. But it begins with awareness. Don't feel you have to be 100% perfect. Apply the 80-20 rule and do the things that matter most to the degree that they need to be done. This principle can be applied to anything you do.
the unilateral and unconditional nature of the new measures. freedoms and the establishment of the Rule of Law in Cuba. objectivity and endangers any political dialogue. measurable changes that will lead to the formation of a true democracy. inherent to human beings, and they have paid a high price for these demands. how to work on the legal system of labor issues and trade union freedoms. respect for human rights and the promotion of democracy. roadmap mentioned above, based on the Universal Declaration of Human Rights. the realization of a constituent assembly, define the destiny of our nation. active part of any process that seeks a solution to the Cuban conflict. unionists and opponents of various groups within the island and in exile.
We are looking for Sales Professionals to sell business to business. We sell ads on our screens to local and national businesses. Businesses choose the most effective spots for their ad based on demographics, geographic location, and business type. Our advertising prices are the lowest in the industry, and the rate of recall for brand name and logo recognition is the highest. This is why our customers love our product. We currently have many screens throughout Rochester, Canandaigua, Palmyra, Newark, Penn Yan, Geneva, and Waterloo. There will be many more going up soon from Buffalo to Rochester, Ithaca to Syracuse, Elmira to Binghamton, and the towns in between. We want self-motivated Sales Professionals who can work well as commission-only 1099 contractors for bizXposure. There are no set hours, except for initial training. You can do this while working another job for additional income. Our website is used to enter all customer information, and our Graphic Design team can create ads based on customer needs. Customers can pay monthly through the auto-pay on our website or up front, and all of the ads are started, stopped, or changed through our centrally-managed system. All you need to do is sell ads and input the information. Commission is paid when the customer pays and there is a healthy bonus program for the best Sales Professionals. Call 888-959-8213 to schedule an interview.
Moderate 1 Day Shore Excursion + Peterhof On this tour you will see all the major highlights of St Petersburg and visit the nearby town of Peterhof, which is 30 km away from St Petersburg. The trip to Peterhof, the former court of Peter the Great, is an unforgettable journey which takes about an hour from the city center. While driving through the city you will see its architectural style changing from the 18th century, through the Soviet era, to its modern style. Our driver will take an old road along the southern shore of the Gulf of Finland, a route which used to be compared with the one leading from Paris to Versailles. Close to the road is the official presidential residence, famous as the venue for the G8 meeting in 2006 and the G20 in 2013. A short metro trip on the way back from the countryside is a good chance to see St Petersburg's famous metro stations, which are decorated in the "Stalin Empire" style. From April to October more than 150 fountains can be seen in and around the Upper and Lower Parks of Peterhof. Filled with gilded statues, billions of water drops and greenery everywhere, the gardens will make you very happy! The fountains and park are included on the international UNESCO list as a unique piece of the world's heritage, and after visiting them you'll totally agree with the list's authors.
The unsurpassed Hermitage Museum collection will show you a host of world-famous works of art, including those by Picasso, Rembrandt and Rubens, presented in the majestic buildings of the Winter Palace, and the Small, New and Old Hermitages. You will also see the royal interiors of the former Romanov residence in the Winter Palace. On the subsequent boat tour, a wide panoramic view will introduce you to St Petersburg as a maritime capital, with its charming rivers and canals and unforgettable embankments. At the Church of the Savior on Spilled Blood, one of the city's most beautiful landmarks, it is a real experience to see the unique mosaic decorations, which were designed by well-known Russian artists.

Itinerary:
9:30 — 10:00 City tour
10:00 — 12:00 Trip to Peterhof, including a visit to the Upper and Lower Parks and one of the small palaces
12:00 — 13:00 Time for lunch (optional — meals are not included)
13:00 — 15:30 Visit to the Hermitage Museum
16:00 — 16:30 Tour of the Church of the Saviour on Spilled Blood
17:00 — 18:00 Canal trip by private boat
18:00 — 18:30 Return to the ship

On Mondays we can provide the following tours instead of the Hermitage Museum: the Russian Art Museum, the Faberge Museum, or the Yusupov Palace. Optional trips outside the city: Oranienbaum and Alexandria. Tours of the Grand Palace in Peterhof are available on request every day except Mondays.

Price per person in USD, by total number of travellers (decreasing as the group grows): 340, 257, 205, 173, 152, 141, 130, 121. Tours can be customized according to your taste. We can include additional excursions or performances upon request. Prices are per person, based on the group size. Prices for groups over 9 people are available on request.

Prices include: pick-up and drop-off from the port; tickets to all museums as per program; transportation to museums and for excursions; guide/driver service; tour ticket (also called a blanket visa) — the document which allows you to go ashore in St. Petersburg without a visa; free earphones for groups of 8 or more.

Prices don't include: lunch (although this can be included upon request — lunch in a high-quality Russian restaurant can be added to your tour for $18 per person); gratuities and tips.

Child/Student discounts: Children 6 years and under are free of charge. A $20 discount per day is offered to children 7-17 years old. A $15 discount per day is offered to students over the age of 17 (upon presentation of a valid ISIC card).

To book your tour we don't need a prepayment. You can pay in full at the end of your tour. We accept Visa, MasterCard and American Express. Please note that when paying by card a 5% transaction fee will be charged. You may cancel your tour reservation up to two days before your tour begins. Concert and theatre tickets are prepaid and are non-refundable. Please note: program timing and the order in which venues are visited may change due to traffic or weather conditions. Wheelchairs are available on request.
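The pricing rules above (per-person base price, the optional $18 lunch, the per-day child and student discounts, and the 5% card fee) compose into simple arithmetic. Here is a hedged sketch of a cost calculator; the function and parameter names are my own illustration, not part of any real booking system:

```python
# Hypothetical helper illustrating the pricing rules quoted above:
# a per-person base price, an optional $18 lunch, flat per-day
# discounts ($20 for children 7-17, $15 for students over 17 with an
# ISIC card), and a 5% surcharge when paying by card. Children 6 and
# under are free of charge.
def tour_cost(base_price, days=1, traveller="adult", lunch=False, pay_by_card=False):
    if traveller == "child6":          # children 6 years and under
        price = 0.0
    else:
        price = float(base_price)
        if traveller == "child":       # 7-17 years old
            price -= 20 * days
        elif traveller == "student":   # over 17, valid ISIC card
            price -= 15 * days
    if lunch:
        price += 18                    # optional restaurant lunch
    if pay_by_card:
        price *= 1.05                  # 5% card transaction fee
    return round(price, 2)

print(tour_cost(340, traveller="adult", lunch=True, pay_by_card=True))
```

Taking the top $340 rate as an example, an adult adding lunch and paying by card would come to about $375.90.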
A young guy has feelings for a girl whose brother has been holding a grudge against him for a long time and has been seeking revenge. The other lover of the girl is her brother's friend, and so he adds fuel to the flames of her brother's rage; the result is a tragic death for both the girl and the young guy. Jafar-Khan, who has returned to Iran after years of studying abroad, is so unfamiliar with how things are in his country that he makes others mad and has to go back. Frustrated as he is, he puts his hope in his grandson.
For information regarding meetings, events and training all members are welcome to apply for access to the Members Area. Our thoughts and prayers are with the Phillips family and all at Morningside Volunteer Fire Department. Charlie dedicated 70 years of volunteer service as a Life Member of Morningside VFD, past Prince George's County Fire Commissioner, and longtime member of the PGCVFRA Board of Directors. He will be missed by all. The following is the memorial and repast information for Charlie Phillips. We all need to be aware of the dangers of PTSD in the Volunteer Fire and EMS service. It's important for us to look out for each other because one 1st Responder lost to suicide is one too many. Thank you to the team from Badges United Foundation that took time for the presentation at our monthly meeting tonight. In 2018 Volunteer & Career units in Prince George's County responded to 151,538 calls for EMS service. That is an increase of 3% over 2017 calls for service. You can take an active role in your community by serving as a Volunteer Emergency Medical Technician in one of our Volunteer Fire & EMS Departments.
The best place to start is with an omelette. With just a couple of eggs and a few veggies you're good to go. A good basic recipe is an omelette with ham, tomato and some mushrooms. Swap it up each morning with other great hot ingredients such as onions, sausage and green peppers. While you're at the frying pan, another great family favourite is pancakes and french toast. Easy to make, they are great complements to morning veggies such as cooked spinach, bacon or some red potatoes. Give these varieties a try and you're sure to see some happy faces on the way out of the door. If mornings aren't the best time for you and you need something you can take with you on your commute, a classic breakfast sandwich is a great option. Grab some of that fresh bread in the pantry, cook an egg and a bit of bacon, put it all together and away you go. Another great way to warm up and travel is with a buttermilk biscuit and some gravy - however probably best enjoyed if you can travel without spilling. If these ideas are all sounding delicious but you just don't have the time in the morning, or if you're just looking for a break, speak with Great Harvest, Butte, MT on (406) 723-4988 and let us take care of breakfast for you and your family.
In a move initiated and supported by local legislators responding to customer complaints, Gov. Jerry Brown signed into law Friday authorization for an outside administrator to take over control of the embattled Sativa—Los Angeles County Water District. This includes dissolving the Board of Directors and wresting all control of the water district from the current operators. Sativa has been plagued for decades by accusations of mismanagement and a crumbling infrastructure with insufficient resources to maintain a quality water supply. That came to a head this spring when customers began posting videos of visually brown and brackish water, along with testimony of its offensive odor and taste. While the district attempted to assure customers that these issues were being addressed and the water tested safe, the years of neglect were coming to a head, and realities and perceptions outweighed pleas for patience. In boisterous Town Hall meetings hosted by Congresswoman Nanette Barragán this past May and June, promises were made for swift action to help the district's 1,600 customers, primarily in unincorporated Willowbrook, and others in the City of Compton. In July, at the urging of Supervisors Mark Ridley-Thomas and Janice Hahn, L.A. County's Local Agency Formation Commission, LAFCO, which has oversight authority over County water agencies, voted to dissolve Sativa. But two previous efforts by LAFCO to do the same, since 2005, fell flat, with relatively little community support. But the times are changing. Reluctant to endure another drawn-out attempt and the likelihood of legal battles, State Assemblyman Mike Gipson (D-Carson) introduced Assembly Bill 1577. If signed into law, it would enable the State Water Resources Control Board (SWRCB) to immediately appoint an outside administrator to take full control of the district, while completely eliminating the current Board of Directors.
Because it is an urgency law, which prevents any legal action against it, some expressed concern that this is a power play that endangers many of the thousands of small water systems in California. That was countered by others arguing Sativa is a unique example and its customers are in dire need of intervention and safe drinking water. After sailing through both houses of the legislature, Governor Brown has now signed it into law. AB 1577 would require the SWRCB to order Sativa to accept administrative and managerial services, including full management and control, from an administrator selected by the SWRCB. The L.A. County Department of Public Works (DPW) is expected to be appointed until a long-term water service provider can be identified. Sativa's acting General Manager, Thomas Martin, has not responded to requests for comment as to what, if any, actions the district intends to take in the face of the new law.
Q: Alternative for float on moving banner I'm in the process of making all my containers stay "fullsized" and positioned correctly, when the viewport is smaller than the resolution of the website. So far everything has gone easy, except for the navbar. On resize of window, the navbars text moves. It's currently held in place with float: right; is there an alternative to float that will work in this instance? I tried defining left and right margins, but the text doesn't stay in a straight line. Website in question is: njdartistics.com @charset "utf-8"; /* CSS Document */ @font-face { font-family: 'Bebas'; src: url(Website%20Specific%20Resources/BEBAS.TTF) } body { background-image:url(Website%20Specific%20Resources/Background.png); width:1920px; margin: auto; } .outercontainer{ height:100%; width:100% } .container { width:960px; margin: 0 auto; border-left:solid #000; border-right:solid #000; background-color:#AAADAD; } .header { background-image:url(Website%20Specific%20Resources/New_Banner_.png); height: 300px; width: 100%; top: 0; position: fixed; } A { text-decoration:none; font-family: "bebas"; color: #fff; } li { list-style:none; float:left; padding-right:50px; margin-left:25px; padding-top:15px; font-size:24px; } .nav { float: inherit; position: absolute; } .content { font-family: "bebas"; width: 900px; background-color:#AAADAD; } <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <title>Index</title> <link rel="stylesheet" type="text/css" href="styles.css"> </head> <body> <div class="header"> <div class="container"> <div class="nav"> <ul> <li><a href="#">Home </a></li> <li><a href="#">About Us </a></li> <li><a href="#">Products </a></li> <li><a href="#">Gallery </a></li> <li><a href="#">Contact Us </a></li> </ul> </div> </div> </div> <div class="outercontainer"> <div 
class="container"> <div class="content" > <p>asdasdasdasdaf</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>asdasdasdasdafasdasdasdasdafasdasdasdasdafasdasdasdasdafasdasda</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>sdasdafasdasdasdasdafasdasdasdasdafasdasdasdasdafasdasdasdasdafasdasdasdasdaf</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> </div> </div> </div> </body> </html> Thanks in advance. A: As I'm understanding this, you want the navbar text responsive and to stay all on one line, correct? To start, try changing the following css to something like this: body { background-image:url(Website%20Specific%20Resources/Background.png); font: 100%/1.5 serif; /* 16px */ width:1920px; margin: auto; } li { list-style:none; float:left; font-size:1.5em; font-size:3vw; padding:0 1%; width:15%; } .nav { float: inherit; position: absolute; width:100%; } Using a combination of em and vw handles the font-size (but it should be noted that vw, or viewport width, doesn't work in every browser (see here). For the browsers that don't like it, em can attempt to handle it by basing its size on whatever font size you set for the body. 
Alternatively (or, really, in addition; wrangle it as you please), you can use a media query to handle smaller viewport/screen sizes when it comes to the font-size. For example:

@media (max-width: 600px) {
    body {
        font-size: 80%;
    }
}

You can size up or down based on your needs. If you set it on the body, you don't have to rewrite the li or nav CSS, as the em units will respond accordingly. I would avoid setting font-size in anything BUT em, as px doesn't have much flexibility. That said, after all this, you really should set different breakpoints for mobile devices to, for example, stack the menu or semi-hide it behind a hamburger menu icon, as the text will otherwise get very, very small.

A:

.header {
    background-image: url(Website%20Specific%20Resources/New_Banner_.png);
    height: 300px;
    width: 1920px;
    top: 0;
    position: fixed;
}

This was the issue: the width was 100% instead of a fixed resolution.
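As a further alternative to floats (not mentioned in the answers above, and untested against the live site): modern browsers support flexbox, which keeps the links on one straight line without floating them at all. A minimal sketch using the question's own class names:

```css
/* Flexbox alternative to float for the navbar: lay the menu items
   out in a single row. Overrides the float:left on li from the
   original stylesheet. */
.nav ul {
  display: flex;
  margin: 0;
  padding: 0;
}
.nav li {
  float: none;
  flex: 0 0 auto;
  padding: 15px 50px 0 25px;  /* mirrors the original spacing */
}
```

Because flex items are laid out on a single line by default, the text stays aligned as the viewport changes, which is exactly the problem the floats were causing.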
Andy Harris & Jim Langevin Compare the voting records of Andy Harris and Jim Langevin in 2017-18. Represented Maryland's 1st Congressional District. This is his 4th term in the House. Jim Langevin Represented Rhode Island's 2nd Congressional District. This is his 9th term in the House. Andy Harris and Jim Langevin are from different parties and disagreed on 71 percent of votes in the 115th Congress (2017-18). June 15, 2018 — Stop the Importation and Trafficking of Synthetic Analogues Act May 24, 2018 — National Defense Authorization Act FY 2019 May 16, 2018 — Veterans Cemetery Benefit Correction Act Jan. 19, 2018 — Born-Alive Abortion Survivors Protection Act Sept. 26, 2017 — Increasing Opportunity and Success for Children and Parents through Evidence-Based Home Visiting Act June 29, 2017 — Kate's Law June 7, 2017 — Anti-Border Corruption Reauthorization Act Dec. 21, 2018 — Alaska Remote Generator Reliability and Protection Act Sept. 26, 2018 — Recognizing that allowing illegal immigrants the right to vote devalues the franchise and diminishes the voting power of the United States citizens Sept. 4, 2018 — Biometric Identification Transnational Migration Alert Program Authorization Act July 18, 2018 — Carbajal of California Amendment No. 81 June 19, 2018 — Stop Excessive Narcotics in our Retirement Communities Protection Act of 2018 April 26, 2018 — Lynch of Massachusetts Part A Amendment No. 87 Jan. 19, 2018 — Impeaching Donald John Trump, President of the United States, of high misdemeanors Oct. 4, 2017 — Scott of Virginia Substitute Amendment No. 2 Oct. 4, 2017 — Grijalva of Arizona Substitute Amendment No. 1 Sept. 7, 2017 — Goodlatte of Virginia Amendment No. 50 July 19, 2017 — Beyer of Virginia Part A Amendment No. 3 July 13, 2017 — Blumenauer of Oregon Part B Amendment No. 13 May 22, 2017 — Strengthening Children's Safety Act of 2017
J.I. Rodale, the founder of Rodale Inc., publisher of Prevention and several other magazines, was dead at the age of 72. What was even more shocking was where he died. The promoter of health in media was gone, but his legacy carried on. What does this have to do with content marketing? Everything. Not because of the interview, but because of the publishing company Rodale left behind. You see, Rodale didn't just create a magazine that "presented systematic ways people could try to prevent illness and disease." He left behind an understanding of the magazine industry that would propel Prevention to become one of the world's largest magazines (10 million monthly readers). What I want to do today is reveal how Rodale Inc. did that, and help you apply it to your own content marketing strategy. There is still an illusion to the effect that a magazine is a periodical in which advertising is incidental. But we don't look at it that way. A magazine is simply a device to induce people to read advertising. It is a large booklet with two departments – entertainment and business. The entertainment department finds stories, pictures, verse, etc. to interest the public. The business department makes the money. This simple concept is like most foundational concepts. The power isn't in knowing, it's in implementing the concepts. Rodale (and those at Rodale Inc. after him) implemented these concepts so well that it became a science. In 2014, Rodale publications reached an all-time high gross readership of 37.7 million readers. I accidentally discovered Rodale Inc.'s strategy. It all began when my wife started receiving Prevention magazine and Rodale began to send her promotional magazines (magalogs). The next thing I knew she ordered some of those books and products promoted in those magalogs. You need to understand something. Unlike me, my wife doesn't like (or even care about) marketing. 
For her to read these marketing materials, like them, and respond to them meant Rodale was doing something right. That led me to pay attention to its efforts and research its current and past practices. The results are revealed in the five-step strategy. Rodale Inc. never creates content for the sake of creating content. It always has a specific audience in mind. All of the content is designed to appeal, attract, and help that audience. With an often irreverent, always authoritative tone, EatClean.com will be the gathering place for clean-food insiders and experimenters, uncovering the latest trends, innovations, opinions, products, and recipes. Rodale's mindset on creating content is different from that of many marketers. Others usually focus on a generic audience. They create content for "urgent" reasons such as a blog post is scheduled; it's a topic we've wanted to discuss; it's something everybody else is talking about right now. Who is the specific audience we are trying to attract? What do they want to share? Rodale always creates content with purpose and on purpose. I once heard author and podcaster Paul Colligan say, "I am not focused on listeners. I am focused on subscribers." This should be true for content creators of all kinds. Why? Readers are great, but subscribers are what really matters. Uncover a specific audience that wants to hear from you by letting them raise their hands and identify themselves. Gain permission to contact them. Create content that is expected and anticipated by your audience. Build a deep relationship with your audience. Contact your audience for free (or be paid as the magazine subscription model provides). Email marketing has shown to be much more powerful than any social media tool. And physical mailing lists always have been an asset that every business should grow. Of course, Rodale appreciates it when people walk into a bookstore and buy one of its magazines. 
But Rodale isn't focused on single-copy sales; it knows subscribers offer the most value. Never forget that is true for you, no matter what type of content you are creating. This is the copy writer's task: not to create this mass desire – but to channel and direct it. Actually, it would be impossible for any one advertiser to spend enough money to actually create this mass desire. He can only exploit it. And he dies when he tries to run against it. Let me repeat. This mass desire must already be there. It must already exist. You cannot create it and you cannot fight it. But you can – and must – direct it, channel it. Focus it onto your particular product. You cannot create a desire for your product or service. It doesn't matter if you love your product. All that matters is that your prospects desire it. That's not only true for copywriters. It's true of all successful content marketers. You must know what your audiences' desires are and how to appeal to them. How can you do that? Think of your content marketing as a way to discover the desires of your audience. Write for a specific audience and pay attention to how that content resonates with them. See what gets the most comments and shares. But that's just the beginning. Talk to those subscribers on whom you've been focusing and ask what they want. Rodale constantly studies the desires of its audiences. It looks at trends. It even conducts an annual survey to get consumers' reactions to specific areas and types of information. People are more in tune than ever with the relationship between what they eat, how they look and feel, and how our food affects the planet. Everywhere you shop or eat now, you can find some food label – whether something is sustainably sourced, real, natural, pure, non-GMO, local, gluten-free – that raises more questions than it answers. 
Eat Clean helps people negotiate this crazy landscape and figure out how to 'eat clean' in a way that makes the most sense for their own health, goals, and ethics. Rodale understands that unless it knows what a specific audience wants, it can never create solutions that they will buy. You must never forget this. It's the key to success in sales conversions. Once you know the audience's desires, you can create the things that they want to buy or position what you already offer in ways that appeal to them. People have different content-consumption preferences. Some types of content have more perceived value than others. With this in mind, Rodale created all kinds of different info-products, products, and services to appeal not only to the different desires of its audiences but to the different ways the audiences want to consume information. In addition to magazines, Rodale offers books, products, an online university, and host-branded events. Now obviously, you don't want to create all of these solutions at once. Start with one idea and build from there. How can you get ideas on where to start? Look to the content you've created for format ideas. All of the content you create must be designed to attract and help that hungry crowd with money. Your products and services must meet the needs and/or desires that a crowd of people with money has. Rodale knows that to help your audience the most, and at the same time maximize your profits, you must create many types of solutions for your prospects to buy. If you want to maximize your sales, then you must do the same. The other thing Rodale does that many content marketers do not do is they rely on more than content marketing — they also rely on content selling. Content marketing can take you far but without an offer or ask, it's difficult to get a sale. At some point, you must focus on content that intentionally sells. What's the difference? 
I recently heard Todd Brown of MarketingFunnelAutomation.com give this powerful explanation on the difference between marketing and selling. I'll paraphrase it: Marketing is when you talk about the prospects – the prospect's situation, needs, wants, and what's in the best interest of your prospect. In other words, what they should be doing to alleviate their problem … Selling is when you talk about you, your product, your product's benefits, features, advantages, risk-reversal, bonuses – that's selling. In other words, content marketing is content focused on attracting members of an audience, educating them, and helping them. Content selling is content focused on why the audience members should buy what you are offering. Rodale doesn't just focus on content to attract its audiences (magazines = content marketing). It also is focused on content that offers a solution (magalogs and sales letters = content selling). That is why Rodale is so successful. That's why my wife has purchased things from the company. Its magazines alone would never have been able to accomplish this success. Too many content marketers never ask for the sale. They waste much of their content marketing efforts. Rodale never loses sight of the two purposes of a magazine – to gather an audience and to make a profit. This balanced focus and the powerful way it is implemented are the keys to Rodale's amazing success. If you want to see better results, then you must do the same. This five-step strategy can act as a template to improve your content marketing strategy. Answer these questions to see which step you need to focus on. Once you can answer yes to a question, proceed to the next. Have I determined the specific audience on which we are focused? Have I focused sufficiently on subscribers? Do I know the existing desires of this audience? Have I created enough different types of solutions to sell to them? 
Have I created content that encourages these audience members to purchase and explained why they should? Following Rodale's model, you can ask the right questions to develop the best answers and create a well-informed content marketing strategy. Want more help enhancing your content marketing strategy and structuring your team for more effective content marketing? Read CMI's e-book: Building the Perfect Content Marketing Mix.
Triton Station
A Blog About the Science and Sociology of Cosmology and Dark Matter

Big Trouble in a Deep Void

The following is a guest post by Indranil Banik, Moritz Haslbauer, and Pavel Kroupa (bios at end) based on their new paper.

Modifying gravity to save cosmology

Cosmology is currently in a major crisis because of many severe tensions, the most serious and well-known being that local observations of how quickly the Universe is expanding (the so-called 'Hubble constant') exceed the prediction of the standard cosmological model, ΛCDM. This prediction is based on the cosmic microwave background (CMB), the most ancient light we can observe – which is generally thought to have been emitted about 400,000 years after the Big Bang. For ΛCDM to fit the pattern of fluctuations observed in the CMB by the Planck satellite and other experiments, the Hubble constant must have a particular value of 67.4 ± 0.5 km/s/Mpc. Local measurements are nearly all above this 'Planck value', but are consistent with each other.
In our paper, we use a local value of 73.8 ± 1.1 km/s/Mpc using a combination of supernovae and gravitationally lensed quasars, two particularly precise yet independent techniques. This unexpectedly rapid local expansion of the Universe could be due to us residing in a huge underdense region, or void. However, a void wide and deep enough to explain the Hubble tension is not possible in ΛCDM, which is built on Einstein's theory of gravity, General Relativity. Still, there is quite strong evidence that we are indeed living within a large void with a radius of about 300 Mpc, or one billion light years. This evidence comes from many surveys covering the whole electromagnetic spectrum, from radio to X-rays. The most compelling evidence comes from analysis of galaxy number counts in the near-infrared, giving the void its name of the Keenan-Barger-Cowie (KBC) void. Gravity from matter outside the void would pull more than matter inside it, making the Universe appear to expand faster than it actually is for an observer inside the void. This 'Hubble bubble' scenario (depicted in Figure 1) could solve the Hubble tension, a possibility considered – and rejected – in several previous works (e.g. Kenworthy+ 2019). We will return to their objections against this idea. Figure 1: Illustration of the Universe's large scale structure. The darker regions are voids, and the bright dots represent galaxies. The arrows show how gravity from surrounding denser regions pulls outwards on galaxies in a void. If we were living in such a void (as indicated by the yellow star), the Universe would expand faster locally than it does on average. This could explain the Hubble tension. Credit: Technology Review One of the main objections seemed to be that since such a large and deep void is incompatible with ΛCDM, it can't exist. This is a common way of thinking, but the problem with it was clear to us from a very early stage. 
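As a quick back-of-the-envelope check on the two values quoted above (73.8 ± 1.1 vs. 67.4 ± 0.5 km/s/Mpc), one can combine the Gaussian errors in quadrature. This naive estimate is an illustration only, not the analysis in the paper, but it already puts the tension above 5σ:

```python
import math

# H0 values quoted in the text (km/s/Mpc)
h0_local,  err_local  = 73.8, 1.1   # supernovae + lensed quasars
h0_planck, err_planck = 67.4, 0.5   # Planck CMB fit under LCDM

# Naive Gaussian tension: difference over quadrature-summed errors
tension_sigma = abs(h0_local - h0_planck) / math.hypot(err_local, err_planck)
print(f"Hubble tension: {tension_sigma:.1f} sigma")  # ~5.3 sigma
```

A 5σ-level disagreement between two internally consistent measurement families is why this is called a crisis rather than a statistical fluke.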
The first part of this logic is sound – assuming General Relativity, a hot Big Bang, and that the state of the Universe at early times is apparent in the CMB (i.e. it was flat and almost homogeneous then), we are led to the standard flat ΛCDM model. By studying the largest suitable simulation of this model (called MXXL), we found that it should be completely impossible to find ourselves inside a void with the observed size and depth (or fractional underdensity) of the KBC void – this possibility can be rejected with more confidence than the discovery of the Higgs boson when first announced. We therefore applied one of the leading alternative gravity theories called Milgromian Dynamics (MOND), a controversial idea developed in the early 1980s by Israeli physicist Mordehai Milgrom. We used MOND (explained in a simple way here) to evolve a small density fluctuation forwards from early times, studying if 13 billion years later it fits the density and velocity field of the local Universe. Before describing our results, we briefly introduce MOND and explain how to use it in a potentially viable cosmological framework. Astronomers often assume MOND cannot be extended to cosmological scales (typically >10 Mpc), which is probably true without some auxiliary assumptions. This is also the case for General Relativity, though in that case the scale where auxiliary assumptions become crucial is only a few kpc, namely in galaxies. MOND was originally designed to explain why galaxies rotate faster in their outskirts than they should if one applies General Relativity to their luminous matter distribution. This discrepancy gave rise to the idea of dark matter halos around individual galaxies. For dark matter to cluster on such scales, it would have to be 'cold', or equivalently consist of rather heavy particles (above a few thousand eV/c2, or a millionth of a proton mass). Any lighter and the gravity from galaxies could not hold on to the dark matter. 
MOND assumes these speculative and unexplained cold dark matter haloes do not exist – the need for them is after all dependent on the validity of General Relativity. In MOND once the gravity from any object gets down to a certain very low threshold called a0, it declines more gradually with increasing distance, following an inverse distance law instead of the usual inverse square law. MOND has successfully predicted many galaxy rotation curves, highlighting some remarkable correlations with their visible mass. This is unexpected if they mostly consist of invisible dark matter with quite different properties to visible mass. The Local Group satellite galaxy planes also strongly favour MOND over ΛCDM, as explained using the logic of Figure 2 and in this YouTube video. Figure 2: the satellite galaxies of the Milky Way and Andromeda mostly lie within thin planes. These are difficult to form unless the galaxies in them are tidal dwarfs born from the interaction of two major galaxies. Since tidal dwarfs should be free of dark matter due to the way they form, the satellites in the satellite planes should have rather weak self-gravity in ΛCDM. This is not the case as measured from their high internal velocity dispersions. So the extra gravity needed to hold galaxies together should not come from dark matter that can in principle be separated from the visible. To extend MOND to cosmology, we used what we call the νHDM framework (with ν pronounced "nu"), originally proposed by Angus (2009). In this model, the cold dark matter of ΛCDM is replaced by the same total mass in sterile neutrinos with a mass of only 11 eV/c2, almost a billion times lighter than a proton. Their low mass means they would not clump together in galaxies, consistent with the original idea of MOND to explain galaxies with only their visible mass. This makes the extra collisionless matter 'hot', hence the name of the model. 
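The low-acceleration behaviour described above can be made concrete in a few lines. The sketch below uses the so-called 'simple' interpolating function (one common choice; the post does not commit to a specific one) to show how MOND turns a point mass into a flat rotation curve with asymptotic speed (G M a0)^(1/4):

```python
import math

G  = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10     # Milgrom's acceleration scale a0, m/s^2

def mond_accel(g_newton):
    """True acceleration g for Newtonian acceleration g_N, using the
    'simple' interpolating function mu(x) = x/(1+x), which gives
    g = g_N/2 + sqrt(g_N^2/4 + g_N*a0).  For g_N >> a0 this is ~g_N
    (Newtonian regime); for g_N << a0 it is ~sqrt(g_N*a0) (deep MOND)."""
    return 0.5 * g_newton + math.sqrt(0.25 * g_newton**2 + g_newton * A0)

def circular_speed(mass_kg, radius_m):
    """Circular orbital speed around a point mass in MOND."""
    g_n = G * mass_kg / radius_m**2
    return math.sqrt(mond_accel(g_n) * radius_m)

# A galaxy-like point mass of 1e11 solar masses (~2e41 kg):
for r in (1.5e21, 3.0e21):   # roughly 50 and 100 kpc
    print(f"v({r:.1e} m) = {circular_speed(2e41, r) / 1e3:.0f} km/s")
# The speed stays near (G*M*A0)**0.25, about 200 km/s, instead of
# falling off as 1/sqrt(r) as Newtonian gravity would predict.
```

In the deep-MOND limit the asymptotic speed depends only on the visible mass, which is the Tully-Fisher-like behaviour behind the successful rotation-curve predictions mentioned here.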
But this collisionless matter would exist inside galaxy clusters, helping to explain unusual configurations like the Bullet Cluster and the unexpectedly strong gravity (even in MOND) in quieter clusters. Considering the universe as a whole, νHDM has the same overall matter content as ΛCDM. This makes the overall expansion history of the universe very similar in both models, so both can explain the amounts of deuterium and helium produced in the first few minutes after the Big Bang. They should also yield similar fluctuations in the CMB because both models contain the same amount of dark matter. These fluctuations would get somewhat blurred by sterile neutrinos of such a low mass due to their rather fast motion in the early Universe. However, it has been demonstrated that Planck data are consistent with dark matter particles more massive than 10 eV/c2. Crucially, we showed that the density fluctuations evident in the CMB typically yield a gravitational field strength of 21 a0 (correcting an earlier erroneous estimate of 570 a0 in the above paper), making the gravitational physics nearly identical to General Relativity. Clearly, the main lines of early Universe evidence used to argue in favour of ΛCDM are not sufficiently unique to distinguish it from νHDM (Angus 2009). The models nonetheless behave very differently later on. We estimated that for redshifts below about 50 (when the Universe is older than about 50 million years), the gravity would typically fall below a0 thanks to the expansion of the Universe (the CMB comes from a redshift of 1100). After this 'MOND moment', both the ordinary matter and the sterile neutrinos would clump on large scales just like in ΛCDM, but there would also be the extra gravity from MOND. This would cause structures to grow much faster (Figure 3), allowing much wider and deeper voids. 
Figure 3: Evolution of the density contrast within a 300 co-moving Mpc sphere in different Newtonian (red) and MOND (blue) models, shown as a function of the Universe's size relative to its present size (this changes almost linearly with time). Notice the much faster structure growth in MOND. The solid blue line uses a time-independent external field on the void, while the dot-dashed blue line shows the effect of a stronger external field in the past. This requires a deeper initial void to match present-day observations. We used this basic framework to set up a dynamical model of the void. By making various approximations and trying different initial density profiles, we were able to simultaneously fit the apparent local Hubble constant, the observed density profile of the KBC void, and many other observables like the acceleration parameter, which we come to below. We also confirmed previous results that the same observables rule out standard cosmology at 7.09σ significance. This is much more than the typical threshold of 5σ used to claim a discovery in cases like the Higgs boson, where the results agree with prior expectations. One objection to our model was that a large local void would cause the apparent expansion of the Universe to accelerate at late times. Equivalently, observations that go beyond the void should see a standard Planck cosmology, leading to a step-like behaviour near the void edge. At stake is the so-called acceleration parameter q0 (which we defined oppositely to convention to correct a historical error). In ΛCDM, we expect q0 = 0.55, while in general much higher values are expected in a Hubble bubble scenario. The objection of Kenworthy+ (2019) was that since the observed q0 is close to 0.55, there is no room for a void. However, their data analysis fixed q0 to the ΛCDM expectation, thereby removing any hope of discovering a deviation that might be caused by a local void. Other analyses (e.g. 
Camarena & Marra 2020b) which do not make such a theory-motivated assumption find q0 = 1.08, which is quite consistent with our best-fitting model (Figure 4). We also discussed other objections to a large local void, for instance the Wu & Huterer (2017) paper which did not consider a sufficiently large void, forcing the authors to consider a much deeper void to try and solve the Hubble tension. This led to some serious observational inconsistencies, but a larger and shallower void like the observed KBC void seems to explain the data nicely. In fact, combining all the constraints we applied to our model, the overall tension is only 2.53σ, meaning the data have a 1.14% chance of arising if ours were the correct model. The actual observations are thus not the most likely consequence of our model, but could plausibly arise if it were correct. Given also the high likelihood that some if not all of the observational errors we took from publications are underestimates, this is actually a very good level of consistency. Figure 4: The predicted local Hubble constant (x-axis) and acceleration parameter (y-axis) as measured with local supernovae (black dot, with red error ellipses). Our best-fitting models with different initial void density profiles (blue symbols) can easily explain the observations. However, there is significant tension with the prediction of ΛCDM based on parameters needed to fit Planck observations of the CMB (green dot). In particular, local observations favour a higher acceleration parameter, suggestive of a local void. Unlike other attempts to solve the Hubble tension, ours is unique in using an already existing theory (MOND) developed for a different reason (galaxy rotation curves). The use of unseen collisionless matter made of hypothetical sterile neutrinos is still required to explain the properties of galaxy clusters, which otherwise do not sit well with MOND. 
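The significance levels quoted above (7.09σ against standard cosmology, and 2.53σ for the void model with its quoted 1.14% chance) are two-sided Gaussian tail probabilities, which can be checked directly:

```python
import math

def two_sided_p(sigma):
    """Two-sided tail probability of a Gaussian at a given sigma level."""
    return math.erfc(sigma / math.sqrt(2.0))

print(two_sided_p(2.53))  # ~0.0114: the 1.14% chance quoted for the void model
print(two_sided_p(5.0))   # ~5.7e-7: the usual 5-sigma discovery threshold
print(two_sided_p(7.09))  # ~1e-12: the quoted exclusion of standard cosmology
```

This makes the contrast in the text concrete: 2.53σ is an unremarkable level of tension for a viable model, while 7.09σ is far beyond the conventional discovery threshold.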
In addition, these neutrinos provide an easy way to explain the CMB and background expansion history, though recently Skordis & Zlosnik (2020) showed that this is possible in MOND with only ordinary matter. In any case, MOND is a theory of gravity, while dark matter is a hypothesis that more matter exists than meets the eye. The ideas could both be right, and should be tested separately. A dark matter-MOND hybrid thus appears to be a very promising way to resolve the current crisis in cosmology. Still, more work is required to construct a fully-fledged relativistic MOND theory capable of addressing cosmology. This could build on the theory proposed by Skordis & Zlosnik (2019) in which gravitational waves travel at the speed of light, which was considered to be a major difficulty for MOND. We argued that such a theory would enhance structure formation to the required extent under a wide range of plausible theoretical assumptions, but this needs to be shown explicitly starting from a relativistic MOND theory. Cosmological structure formation simulations are certainly required in this scenario – these are currently under way in Bonn. Further observations would also help greatly, especially of the matter density in the outskirts of the KBC void at distances of about 500 Mpc. This could hold vital clues to how quickly the void has grown, helping to pin down the behaviour of the sought-after MOND theory. There is now a very real prospect of obtaining a single theory that works across all astronomical scales, from the tiniest dwarf galaxies up to the largest structures in the Universe & its overall expansion rate, and from a few seconds after the birth of the Universe until today. Rather than argue whether this theory looks more like MOND or standard cosmology, what we should really do is combine the best elements of both, paying careful attention to all observations. 
Indranil Banik is a Humboldt postdoctoral fellow in the Helmholtz Institute for Radiation and Nuclear Physics (HISKP) at the University of Bonn, Germany. He did his undergraduate and masters at Trinity College, Cambridge, and his PhD at Saint Andrews under Hongsheng Zhao. His research focuses on testing whether gravity continues to follow the Newtonian inverse square law at the low accelerations typical of galactic outskirts, with MOND being the best-developed alternative. Moritz Haslbauer is a PhD student at the Max Planck Institute for Radio Astronomy (MPIfR) in Bonn. He obtained his undergraduate degree from the University of Vienna and his masters from the University of Bonn. He works on the formation and evolution of galaxies and their distribution in the local Universe in order to test different cosmological models and gravitational theories. Prof. Pavel Kroupa is his PhD supervisor. Pavel Kroupa is a professor at the University of Bonn and professorem hospitem at Charles University in Prague. He went to school in Germany and South Africa, studied physics in Perth, Australia, and obtained his PhD at Trinity College, Cambridge, UK. He researches stellar populations and their dynamics as well as the dark matter problem, therewith testing gravitational theories and cosmological models.

Link to the published science paper. YouTube video on the paper. Contact: ibanik@astro.uni-bonn.de. Indranil Banik's YouTube channel.

By tritonstation in Cosmology, Dark Matter, LCDM, MOND. October 23, 2020 / November 19, 2020.

Cosmology, then and now

I have been busy teaching cosmology this semester. When I started on the faculty of the University of Maryland in 1998, there was no advanced course on the subject. This seemed like an obvious hole to fill, so I developed one.
I remember with fond bemusement the senior faculty, many of them planetary scientists, sending Mike A'Hearn as a stately ambassador to politely inquire if cosmology had evolved beyond a dodgy subject and was now rigorous enough to be worthy of a 3-credit graduate course. Back then, we used transparencies or wrote on the board. It was novel to have a course web page. I still have those notes, and marvel at the breadth and depth of work performed by my younger self. Now that I'm teaching it for the first time in a decade, I find it challenging to keep up. Everything has to be adapted to an electronic format, and be delivered remotely during this damnable pandemic. It is a less satisfactory experience, and it has precluded posting much here. Another thing I notice is that attitudes have evolved along with the subject. The baseline cosmology, LCDM, has not changed much. We've tilted the power spectrum and spiked it with extra baryons, but the basic picture is that which emerged from the application of classical observational cosmology – measurements of the Hubble constant, the mass density, the ages of the oldest stars, the abundances of the light elements, number counts of faint galaxies, and a wealth of other observational constraints built up over decades of effort. Here is an example of combining such constraints, an exercise I have students do every time I teach the course: Observational constraints in the mass density-Hubble constant plane assembled by students in my cosmology course in 2002. The gray area is excluded. The open window is the only space allowed; this is LCDM. The box represents the first WMAP estimate in 2003. CMB estimates have subsequently migrated out of the allowed region to lower H0 and higher mass density, but the other constraints have not changed much, most famously H0, which remains entrenched in the low to mid-70s. These things were known by the mid-90s.
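To give a feel for the kind of constraint being combined in that exercise, here is a minimal sketch (my own illustration, not the course materials) of one axis of that plane: the age of a flat ΛCDM universe as a function of H0 and the matter density, which must exceed the ages of the oldest stars:

```python
import math

def age_gyr(h0, omega_m, n=100_000):
    """Age of a flat LCDM universe in Gyr, from
    t0 = (1/H0) * integral_0^1 da / sqrt(Om/a + (1-Om)*a^2),
    evaluated with a simple midpoint rule."""
    omega_l = 1.0 - omega_m
    da = 1.0 / n
    total = 0.0
    for i in range(n):
        a = (i + 0.5) * da
        total += da / math.sqrt(omega_m / a + omega_l * a * a)
    # 977.8/H0 is the Hubble time in Gyr for H0 in km/s/Mpc
    return total * 977.8 / h0

print(age_gyr(70.0, 0.3))  # ~13.5 Gyr: older than the oldest stars, as required
print(age_gyr(70.0, 1.0))  # ~9.3 Gyr: a matter-only universe is too young
```

Demanding an age above roughly 12-13 Gyr while H0 sits in the low to mid-70s is exactly the kind of argument that carved out the low-density, Lambda-dominated window described above, well before the supernova results.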
Nowadays, people seem to think Type Ia SN discovered Lambda, when really they were just icing on a cake that was already baked. The location of the first peak in the acoustic power spectrum of the microwave background was corroborative of the flat geometry required by the picture that had developed, but trailed the development of LCDM rather than informing its construction. But students entering the field now seem to have been given the impression that these were the only observations that mattered. Worse, they seem to think these things are Known, as if there's never been a time that we cosmologists have been sure about something only to find later that we had it quite wrong. This attitude is deleterious to the progress of science, as it precludes us from seeing important clues when they fail to conform to our preconceptions. To give one recent example, everyone seems to have decided that the EDGES observation of 21 cm absorption during the dark ages is wrong. The reason? Because it is impossible in LCDM. There are technical reasons why it might be wrong, but these are subsidiary to Attitude: we can't believe it's true, so we don't. But that's what makes a result important: something that makes us reexamine how we perceive the universe. If we're unwilling to do that, we're no longer doing science. By tritonstation in Cosmology, LCDM, Personal Experience, Philosophy of Science, Sociology. October 1, 2020 / October 7, 2020. A Significant Theoretical Advance The missing mass problem has been with us many decades now. Going on a century if you start counting from the work of Oort and Zwicky in the 1930s. Not quite half a century if we date it from the 1970s when most of the relevant scientific community started to take it seriously. Either way, that's a very long time for a major problem to go unsolved in physics.
The quantum revolution that overturned our classical view of physics was lightning fast in comparison – see the discussion of Bohr's theory in the foundation of quantum mechanics in David Merritt's new book. To this day, despite tremendous efforts, we have yet to obtain a confirmed laboratory detection of a viable dark matter particle – or even a hint of persuasive evidence for the physics beyond the Standard Model of Particle Physics (e.g., supersymmetry) that would be required to enable the existence of such particles. We cannot credibly claim (as many of my colleagues insist they can) to know that such invisible mass exists. All we really know is that there is a discrepancy between what we see and what we get: the universe and the galaxies within it cannot be explained by General Relativity and the known stable of Standard Model particles. If we assume that General Relativity is both correct and sufficient to explain the universe, which seems like an excellent assumption, then we are indeed obliged to invoke non-baryonic dark matter. The amount of astronomical evidence that points in this direction is overwhelming. That is how we got to where we are today: once we make the obvious, eminently well-motivated assumption, then we are forced along a path in which we become convinced of the reality of the dark matter, not merely as a hypothetical convenience to cosmological calculations, but as an essential part of physical reality. I think that the assumption that General Relativity is correct is indeed an excellent one. It has repeatedly passed many experimental and observational tests too numerous to elaborate here. However, I have come to doubt the assumption that it suffices to explain the universe. The only data that test it on scales where the missing mass problem arises is the data from which we infer the existence of dark matter. Which we do by assuming that General Relativity holds.
The opportunity for circular reasoning is apparent – and frequently indulged. It should not come as a shock that General Relativity might not be completely sufficient as a theory in all circumstances. This is exactly the motivation for and the working presumption of quantum theories of gravity. That nothing to do with cosmology will be affected along the road to quantum gravity is just another assumption. I expect that some of my colleagues will struggle to wrap their heads around what I just wrote. I sure did. It was the hardest thing I ever did in science to accept that I might be wrong to be so sure it had to be dark matter – because I was sure it was. As sure of it as any of the folks who remain sure of it now. So imagine my shock when we obtained data that made no sense in terms of dark matter, but had been predicted in advance by a completely different theory, MOND. When comparing dark matter and MOND, one must weigh all evidence in the balance. Much of the evidence is gratuitously ambiguous, so the conclusion to which one comes depends on how one weighs the more definitive lines of evidence. Some of this points very clearly to MOND, while other evidence prefers non-baryonic dark matter. One of the most important lines of evidence in favor of dark matter is the acoustic power spectrum of the cosmic microwave background (CMB) – the pattern of minute temperature fluctuations in the relic radiation field imprinted on the sky a few hundred thousand years after the Big Bang. The equations that govern the acoustic power spectrum require General Relativity, but thankfully the small amplitude of the temperature variations permits them to be solved in the limit of linear perturbation theory. So posed, they can be written as a damped and driven oscillator. The power spectrum shows features corresponding to standing waves at the epoch of recombination, when the universe transitioned rather abruptly from an opaque plasma to a transparent neutral gas.
The edge of a cloud provides an analog: light inside the cloud scatters off the water molecules and doesn't get very far: the cloud is opaque. Any light that makes it to the edge of the cloud meets no further resistance, and is free to travel to our eyes – which is how we perceive the edge of the cloud. The CMB is the expansion-redshifted edge of the plasma cloud of the early universe. An easy way to think about a damped and a driven oscillator is a kid being pushed on a swing. The parent pushing the child is a driver of the oscillation. Any resistance – like the child dragging his feet – damps the oscillation. Normal matter (baryons) damps the oscillations – it acts as a net drag force on the photon fluid whose oscillations we observe. If there is nothing going on but General Relativity plus normal baryons, we should see a purely damped pattern of oscillations in which each peak is smaller than the one before it, as seen in the solid line here: The CMB acoustic power spectrum predicted by General Relativity with no cold dark matter (line) and as observed by the Planck satellite (data points). As one can see, the case of no Cold Dark Matter (CDM) does well to explain the amplitudes of the first two peaks. Indeed, it was the only hypothesis to successfully predict this aspect of the data in advance of its observation. The small amplitude of the second peak came as a great surprise from the perspective of LCDM. However, without CDM, there is only baryonic damping. Each peak should have a progressively lower amplitude. This is not observed. Instead, the third peak is almost the same amplitude as the second, and clearly higher than expected in the pure damping scenario of no-CDM. CDM provides a net driving force in the oscillation equations. It acts like the parent pushing the kid. Even though the kid drags his feet, the parent keeps pushing, and the amplitude of the oscillation is maintained. For the third peak at any rate. 
The baryons are an intransigent child and keep dragging their feet; eventually they win and the power spectrum damps away on progressively finer angular scales (large 𝓁 in the plot). As I wrote in this review, the excess amplitude of the third peak over the no-CDM prediction is the best evidence to my mind in favor of the existence of non-baryonic CDM. Indeed, this observation is routinely cited by many cosmologists to absolutely require dark matter. It is argued that the observed power spectrum is impossible without it. The corollary is that any problem the dark matter picture encounters is a mere puzzle. It cannot be an anomaly because the CMB tells us that CDM has to exist. Impossible is a high standard. I hope the reader can see the flaw in this line of reasoning. It is the same as above. In order to compute the oscillation power spectrum, we have assumed General Relativity. While they do not replace it, the persistent predictive successes of a theory like MOND imply the existence of a more general theory. We do not know that such a theory cannot explain the CMB until we develop said theory and work out its predictions. That said, it is a tall order. One needs a theory that provides a significant driving term without a large amount of excess invisible mass. Something has to push the swing in a universe full of stuff that only drags its feet. That does seem nigh on impossible. Or so I thought until I heard a talk by Pedro Ferreira where he showed how the scalar field in TeVeS – the relativistic MONDian theory proposed by Bekenstein – might play the same role as CDM. However, he and his collaborators soon showed that the desired effect was indeed impossible, at least in TeVeS: one could not simultaneously fit the third peak and the data preceding the first. This was nevertheless an important theoretical development, as it showed how it was possible, at least in principle, to affect the peak ratios without massive amounts of non-baryonic CDM.
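The swing analogy can be made concrete with a toy oscillator. The sketch below is emphatically not the real linearized perturbation equations, and all parameter values are arbitrary; it only illustrates that a damping term alone makes successive peaks shrink, while a resonant driving term sustains the amplitude:

```python
import math

# Toy damped/driven oscillator: integrate x'' = -w^2 x - b x' + F cos(w t)
# and record |x| at successive maxima. With F = 0 (baryon damping only),
# each peak is smaller than the last; with a driving term (the CDM analog),
# the amplitude is maintained. All parameter values are illustrative.
def peak_amplitudes(drive=0.0, b=0.2, w=1.0, dt=0.001, t_max=40.0):
    x, v, t = 1.0, 0.0, 0.0
    peaks = []
    prev_v = v
    while t < t_max:
        a = -w * w * x - b * v + drive * math.cos(w * t)
        v += a * dt          # semi-implicit Euler step
        x += v * dt
        t += dt
        if prev_v > 0.0 >= v:  # velocity changes sign: a local maximum of x
            peaks.append(abs(x))
        prev_v = v
    return peaks

damped = peak_amplitudes(drive=0.0)   # successive peaks only shrink
driven = peak_amplitudes(drive=0.5)   # driving sustains the amplitude
```

With drive = 0 the peaks decay monotonically; with the driving term switched on they settle at a sustained amplitude, which is the role CDM plays in the oscillation equations.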
At this juncture, there are two options. One is to seek a theory that might work, and develop it to the point where it can be tested. This is a lot of hard work that is bound to lead one down many blind alleys without promise of ultimate success. The much easier option is to assume that it cannot be done. This is the option adopted by most cosmologists, who have spent the last 15 years arguing that the CMB power spectrum requires the existence of CDM. Some even seem to consider it to be a detection thereof, in which case we might wonder why we bother with all those expensive underground experiments to detect the stuff. Rather fewer people have invested in the approach that requires hard work. There are a few brave souls who have tried it; these include Constantinos Skordis and Tom Złosnik. Very recently, they have shown a version of a relativistic MOND theory (which they call RelMOND) that does fit the CMB power spectrum. Here is the plot from their paper: Note that the black line in their plot is the fit of the LCDM model to the Planck power spectrum data. Their theory does the same thing, so it necessarily fits the data as well. Indeed, a good fit appears to follow for a range of parameters. This is important, because it implies that little or no fine-tuning is needed: this is just what happens. That is arguably better than the case for LCDM, in which the fit is very fine-tuned. Indeed, that was a large part of the point of making the measurement, as it requires a very specific set of parameters in order to work. It also leads to tensions with independent measurements of the Hubble constant, the baryon density, and the amplitude of the matter power spectrum at low redshift. As with any good science result, this one raises a host of questions. It will take time to explore these. But this in itself is a momentous result. Irrespective of whether RelMOND is the right theory or, like TeVeS, just a step on a longer path, it shows that the impossible is in fact possible.
The argument that I have heard repeated by cosmologists ad nauseam like a rosary prayer, that dark matter is the only conceivable way to explain the CMB power spectrum, is simply WRONG. By tritonstation in Cosmology, Dark Matter, LCDM, MOND. July 13, 2020 / July 16, 2020. The Hubble Constant from the Baryonic Tully-Fisher Relation The distance scale is fundamental to cosmology. "How big is the universe?" is pretty much the first question we ask when we look at the Big Picture. The primary yardstick we use to describe the scale of the universe is Hubble's constant: the H0 in v = H0 D that relates the recession velocity (redshift) of a galaxy to its distance. More generally, this is the current expansion rate of the universe. Pick up any book on cosmology and you will find a lengthy disquisition on the importance of this fundamental parameter that encapsulates the size, age, critical density, and potential fate of the cosmos. It is the first of the Big Two numbers in cosmology that expresses the still-amazing fact that the entire universe is expanding. Quantifying the distance scale is hard. Throughout my career, I have avoided working on it. There are quite enough, er, personalities on the case already. No need for me to add to the madness. Not that I couldn't. The Tully-Fisher relation has long been used as a distance indicator. It played an important role in breaking the stranglehold that H0 = 50 km/s/Mpc had on the minds of cosmologists, including myself. Tully & Fisher (1977) found that it was approximately 80 km/s/Mpc. Their method continues to provide strong constraints to this day: Kourkchi et al. find H0 = 76.0 ± 1.1 (stat) ± 2.3 (sys) km/s/Mpc. So I've been happy to stay out of it. I am motivated in part by the calibration opportunity provided by gas rich galaxies, in part by the fact that tension in independent approaches to constrain the Hubble constant only seems to be getting worse, and in part by a recent conference experience.
(Remember when we traveled?) Less than a year ago, I was at a cosmology conference in which I heard an all-too-typical talk that asserted that the Planck H0 = 67.4 ± 0.5 km/s/Mpc had to be correct and everybody who got something different was a stupid-head. I've seen this movie before. It is the same community (often the very same people) who once insisted that H0 had to be 50, dammit. They're every bit as overconfident as before, suffering just as much from confirmation bias (LCDM! LCDM! LCDM!), and seem every bit as likely to be correct this time around. So, is it true? We have the data, we've just refrained from using it in this particular way because other people were on the case. Let's check. The big hassle here is not measuring H0 so much as quantifying the uncertainties. That's the part that's really hard. So all credit goes to Jim Schombert, who rolled up his proverbial sleeves and did all the hard work. Federico Lelli and I mostly just played the mother-of-all-jerks referees (I've had plenty of role models) by asking about every annoying detail. To make a very long story short, none of the items under our control matter at a level we care about, each making < 1 km/s/Mpc difference to the final answer. In principle, the Baryonic Tully-Fisher relation (BTFR) helps over the usual luminosity-based version by including the gas, which extends application of the relation to lower mass galaxies that can be quite gas rich. Ignoring this component results in a mess that can only be avoided by restricting attention to bright galaxies. But including it introduces an extra parameter. One has to adopt a stellar mass-to-light ratio to put the stars and the gas on the same footing. I always figured that would make things worse – and for a long time, it did. That is no longer the case. 
So long as we treat the calibration sample that defines the BTFR and the sample used to measure the Hubble constant self-consistently, plausible choices for the mass-to-light ratio return the same answer for H0. It's all relative – the calibration changes with different choices, but the application to more distant galaxies changes in the same way. Same for the treatment of molecular gas and metallicity. It all comes out in the wash. Our relative distance scale is very precise. Putting an absolute number on it simply requires a lot of calibrating galaxies with accurate, independently measured distances. Here is the absolute calibration of the BTFR that we obtain: The Baryonic Tully-Fisher relation calibrated with 50 galaxies with direct distance determinations from either the Tip of the Red Giant Branch method (23) or Cepheids (27). In constructing this calibrated BTFR, we have relied on distance measurements made or compiled by the Extragalactic Distance Database, which represents the cumulative efforts of Tully and many others to map out the local universe in great detail. We have also benefited from the work of Ponomareva et al, which provides new calibrator galaxies not already in our SPARC sample. Critically, they also measure the flat velocity from rotation curves, which is a huge improvement in accuracy over the more readily available linewidths commonly employed in Tully-Fisher work, but is expensive to obtain so remains the primary observational limitation on this procedure. Still, we're in pretty good shape. We now have 50 galaxies with well measured distances as well as the necessary ingredients to construct the BTFR: extended, resolved rotation curves, HI fluxes to measure the gas mass, and Spitzer near-IR data to estimate the stellar mass. This is a huge sample for which to have all of these data simultaneously. Measuring distances to individual galaxies remains challenging and time-consuming hard work that has been done by others. 
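The way a calibrated BTFR yields distances can be sketched in a few lines. The slope and zero-point below are illustrative placeholders, not the calibration in the figure; the logic is that the relation predicts the true baryonic mass from the flat rotation velocity, while fluxes only give the mass up to a factor of distance squared:

```python
import math

# Placeholder calibration: log10(Mb) = log_a + slope * log10(Vf).
# The slope and log_a values here are assumptions for illustration,
# NOT the fitted calibration from the paper.
def btfr_mass(v_flat_km_s, slope=4.0, log_a=1.7):
    """Baryonic mass in Msun predicted from the flat rotation speed (km/s)."""
    return 10 ** (log_a + slope * math.log10(v_flat_km_s))

def btfr_distance_mpc(v_flat_km_s, mass_if_at_1mpc):
    """Observed fluxes scale as 1/D^2, so the mass one would infer by
    placing the galaxy at 1 Mpc relates to the BTFR-predicted mass as D^2."""
    return math.sqrt(btfr_mass(v_flat_km_s) / mass_if_at_1mpc)
```

Comparing the predicted mass to the mass one would infer at a fiducial distance then fixes the actual distance, independent of redshift; applying this to many galaxies with known redshifts is what constrains H0.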
We are not about to second-guess their results, but we can note that they are sensible and remarkably consistent. There are two primary methods by which the distances we use have been measured. One is Cepheids – the same type of variable stars that Hubble used to measure the distance to spiral nebulae to demonstrate their extragalactic nature. The other is the tip of the red giant branch (TRGB) method, which takes advantage of the brightest red giants having nearly the same luminosity. The sample is split nearly 50/50: there are 27 galaxies with a Cepheid distance measurement, and 23 with the TRGB. The two methods (different colored points in the figure) give the same calibration, within the errors, as do the two samples (circles vs. diamonds). There have been plenty of mistakes in the distance scale historically, so this consistency is important. There are many places where things could go wrong: differences between ourselves and Ponomareva, differences between Cepheids and the TRGB as distance indicators, mistakes in the application of either method to individual galaxies… so many opportunities to go wrong, and yet everything is consistent. Having followed the distance scale problem my entire career, I cannot express how deeply impressive it is that all these different measurements paint a consistent picture. This is a credit to a large community of astronomers who have worked diligently on this problem for what seems like aeons. There is a temptation to dismiss distance scale work as having been wrong in the past, so it can be again. Of course that is true, but it is also true that matters have improved considerably. Forty years ago, it was not surprising when a distance indicator turned out to be wrong, and distances changed by a factor of two. That stopped twenty years ago, thanks in large part to the Hubble Space Telescope, a key goal of which had been to nail down the distance scale. 
That mission seems largely to have been accomplished, with small differences persisting only at the level that one expects from experimental error. One cannot, for example, make a change to the Cepheid calibration without creating a tension with the TRGB data, or vice-versa: both have to change in concert by the same amount in the same direction. That is unlikely to the point of wishful thinking. Having nailed down the absolute calibration of the BTFR for galaxies with well-measured distances, we can apply it to other galaxies for which we know the redshift but not the distance. There are nearly 100 suitable galaxies available in the SPARC database. Consistency between them and the calibrator galaxies requires H0 = 75.1 ± 2.3 (stat) ± 1.5 (sys) km/s/Mpc. This is consistent with the result for the standard luminosity-linewidth version of the Tully-Fisher relation reported by Kourkchi et al. Note also that our statistical (random/experimental) error is larger, but our systematic error is smaller. That's because we have a much smaller number of galaxies. The method is, in principle, more precise (mostly because rotation curves are more accurate than linewidths), so there is still a lot to be gained by collecting more data. Our measurement is also consistent with many other "local" measurements of the distance scale, but not with "global" measurements. See the nice discussion by Telescoper and the paper from which it comes. A Hubble constant in the 70s is the answer that we've consistently gotten for the past 20 years by a wide variety of distinct methods, including direct measurements that are not dependent on lower rungs of the distance ladder, like gravitational lensing and megamasers. These are repeatable experiments. In contrast, as I've pointed out before, it is the "global" CMB-fitted value of the Hubble parameter that has steadily diverged from the concordance region that originally established LCDM. So, where does this leave us?
In the past, it was easy to dismiss a tension of this sort as due to some systematic error, because that happened all the time – in the 20th century. That's not so true anymore. It looks to me like the tension is real. By tritonstation in Cosmology, Data Interpretation, LCDM. June 17, 2020. The halo mass function I haven't written much here of late. This is mostly because I have been busy, but also because I have been actively refraining from venting about some of the sillier things being said in the scientific literature. I went into science to get away from the human proclivity for what is nowadays called "fake news," but we scientists are human too, and are not immune from the same self-deception one sees so frequently exercised in other venues. So let's talk about something positive. Current grad student Pengfei Li recently published a paper on the halo mass function. What is that and why should we care? One of the fundamental predictions of the current cosmological paradigm, ΛCDM, is that dark matter clumps into halos. Cosmological parameters are known with sufficient precision that we have a very good idea of how many of these halos there ought to be. Their number per unit volume as a function of mass (so many big halos, so many more small halos) is called the halo mass function. An important test of the paradigm is thus to measure the halo mass function. Does the predicted number match the observed number? This is hard to do, since dark matter halos are invisible! So how do we go about it? Galaxies are thought to form within dark matter halos. Indeed, that's kinda the whole point of the ΛCDM galaxy formation paradigm. So by counting galaxies, we should be able to count dark matter halos.
Counting galaxies was an obvious task long before we thought there was dark matter, so this should be straightforward: all one needs is the measured galaxy luminosity function – the number density of galaxies as a function of how bright they are, or equivalently, how many stars they are made of (their stellar mass). Unfortunately, this goes tragically wrong. Fig. 5 from the review by Bullock & Boylan-Kolchin. The number density of objects is shown as a function of their mass. Colored points are galaxies. The solid line is the predicted number of dark matter halos. The dotted line is what one would expect for galaxies if all the normal matter associated with each dark matter halo turned into stars. This figure shows a comparison of the observed stellar mass function of galaxies and the predicted halo mass function. It is from a recent review, but it illustrates a problem that goes back as long as I can remember. We extragalactic astronomers spent all of the '90s obsessing over this problem. [I briefly thought that I had solved this problem, but I was wrong.] The observed luminosity function is nearly flat while the predicted halo mass function is steep. Consequently, there should be lots and lots of faint galaxies for every bright one, but instead there are relatively few. This discrepancy becomes progressively more severe to lower masses, with the predicted number of halos being off by a factor of many thousands for the faintest galaxies. The problem is most severe in the Local Group, where the faintest dwarf galaxies are known. Locally it is called the missing satellite problem, but this is just a special case of a more general problem that pervades the entire universe. Indeed, the small number of low mass objects is just one part of the problem. There are also too few galaxies at large masses. 
Even where the observed and predicted numbers come closest, around the scale of the Milky Way, they still miss by a large factor (this being a log-log plot, even small offsets are substantial). If we had assigned "explain the observed galaxy luminosity function" as a homework problem and the students had returned as an answer a line that had the wrong shape at both ends and at no point intersected the data, we would flunk them. This is, in effect, what theorists have been doing for the past thirty years. Rather than entertain the obvious interpretation that the theory is wrong, they offer more elaborate interpretations. "Faced with the choice between changing one's mind and proving that there is no need to do so, almost everybody gets busy on the proof." – J. K. Galbraith Theorists persist because this is what CDM predicts, with or without Λ, and we need cold dark matter for independent reasons. If we are unwilling to contemplate that ΛCDM might be wrong, then we are obliged to pound the square peg into the round hole, and bend the halo mass function into the observed luminosity function. This transformation is believed to take place as a result of a variety of complex feedback effects, all of which are real and few of which are likely to have the physical effects that are required to solve this problem. That's way beyond the scope of this post; all we need to know here is that this is the "physics" behind the transformation that leads to what is currently called Abundance Matching. Abundance matching boils down to drawing horizontal lines in the above figure, thus matching galaxies with dark matter halos of equal number density (abundance). So, just reading off the graph, a galaxy of stellar mass M* = 10^8 M☉ resides in a dark matter halo of 10^11 M☉, one like the Milky Way with M* = 5 x 10^10 M☉ resides in a 10^12 M☉ halo, and a giant galaxy with M* = 10^12 M☉ is the "central" galaxy of a cluster of galaxies with a halo mass of several 10^14 M☉. And so on.
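Mechanically, abundance matching is nothing more than this density pairing. A minimal sketch, in which the cumulative number densities are invented placeholders chosen so the pairings come out roughly as read off the graph:

```python
# Abundance matching reduced to its essence: pair masses at equal
# cumulative number density n(>M). Given tabulated (mass, density)
# points for galaxies and for halos, each galaxy gets the halo whose
# density is closest to its own. The densities below are illustrative
# placeholders, not measured values.
def abundance_match(stellar_masses, n_gal, halo_masses, n_halo):
    pairs = []
    for m_star, n in zip(stellar_masses, n_gal):
        # halo whose cumulative density is nearest this galaxy's density
        m_halo = min(zip(halo_masses, n_halo), key=lambda hn: abs(hn[1] - n))[0]
        pairs.append((m_star, m_halo))
    return pairs

pairs = abundance_match(
    [1e8, 5e10, 1e12], [1e-2, 1e-3, 1e-5],    # galaxies: mass, n(>M)
    [1e11, 1e12, 3e14], [1e-2, 1e-3, 1e-5])   # halos: mass, n(>M)
```

Each galaxy is simply assigned the halo with the same cumulative abundance, which is the horizontal-line construction in code form.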
In effect, we abandon the obvious and long-held assumption that the mass in stars should be simply proportional to that in dark matter, and replace it with a rolling fudge factor that maps what we see to what we predict. The rolling fudge factor that follows from abundance matching is called the stellar mass–halo mass relation. Many of the discussions of feedback effects in the literature amount to a post hoc justification for this multiplication of forms of feedback. This is a lengthy but insufficient introduction to a complicated subject. We wanted to get away from this, and test the halo mass function more directly. We do so by use of the velocity function rather than the stellar mass function. The velocity function is the number density of galaxies as a function of how fast they rotate. It is less widely used than the luminosity function, because there is less data: one needs to measure the rotation speed, which is harder to obtain than the luminosity. Nevertheless, it has been done, as with this measurement from the HIPASS survey: The number density of galaxies as a function of their rotation speed (Zwaan et al. 2010). The bottom panel shows the raw number of galaxies observed; the top panel shows the velocity function after correcting for the volume over which galaxies can be detected. Faint, slow rotators cannot be seen as far away as bright, fast rotators, so the latter are always over-represented in galaxy catalogs. The idea here is that the flat rotation speed is the hallmark of a dark matter halo, providing a dynamical constraint on its mass. This should make for a cleaner measurement of the halo mass function. This turns out to be true, but it isn't as clean as we'd like. Those of you who are paying attention will note that the velocity function Martin Zwaan measured has the same basic morphology as the stellar mass function: approximately flat at low masses, with a steep cut off at high masses. 
This looks no more like the halo mass function than the galaxy luminosity function did. So how does this help? To measure the velocity function, one has to use some readily obtained measure of the rotation speed like the line-width of the 21cm line. This, in itself, is not a very good measurement of the halo mass. So what Pengfei did was to fit dark matter halo models to galaxies of the SPARC sample for which we have good rotation curves. Thanks to the work of Federico Lelli, we also have an empirical relation between line-width and the flat rotation velocity. Together, these provide a connection between the line-width and halo mass: The relation Pengfei found between halo mass (M200) and line-width for the NFW (ΛCDM standard) halo model fit to rotation curves from the SPARC galaxy sample. Once we have the mass-line width relation, we can assign a halo mass to every galaxy in the HIPASS survey and recompute the distribution function. But now we have not the velocity function, but the halo mass function. We've skipped the conversion of light to stellar mass to total mass and used the dynamics to skip straight to the halo mass function: The halo mass function. The points are the data; these are well fit by a Schechter function (black line; this is commonly used for the galaxy luminosity function). The red line is the prediction of ΛCDM for dark matter halos. The observed mass function agrees with the predicted one! Test successful! Well, mostly. Let's think through the various aspects here. First, the normalization is about right. It does not have the offset seen in the first figure. As it should not – we've gone straight to the halo mass in this exercise, and not used the luminosity as an intermediary proxy. So that is a genuine success. It didn't have to work out this well, and would not do so in a very different cosmology (like SCDM). Second, it breaks down at high mass. 
The data shows the usual Schechter cut-off at high mass, while the predicted number of dark matter halos continues as an unabated power law. This might be OK if high mass dark matter halos contain little neutral hydrogen. If this is the case, they will be invisible to HIPASS, the 21cm survey on which this is based. One expects this, to a certain extent: the most massive galaxies tend to be gas-poor ellipticals. That helps, but only by shifting the turn-down to slightly higher mass. It is still there, so the discrepancy is not entirely cured. At some point, we're talking about large dark matter halos that are groups or even rich clusters of galaxies, not individual galaxies. Still, those have HI in them, so it is not like they're invisible. Worse, examining detailed simulations that include feedback effects, there do seem to be more predicted high-mass halos that should have been detected than actually are. This is a potential missing gas-rich galaxy problem at the high mass end where galaxies are easy to detect. However, the simulations currently available to us do not provide the information we need to clearly make this determination. They don't look right, so far as we can tell, but it isn't clear enough to make a definitive statement. Finally, the faint-end slope is about right. That's amazing. The problem we've struggled with for decades is that the observed slope is too flat. Here a steep slope just falls out. It agrees with the ΛCDM down to the lowest mass bin. If there is a missing satellite-type problem here, it is at lower masses than we probe. That sounds great, and it is. But before we get too excited, I hope you noticed that the velocity function from the same survey is flat like the luminosity function. So why is the halo mass function steep? When we fit rotation curves, we impose various priors. That's statistics talk for a way of keeping parameters within reasonable bounds. 
For example, we have a pretty good idea of what the mass-to-light ratio of a stellar population should be. We can therefore impose as a prior that the fit return something within the bounds of reason. One of the priors we imposed on the rotation curve fits was that they be consistent with the stellar mass-halo mass relation. Abundance matching is now part and parcel of ΛCDM, so it made sense to apply it as a prior. The total mass of a dark matter halo is an entirely notional quantity; rotation curves (and other tracers) pretty much never extend far enough to measure this. So abundance matching is great for imposing sense on a parameter that is otherwise ill-constrained. In this case, it means that what is driving the slope of the halo mass function is a prior that builds in the right slope. That's not wrong, but neither is it an independent test. So while the observationally constrained halo mass function is consistent with the predictions of ΛCDM, we have not corroborated the prediction with independent data. What we really need at low mass is some way to constrain the total mass of small galaxies out to much larger radii than currently available. That will keep us busy for some time to come. By tritonstation in Dark Matter, LCDM | January 26, 2020 | 2,115 words | 46 comments A personal recollection of how we learned to stop worrying and love the Lambda There is a tendency when teaching science to oversimplify its history for the sake of getting on with the science. How it came to be isn't necessary to learn it. But to do science requires a proper understanding of the process by which it came to be. The story taught to cosmology students seems to have become: we didn't believe in the cosmological constant (Λ), then in 1998 the Type Ia supernovae (SN) monitoring campaigns detected accelerated expansion, then all of a sudden we did believe in Λ. The actual history was, of course, rather more involved – to the point where this oversimplification verges on disingenuous.
There were many observational indications of Λ that were essential in paving the way. Modern cosmology starts in the early 20th century with the recognition that the universe should be expanding or contracting – a theoretical inevitability of General Relativity that Einstein initially tried to dodge by inventing the cosmological constant – and is expanding in fact, as observationally established by Hubble and Slipher and many others since. The Big Bang was largely considered settled truth after the discovery of the existence of the cosmic microwave background (CMB) in 1964. The CMB held a puzzle, as it was quickly shown to be too smooth. The early universe was both isotropic and homogeneous. Too homogeneous. We couldn't detect the density variations that could grow into galaxies and other immense structures. Though such density variations are now well measured as temperature fluctuations that are statistically well described by the acoustic power spectrum, the starting point was that these fluctuations were a disappointing no-show. We should have been able to see them much sooner, unless something really weird was going on… That something weird was non-baryonic cold dark matter (CDM). For structure to grow, it needed the helping hand of the gravity of some unseen substance. Normal matter did not suffice. The most elegant cosmology, the Einstein-de Sitter universe, had a mass density Ωm = 1. But the measured abundances of the light elements were only consistent with the calculations of big bang nucleosynthesis if normal matter amounted to only 5% of Ωm = 1. This, plus the need to grow structure, led to the weird but seemingly unavoidable inference that the universe must be full of invisible dark matter. This dark matter needed to be some slow moving, massive particle that does not interact with light nor reside within the menagerie of particles present in the Standard Model of Particle Physics. CDM and early universe Inflation were established in the 1980s.
Inflation gave a mechanism that drove the mass density to exactly one (elegant!), and CDM gave us hope for enough mass to get to that value. Together, they gave us the Standard CDM (SCDM) paradigm with Ωm = 1.000 and H0 = 50 km/s/Mpc. I was there when SCDM failed. It is hard to overstate the fervor with which the SCDM paradigm was believed. Inflation required that the mass density be exactly one; Ωm < 1 was inconceivable. For an Einstein-de Sitter universe to be old enough to contain the oldest stars, the Hubble constant had to be the lower of the two (50 or 100) commonly discussed at that time. That meant that H0 > 50 was Right Out. We didn't even discuss Λ. Λ was Unmentionable. Unclean. SCDM was Known, Khaleesi. Λ had attained unmentionable status in part because of its origin as Einstein's greatest blunder, and in part through its association with the debunked Steady State model. But serious mention of it creeps back into the literature by 1990. The first time I personally heard Λ mentioned as a serious scientific possibility was by Yoshii at a conference in 1993. Yoshii based his argument on a classic cosmological test, N(m) – the number of galaxies as a function of how faint they appeared. The deeper you look, the more you see, in a way that depends on the intrinsic luminosity of galaxies, and how they fill space. Look deep enough, and you begin to trace the geometry of the cosmos. At this time, one of the serious problems confronting the field was the faint blue galaxies problem. There were so many faint galaxies on the sky, it was incredibly difficult to explain them all. Yoshii made a simple argument. To get so many galaxies, we needed a big volume. The only way to do that in the context of the Robertson-Walker metric that describes the geometry of the universe is if we have a large cosmological constant, Λ. He was arguing for ΛCDM five years before the SN results. Lambda? We don't need no stinking Lambda! Yoshii was shouted down. NO! Galaxies evolve!
We don't need no stinking Λ! In retrospect, Yoshii & Peterson (1995) looks like a good detection of Λ. Perhaps Yoshii & Peterson also deserve a Nobel prize? Indeed, there were many hints that Λ (or at least low Ωm) was needed, e.g., the baryon catastrophe in clusters, the power spectrum of IRAS galaxies, the early appearance of bound structures, the statistics of gravitational lenses, and so on. Certainly by the mid-90s it was clear that we were not going to make it to Ωm = 1. Inflation was threatened – it requires Ωm = 1 – or at least a flat geometry: Ωm+ΩΛ = 1. SCDM was in crisis. A very influential 1995 paper by Ostriker & Steinhardt did a lot to launch ΛCDM. I was impressed by the breadth of data Ostriker & Steinhardt discussed, all of which demanded low Ωm. I thought the case for Λ was less compelling, as it hinged on the age problem in a way that might also have been solved, at that time, by simply having an open universe (low Ωm with no Λ). This would ruin Inflation, but I wasn't bothered by that. I expect they were. Regardless, they definitely made that case for ΛCDM three years before the supernovae results. Their arguments were accepted by almost everyone who was paying attention, including myself. I heard Ostriker give a talk around this time during which he was asked "what cosmology are you assuming?" to which he replied "the right one." Called the "concordance" cosmology by Ostriker & Steinhardt, ΛCDM had already achieved the status of most-favored cosmology by the mid-90s. A simplified version of the diagram of Ostriker & Steinhardt (1995) illustrating just a few of the constraints they discussed. Direct measurements of the expansion rate, mass density, and ages of the oldest stars excluded SCDM, instead converging on a narrow window – what we now call ΛCDM. Ostriker & Steinhardt neglected to mention an important prediction of Λ: not only should the universe expand, but that expansion rate should accelerate! In 1995, that sounded completely absurd. 
People had looked for such an effect, and claimed not to see it. So I wrote a brief note pointing out the predicted acceleration of the expansion rate. I meant it in a bad way: how crazy would it be if the expansion of the universe was accelerating?! This was an obvious and inevitable consequence of ΛCDM that was largely being swept under the rug at that time. I mean[t], surely we could live with Ωm < 1 but no Λ. Can't we all just get along? Not really, as it turned out. I remember Mike Turner pushing the SN people very hard in Aspen in 1997 to Admit Λ. He had an obvious bias: as an Inflationary cosmologist, he had spent the previous decade castigating observers for repeatedly finding Ωm < 1. That's too little mass, you fools! Inflation demands Ωm = 1.000! Look harder! By 1997, Turner had, like many cosmologists, finally wrapped his head around the fact that we weren't going to find enough mass for Ωm = 1. This was a huge problem for Inflation. The only possible solution, albeit an ugly one, was if Λ made up the difference. So there he was at Aspen, pressuring the people who observed supernovae to Admit Λ. One, in particular, was Richard Ellis, a great and accomplished astronomer who had led the charge in shouting down Yoshii. They didn't yet have enough data to Admit Λ. Not. Yet. By 1998, there were many more high redshift SNIa. Enough to see Λ. This time, after the long series of results only partially described above, we were intellectually prepared to accept it – unlike in 1993. Had the SN experiments been conducted five years earlier, and obtained exactly the same result, they would not have been awarded the Nobel prize. They would instead have been dismissed as a trick of astrophysics: the universe evolves, metallicity was lower at earlier times, that made SN then different from now, they evolve and so cannot be used as standard candles.
This sounds silly now, as we've figured out how to calibrate for intrinsic variations in the luminosities of Type Ia SN, but that is absolutely how we would have reacted in 1993, and no amount of improvements in the method would have convinced us. This is exactly what we did with faint galaxy counts: galaxies evolve; you can't hope to understand that well enough to constrain cosmology. Do you ever hear them cited as evidence for Λ? Great as the supernovae experiments to measure the metric genuinely were, they were not a discovery so much as a confirmation of what cosmologists had already decided to believe. There was no singular discovery that changed the way we all thought. There was a steady drip, drip, drip of results pointing towards Λ all through the '90s – the age problem in which the oldest stars appeared to be older than the universe in which they reside, the early appearance of massive clusters and galaxies, the power spectrum of galaxies from redshift surveys that preceded Sloan, the statistics of gravitational lenses, and the repeated measurement of 1/4 < Ωm < 1/3 in a large variety of independent ways – just to name a few. By the mid-90's, SCDM was dead. We just refused to bury it until we could accept ΛCDM as a replacement. That was what the Type Ia SN results really provided: a fresh and dramatic reason to accept the accelerated expansion that we'd already come to terms with privately but had kept hidden in the closet. Note that the acoustic power spectrum of temperature fluctuations in the cosmic microwave background (as opposed to the mere existence of the highly uniform CMB) plays no role in this history. That's because temperature fluctuations hadn't yet been measured beyond their rudimentary detection by COBE. COBE demonstrated that temperature fluctuations did indeed exist (finally!) as they must, but precious little beyond that. 
Eventually, after the settling of much dust, COBE was recognized as one of many reasons why Ωm ≠ 1, but it was neither the most clear nor most convincing reason at that time. Now, in the 21st century, the acoustic power spectrum provides a great way to constrain what all the parameters of ΛCDM have to be, but it was a bit player in its development. The water there was carried by traditional observational cosmology using general purpose optical telescopes in a great variety of different ways, combined with a deep astrophysical understanding of how stars, galaxies, quasars and the whole menagerie of objects found in the sky work. All the vast knowledge incorporated in textbooks like those by Harrison, by Peebles, and by Peacock – knowledge that often seems to be lacking in scientists trained in the post-WMAP era. Despite being a late arrival, the CMB power spectrum measured in 2000 by Boomerang and 2003 by WMAP did one important new thing to corroborate the ΛCDM picture. The supernovae data didn't detect accelerated expansion so much as exclude the deceleration we had nominally expected. The data were also roughly consistent with a coasting universe (neither accelerating nor decelerating); the case for acceleration only became clear when we assumed that the geometry of the universe was flat (Ωm+ΩΛ = 1). That didn't have to work out, so it was a great success of the paradigm when the location of the first peak of the power spectrum appeared in exactly the right place for a flat FLRW geometry. The consistency of these data has given ΛCDM an air of invincibility among cosmologists. But a modern reconstruction of the Ostriker & Steinhardt diagram leaves zero room remaining – hence the tension between H0 = 73 measured directly and H0 = 67 from multiparameter CMB fits. Constraints from the acoustic power spectrum of the CMB overplotted on the direct measurements from the plot above.
Initially in great consistency with those measurements, the best fit CMB values have steadily wandered away from the most-favored region of parameter space that established ΛCDM in the first place. This is most apparent in the tension with H0. In cosmology, we are accustomed to having to find our way through apparently conflicting data. The difference between an expansion rate of 67 and 73 seems trivial given that the field was long riven – in living memory – by the dispute between 50 and 100. This gives rise to the expectation that the current difference is just a matter of some subtle systematic error somewhere. That may well be correct. But it is also conceivable that FLRW is inadequate to describe the universe, and we have been driven to the objectively bizarre parameters of ΛCDM because it happens to be the best approximation that can be obtained to what is really going on when we insist on approximating it with FLRW. Though a logical possibility, that last sentence will likely drive many cosmologists to reach for their torches and pitchforks. Before killing the messenger, we should remember that we once endowed SCDM with the same absolute certainty we now attribute to ΛCDM. I was there, 3,000 internet years ago, when SCDM failed. There is nothing so sacred in ΛCDM that it can't suffer the same fate, as has every single cosmology ever devised by humanity. Today, we still lack definitive knowledge of either dark matter or dark energy. These add up to 95% of the mass-energy of the universe according to ΛCDM. These dark materials must exist. It is Known, Khaleesi. By tritonstation in Cosmology, LCDM, Sociology | January 28, 2019 | 2,335 words | 108 comments Hypothesis testing with gas rich galaxies This Thanksgiving, I'd like to highlight something positive. Recently, Bob Sanders wrote a paper pointing out that gas rich galaxies are strong tests of MOND. The usual fit parameter, the stellar mass-to-light ratio, is effectively negligible when gas dominates.
The MOND prediction follows straight from the gas distribution, for which there is no equivalent freedom. We understand the 21 cm spin-flip transition well enough to relate observed flux directly to gas mass. In any human endeavor, there are inevitably unsung heroes who carry enormous amounts of water but seem to get no credit for it. Sanders is one of those heroes when it comes to the missing mass problem. He was there at the beginning, and has a valuable perspective on how we got to where we are. I highly recommend his books, The Dark Matter Problem: A Historical Perspective and Deconstructing Cosmology. In bright spiral galaxies, stars are usually 80% or so of the mass, gas only 20% or less. But in many dwarf galaxies, the mass ratio is reversed. These are often low surface brightness and challenging to observe. But it is a worthwhile endeavor, as their rotation curves are predicted by MOND with extraordinarily little freedom. Though gas rich galaxies do indeed provide an excellent test of MOND, nothing in astronomy is perfectly clean. The stellar mass-to-light ratio is an irreducible need-to-know parameter. We also need to know the distance to each galaxy, as we do not measure the gas mass directly, but rather the flux of the 21 cm line. The gas mass scales with flux and the square of the distance (see equation 7E7), so to get the gas mass right, we must first get the distance right. We also need to know the inclination of a galaxy as projected on the sky in order to get right the rotation to which we're fitting, as the observed line of sight Doppler velocity is only sin(i) of the full, in-plane rotation speed. The 1/sin(i) correction becomes increasingly sensitive to errors as i approaches zero (face-on galaxies). The mass-to-light ratio is a physical fit parameter that tells us something meaningful about the amount of stellar mass that produces the observed light. In contrast, for our purposes here, distance and inclination are "nuisance" parameters.
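The distance and inclination dependencies just described can be written down explicitly. A hedged sketch using the standard optically-thin 21 cm relation — the numerical constant is the conventional textbook value, and the function names are mine:

```python
import numpy as np

def hi_mass(flux_jy_kms, distance_mpc):
    """Atomic gas mass in solar masses from integrated 21 cm flux.

    M_HI = 2.356e5 * D**2 * S  (D in Mpc, S in Jy km/s).
    The D**2 dependence means a 10% distance error becomes
    roughly a 20% error in the inferred gas mass.
    """
    return 2.356e5 * distance_mpc**2 * flux_jy_kms

def deproject_velocity(v_los_kms, inclination_deg):
    """In-plane rotation speed from the observed line-of-sight velocity.

    v_rot = v_los / sin(i); the correction diverges as i -> 0 (face-on).
    """
    return v_los_kms / np.sin(np.radians(inclination_deg))
```

This is why the nuisance parameters matter so much for a gas-dominated galaxy: the distance sets the gas mass that MOND must amplify, and the inclination sets the velocities it must match.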
These nuisance parameters can be, and generally are, measured independently from mass modeling. However, these measurements have their own uncertainties, so one has to be careful about taking these measured values as-is. One of the powerful aspects of Bayesian analysis is the ability to account for these uncertainties to allow for the distance to be a bit off the measured value, so long as it is not too far off, as quantified by the measurement uncertainties. This is what current graduate student Pengfei Li did in Li et al. (2018). The constraints on MOND are so strong in gas rich galaxies that often the nuisance parameters cannot be ignored, even when they're well measured. To illustrate what I'm talking about, let's look at one famous example, DDO 154. This galaxy is over 90% gas. The stars (pictured above) just don't matter much. If the distance and inclination are known, the MOND prediction for the rotation curve follows directly. Here is an example of a MOND fit from a recent paper: The MOND fit to DDO 154 from Ren et al. (2018). The black points are the rotation curve data, the green line is the Newtonian expectation for the baryons, and the red line is their MOND fit. This is terrible! The MOND fit – essentially a parameter-free prediction – misses all of the data. MOND is falsified. If one is inclined to hate MOND, as many seem to be, then one stops here. No need to think further. If one is familiar with the ups and downs in the history of astronomy, one might not be so quick to dismiss it. Indeed, one might notice that the shape of the MOND prediction closely tracks the shape of the data. There's just a little difference in scale. That's kind of amazing for a theory that is wrong, especially when it is amplifying the green line to predict the red one: it needn't have come anywhere close. Here is the fit to the same galaxy using the same data [already] published in Li et al.: The MOND fit to DDO 154 from Li et al. 
(2018) using the same data as above, as tabulated in SPARC. Now we have a good fit, using the same data! How can this be so? I have not checked what Ren et al. did to obtain their MOND fits, but having done this exercise myself many times, I recognize the slight offset they find as a typical consequence of holding the nuisance parameters fixed. What if the measured distance is a little off? Distance estimates to DDO 154 in the literature range from 3.02 Mpc to 6.17 Mpc. The formally most accurate distance measurement is 4.04 ± 0.08 Mpc. In the fit shown here, we obtained 3.87 ± 0.16 Mpc. The error bars on these distances overlap, so they are the same number, to measurement accuracy. These data do not falsify MOND. They demonstrate that it is sensitive enough to tell the difference between 3.8 and 4.1 Mpc. One will never notice this from a dark matter fit. Ren et al. also make fits with self-interacting dark matter (SIDM). The nifty thing about SIDM is that it makes quasi-constant density cores in dark matter halos. Halos of this form are not predicted by "ordinary" cold dark matter (CDM), but often give better fits than either MOND or the NFW halos of dark matter-only CDM simulations. For this galaxy, Ren et al. obtain the following SIDM fit. The SIDM fit to DDO 154 from Ren et al. This is a great fit. Goes right through the data. That makes it better, right? Not necessarily. In addition to the mass-to-light ratio (and the nuisance parameters of distance and inclination), dark matter halo fits have [at least] two additional free parameters to describe the dark matter halo, such as its mass and core radius. These parameters are highly degenerate – one can obtain equally good fits for a range of mass-to-light ratios and core radii: one makes up for what the other misses. Parameter degeneracy of this sort is usually a sign that there is too much freedom in the model.
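For contrast with the freedom in halo fits, the MOND prediction can be written in a few lines. A hedged sketch: given the Newtonian acceleration g_N supplied by the baryons, the "simple" interpolation function μ(x) = x/(1+x) — one standard choice in the MOND literature, not necessarily the one used in the fits discussed here — admits a closed-form prediction:

```python
import numpy as np

A0 = 1.2e-10  # Milgrom's acceleration constant in m/s^2 (standard value)

def mond_acceleration(g_newton):
    """Predicted acceleration from the Newtonian baryonic acceleration.

    With mu(x) = x/(1+x), solving g_N = mu(g/a0) * g for g gives this
    closed form. High accelerations recover Newton (g -> g_N); low
    accelerations give the deep-MOND limit g -> sqrt(g_N * a0).
    """
    return 0.5 * g_newton * (1.0 + np.sqrt(1.0 + 4.0 * A0 / g_newton))
```

Once the baryon distribution (scaled by the mass-to-light ratio) fixes g_N(r), the rotation curve follows as v(r) = sqrt(g(r)·r). There are no halo parameters to adjust, which is why MOND can be fooled by a wrong distance while a cored halo fit cannot.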
In this case, the data are adequately described by one parameter (the MOND fit M*/L, not counting the nuisances in common), so using three (M*/L, Mhalo, Rcore) is just an exercise in fitting a French curve. There is ample freedom to fit the data. As a consequence, you'll never notice that one of the nuisance parameters might be a tiny bit off. In other words, you can fool a dark matter fit, but not MOND. Erwin de Blok and I demonstrated this 20 years ago. A common myth at that time was that "MOND is guaranteed to fit rotation curves." This seemed patently absurd to me, given how it works: once you stipulate the distribution of baryons, the rotation curve follows from a simple formula. If the two don't match, they don't match. There is no guarantee that it'll work. Instead, it can't be forced. As an illustration, Erwin and I tried to trick it. We took two galaxies that are identical in the Tully-Fisher plane (NGC 2403 and UGC 128) and swapped their mass distribution and rotation curve. These galaxies have the same total mass and the same flat velocity in the outer part of the rotation curve, but the detailed distribution of their baryons differs. If MOND can be fooled, this closely matched pair ought to do the trick. It does not. An attempt to fit MOND to a hybrid galaxy with the rotation curve of NGC 2403 and the baryon distribution of UGC 128. The mass-to-light ratio is driven to unphysical values (6 in solar units), but an acceptable fit is not obtained. Our failure to trick MOND should not surprise anyone who bothers to look at the math involved. There is a one-to-one relation between the distribution of the baryons and the resulting rotation curve. If there is a mismatch between them, a fit cannot be obtained. We also attempted to play this same trick on dark matter. The standard dark matter halo fitting function at the time was the pseudo-isothermal halo, which has a constant density core. 
It is very similar to the halos of SIDM and to the cored dark matter halos produced by baryonic feedback in some simulations. Indeed, that is the point of those efforts: they are trying to capture the success of cored dark matter halos in fitting rotation curve data. A fit to the hybrid galaxy with a cored (pseudo-isothermal) dark matter halo. A satisfactory fit is readily obtained. Dark matter halos with a quasi-constant density core do indeed provide good fits to rotation curves. Too good. They are easily fooled, because they have too many degrees of freedom. They will fit pretty much any plausible data that you throw at them. This is why the SIDM fit to DDO 154 failed to flag distance as a potential nuisance. It can't. You could double (or halve) the distance and still find a good fit. This is why parameter degeneracy is bad. You get lost in parameter space. Once lost there, it becomes impossible to distinguish between successful, physically meaningful fits and fitting epicycles. Astronomical data are always subject to improvement. For example, the THINGS project obtained excellent data for a sample of nearby galaxies. I made MOND fits to all the THINGS (and other) data for the MOND review Famaey & McGaugh (2012). Here's the residual diagram, which has been on my web page for many years: Residuals of MOND fits from Famaey & McGaugh (2012). These are, by and large, good fits. The residuals have a well defined peak centered on zero. DDO 154 was one of the THINGS galaxies; let's see what happens if we use those data. The rotation curve of DDO 154 from THINGS (points with error bars). The Newtonian expectation for stars is the green line; the gas is the blue line. The red line is the MOND prediction. Note that the gas greatly outweighs the stars beyond 1.5 kpc; the stellar mass-to-light ratio has extremely little leverage in this MOND fit. The first thing one is likely to notice is that the THINGS data are much better resolved than the previous generation used above.
The first thing I noticed was that THINGS had assumed a distance of 4.3 Mpc. This was prior to the measurement of 4.04, so let's just start over from there. That gives the MOND prediction shown above. And it is a prediction. I haven't adjusted any parameters yet. The mass-to-light ratio is set to the mean I expect for a star forming stellar population, 0.5 in solar units in the Spitzer 3.6 micron band. D=4.04 Mpc and i=66 as tabulated by THINGS. The result is pretty good considering that no parameters have been harmed in the making of this plot. Nevertheless, MOND overshoots a bit at large radii. Constraining the inclinations for gas rich dwarf galaxies like DDO 154 is a bit of a nightmare. Literature values range from 20 to 70 degrees. Seriously. THINGS itself allows the inclination to vary with radius; 66 is just a typical value. Looking at the fit Pengfei obtained, i=61. Let's try that. MOND fit to the THINGS data for DDO 154 with the inclination adjusted to the value found by Li et al. (2018). The fit is now satisfactory. One tweak to the inclination, and we're done. This tweak isn't even a fit to these data; it was adopted from Pengfei's fit to the above data. This tweak to the inclination is comfortably within any plausible assessment of the uncertainty in this quantity. The change in sin(i) corresponds to a mere 4% in velocity. I could probably do a tiny bit better with further adjustment – I have left both the distance and the mass-to-light ratio fixed – but that would be a meaningless exercise in statistical masturbation. The result just falls out: no muss, no fuss. Hence the point Bob Sanders makes. Given the distribution of gas, the rotation curve follows. And it works, over and over and over, within the bounds of the uncertainties on the nuisance parameters. One cannot do the same exercise with dark matter. It has ample ability to fit rotation curve data, once those are provided, but zero power to predict them.
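The "mere 4%" quoted above is quick to check: the deprojected speed scales as 1/sin(i), so changing the assumed inclination from 66 to 61 degrees rescales the velocities by sin(66°)/sin(61°).

```python
import numpy as np

# Velocity rescaling from changing the assumed inclination 66 -> 61 deg.
v_scale = np.sin(np.radians(66.0)) / np.sin(np.radians(61.0))
print(f"velocity rescaled by {(v_scale - 1.0) * 100:.1f}%")  # a bit over 4%
```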
If all had been well with ΛCDM, the rotation curves of these galaxies would look like NFW halos. Or any number of other permutations that have been discussed over the years. In contrast, MOND makes one unique prediction (that was not at all anticipated in dark matter), and that's what the data do. Out of the huge parameter space of plausible outcomes from the messy hierarchical formation of galaxies in ΛCDM, Nature picks the one that looks exactly like MOND. This outcome is illogical. It is a bad sign for a theory when it can only survive by mimicking its alternative. This is the case here: ΛCDM must imitate MOND. There are now many papers asserting that it can do just this, but none of those were written before the data were provided. Indeed, I consider it to be problematic that clever people can come up with ways to imitate MOND with dark matter. What couldn't it imitate? If the data had all looked like technicolor space donkeys, we could probably find a way to make that so as well. Cosmologists will rush to say "microwave background!" I have some sympathy for that, because I do not know how to explain the microwave background in a MOND-like theory. At least I don't pretend to, even if I had more predictive success there than their entire community. But that would be a much longer post. For now, note that the situation is even worse for dark matter than I have so far made it sound. In many dwarf galaxies, the rotation velocity exceeds that attributable to the baryons (with Newton alone) at practically all radii. By a lot. DDO 154 is a very dark matter dominated galaxy. The baryons should have squat to say about the dynamics. And yet, all you need to know to predict the dynamics is the baryon distribution. The baryonic tail wags the dark matter dog. But wait, it gets better! If you look closely at the data, you will note a kink at about 1 kpc, another at 2, and yet another around 5 kpc. These kinks are apparent in both the rotation curve and the gas distribution.
This is an example of Sancisi's Law: "For any feature in the luminosity profile there is a corresponding feature in the rotation curve and vice versa." This is a general rule, as Sancisi observed, but it makes no sense when the dark matter dominates. The features in the baryon distribution should not be reflected in the rotation curve. The observed baryons orbit in a disk with nearly circular orbits confined to the same plane. The dark matter moves on eccentric orbits oriented every which way to provide pressure support to a quasi-spherical halo. The baryons and dark matter occupy very different regions of phase space, the six dimensional volume of position and momentum. The two are not strongly coupled, communicating only by the weak force of gravity in the standard CDM paradigm. One of the first lessons of galaxy dynamics is that galaxy disks are subject to a variety of instabilities that grow bars and spiral arms. These are driven by disk self-gravity. The same features do not appear in elliptical galaxies because they are pressure supported, 3D blobs. They don't have disks so they don't have disk self-gravity, much less the features that lead to the bumps and wiggles observed in rotation curves. Elliptical galaxies are a good visual analog for what dark matter halos are believed to be like. The orbits of dark matter particles are unable to sustain features like those seen in baryonic disks. They are featureless for the same reasons as elliptical galaxies. They don't have disks. A rotation curve dominated by a spherical dark matter halo should bear no trace of the features that are seen in the disk. And yet they're there, often enough for Sancisi to have remarked on it as a general rule. It gets worse still. One of the original motivations for invoking dark matter was to stabilize galactic disks: a purely Newtonian disk of stars is not a stable configuration, yet the universe is chock full of long-lived spiral galaxies.
The cure was to place them in dark matter halos. The problem for dwarfs is that they have too much dark matter. The halo stabilizes disks by suppressing the formation of structures that stem from disk self-gravity. But you need some disk self-gravity to have the observed features. That can be tuned to work in bright spirals, but it fails in dwarfs because the halo is too massive. As a practical matter, there is no disk self-gravity in dwarfs – it is all halo, all the time. And yet, we do see such features. Not as strong as in big, bright spirals, but definitely present. Whenever someone tries to analyze this aspect of the problem, they inevitably come up with a requirement for more disk self-gravity in the form of unphysically high stellar mass-to-light ratios (something I predicted would happen). In contrast, this is entirely natural in MOND (see, e.g., Brada & Milgrom 1999 and Tiret & Combes 2008), where it is all disk self-gravity since there is no dark matter halo. The net upshot of all this is that it doesn't suffice to mimic the radial acceleration relation as many simulations now claim to do. That was not a natural part of CDM to begin with, but perhaps it can be done with smooth model galaxies. In most cases, such models lack the resolution to see the features seen in DDO 154 (and in NGC 1560 and in IC 2574, etc.) If they attain such resolution, they better not show such features, as that would violate some basic considerations. But then they wouldn't be able to describe this aspect of the data. Simulators by and large seem to remain sanguine that this will all work out. Perhaps I have become too cynical, but I recall hearing that 20 years ago. And 15. And ten… basically, they've always assured me that it will work out even though it never has. Maybe tomorrow will be different. Or would that be the definition of insanity? 
By tritonstation in Dark Matter, Data Interpretation, LCDM, MOND, Philosophy of Science, Rotation curves | November 21, 2018 | 3,140 Words | 54 Comments

It Must Be So. But which Must?

In the last post, I noted some of the sociological overtones underpinning attitudes about dark matter and modified gravity theories. I didn't get as far as the more scientifically interesting part, which illustrates a common form of reasoning in physics. About modified gravity theories, Bertone & Tait state "the only way these theories can be reconciled with observations is by effectively, and very precisely, mimicking the behavior of cold dark matter on cosmological scales." Leaving aside just which observations need to be mimicked so precisely (I expect they mean power spectrum; perhaps they consider this to be so obvious that it need not be stated), this kind of reasoning is both common and powerful – and frequently correct. Indeed, this is exactly the attitude I expressed in my review a few years ago for the Canadian Journal of Physics, quoted in the image above. I get it. There are lots of positive things to be said for the standard cosmology. The upshot of this reasoning is, in effect, that "cosmology works so well that non-baryonic dark matter must exist." I have sympathy for this attitude, but I also remember many examples in the history of cosmology where it has gone badly wrong. There was a time, not so long ago, that the matter density had to be the critical value, and the Hubble constant had to be 50 km/s/Mpc. By and large, it is the same community that insisted on those falsehoods with great intensity that continues to insist on conventionally conceived cold dark matter with similarly fundamentalist insistence. I think it is an overstatement to say that the successes of cosmology (as we presently perceive them) prove the existence of dark matter. A more conservative statement is that the ΛCDM cosmology is correct if, and only if, dark matter exists. But does it?
That's a separate question, which is why laboratory searches are so important – including null results. It was, after all, the null result of Michelson & Morley that ultimately put an end to the previous version of an invisible aetherial medium, and sparked a revolution in physics. Here I point out that the same reasoning asserted by Bertone & Tait as a slam dunk in favor of dark matter can just as accurately be asserted in favor of MOND. To directly paraphrase the above statement: "the only way ΛCDM can be reconciled with observations is by effectively, and very precisely, mimicking the behavior of MOND on galactic scales." This is a terrible problem for dark matter. Even if it were true, as is often asserted, that MOND only fits rotation curves, this would still be tantamount to a falsification of dark matter by the same reasoning applied by Bertone & Tait. Let's look at just one example, NGC 1560:

The rotation curve of NGC 1560 (points) together with the Newtonian expectation (black line) and the MOND fit (blue line). Data from Begeman et al. (1991) and Gentile et al. (2010).

MOND fits the details of this rotation curve in excruciating detail. It provides just the right amount of boost over the Newtonian expectation, which varies from galaxy to galaxy. Features in the baryon distribution are reflected in the rotation curve. That is required in MOND, but makes no sense in dark matter, where the excess velocity over the Newtonian expectation is attributed to a dynamically hot, dominant, quasi-spherical dark matter halo. Such entities cannot support the features commonly seen in thin, dynamically cold disks. Even if they could, there is no reason that features in the dominant dark matter halo should align with those in the disk: a sphere isn't a disk. In short, it is impossible to explain this with dark matter – to the extent that anything is ever impossible for the invisible. NGC 1560 is a famous case because it has such an obvious feature.
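The "just the right amount of boost" has a simple mathematical form. A minimal numerical sketch, assuming the commonly used "simple" interpolation function μ(x) = x/(1+x) with a0 = 1.2×10⁻¹⁰ m/s², and an illustrative point mass (this is not a fit to the real NGC 1560 data):

```python
import math

A0 = 1.2e-10      # MOND acceleration scale [m/s^2]
G = 6.674e-11     # Newton's constant [m^3 kg^-1 s^-2]
MSUN = 1.989e30   # solar mass [kg]
KPC = 3.086e19    # kiloparsec [m]

def mond_velocity(r, v_newton):
    """Rotation speed in MOND given the Newtonian (baryonic) speed.

    With the 'simple' interpolation function mu(x) = x/(1+x), the
    relation mu(g/a0)*g = gN solves to g = (gN + sqrt(gN^2 + 4*gN*a0))/2.
    Because g is a function of gN alone, any bump or wiggle in the
    baryonic curve propagates directly into the predicted curve.
    """
    gN = v_newton**2 / r
    g = 0.5 * (gN + math.sqrt(gN**2 + 4.0 * gN * A0))
    return math.sqrt(g * r)

# Illustrative point mass of 2e9 Msun, roughly the baryonic mass scale
# of a small galaxy (an assumption for illustration, not NGC 1560's fit).
M = 2e9 * MSUN
for r_kpc in (2, 5, 10, 20):
    r = r_kpc * KPC
    vN = math.sqrt(G * M / r)          # Newtonian expectation
    vM = mond_velocity(r, vN)          # MOND prediction
    print(f"r = {r_kpc:2d} kpc: vN = {vN/1e3:5.1f} km/s, vMOND = {vM/1e3:5.1f} km/s")

# In the deep-MOND limit the curve flattens at v_flat = (G*M*a0)**0.25
print(f"v_flat = {(G * M * A0)**0.25 / 1e3:.1f} km/s")
```

The design point is that there is no per-galaxy knob here: once the baryons are specified, the curve is fixed.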
It is common to dismiss this as some non-equilibrium fluke that should simply be ignored. That is always a dodgy path to tread, but might be OK if it were only this galaxy. But similar effects are seen over and over again, to the point that they earned an empirical moniker: Renzo's Rule. Renzo's rule is known to every serious student of rotation curves, but has not informed the development of most dark matter theory. Ignoring this information is like leaving money on the table. MOND fits not just NGC 1560, but very nearly* every galaxy we measure. It does so with excruciatingly little freedom. The only physical fit parameter is the stellar mass-to-light ratio. The gas fraction of NGC 1560 is 75%, so M*/L plays little role. We understand enough about stellar populations to have an idea what to expect; MOND fits return mass-to-light ratios that compare well with the normalization, color dependence, and band-pass dependent scatter expected from stellar population synthesis models. The mass-to-light ratio from MOND fits (points) in the blue (left panel) and near-infrared (right panel) pass-bands plotted against galaxy color (blue to the left, red to the right). From the perspective of stellar populations, one expects more scatter and a steeper color dependence in the blue band, as observed. The lines are stellar population models from Bell et al. (2003). These are completely independent, and have not been fit to the data in any way. One could hardly hope for better astrophysical agreement. One can also fit rotation curve data with dark matter halos. These require a minimum of three parameters to the one of MOND. In addition to M*/L, one also needs at least two parameters to describe the dark matter halo of each galaxy – typically some characteristic mass and radius. In practice, one finds that such fits are horribly degenerate: one can not cleanly constrain all three parameters, much less recover a sensible distribution of M*/L. 
One cannot construct the plot above simply by asking the data what it wants as one can with MOND. The "disk-halo degeneracy" in dark matter halo fits to rotation curves has been much discussed in the literature. Obsessed over, dismissed, revived, and ultimately ignored without satisfactory understanding. Well, duh. This approach uses three parameters per galaxy when it takes only one to describe the data. Degeneracy between the excess fit parameters is inevitable. From a probabilistic perspective, there is a huge volume of viable parameter space that could (and should) be occupied by galaxies composed of dark matter halos plus luminous galaxies. Two identical dark matter halos might host very different luminous galaxies, so would have rotation curves that differed with the baryonic component. Two similar looking galaxies might reside in rather different dark matter halos, again having rotation curves that differ. The probabilistic volume in MOND is much smaller. Absolutely tiny by comparison. There is exactly one and only one thing each rotation curve can do: what the particular distribution of baryons in each galaxy says it should do. This is what we observe in Nature. The only way ΛCDM can be reconciled with observations is by effectively, and very precisely, mimicking the behavior of MOND on galactic scales. There is a vast volume of parameter space that the rotation curves of galaxies could, in principle, inhabit. The naive expectation was exponential disks in NFW halos. Real galaxies don't look like that. They look like MOND. Magically, out of the vast parameter space available to galaxies in the dark matter picture, they only ever pick the tiny sub-volume that very precisely mimics MOND. The ratio of probabilities is huge. So many dark matter models are possible (and have been mooted over the years) that it is indefinably huge. The odds of observing MOND-like phenomenology in a ΛCDM universe is practically zero. 
This amounts to a practical falsification of dark matter. I've never said dark matter is falsified, because I don't think it is a falsifiable concept. It is like epicycles – you can always fudge it in some way. But at a practical level, it was falsified a long time ago. That is not to say MOND has to be right. That would be falling into the same logical trap that says ΛCDM has to be right. Obviously, both have virtues that must be incorporated into whatever the final answer may be. There are some efforts in this direction, but by and large this is not how science is being conducted at present. The standard script is to privilege those data that conform most closely to our confirmation bias, and pour scorn on any contradictory narrative. In my assessment, the probability of ultimate success through ignoring inconvenient data is practically zero. Unfortunately, that is the course upon which much of the field is currently set. *There are of course exceptions: no data are perfect, so even the right theory will get it wrong once in a while. The goof rate for MOND fits is about what I expect: rare, but more frequent for lower quality data. Misfits are sufficiently rare that to obsess over them is to refuse to see the forest for a few outlying trees. Here's a residual plot of MOND fits. See the peak at right? That's the forest. See the tiny tail to one side? That's an outlying tree.

Residuals of MOND rotation curve fits from Famaey & McGaugh (2012).

By tritonstation in LCDM, MOND, Philosophy of Science, Sociology | October 5, 2018 | 1,558 Words | 73 Comments

Dwarf Satellite Galaxies. III. The dwarfs of Andromeda

Like the Milky Way, our nearest giant neighbor, Andromeda (aka M31), has several dozen dwarf satellite galaxies. A few of these were known and had measured velocity dispersions at the time of my work with Joe Wolf, as discussed previously.
Also like the Milky Way, the number of known objects has grown rapidly in recent years – thanks in this case largely to the PAndAS survey. PAndAS imaged the area around M31 and M33, finding many individual red giant stars. These trace out the debris from interactions and mergers as small dwarfs are disrupted and consumed by their giant host. They also pointed up the existence of previously unknown dwarf satellites. The PAndAS survey field. Dwarf satellites are circled. As the PAndAS survey started reporting the discovery of new dwarf satellites around Andromeda, it occurred to me that this provided the opportunity to make genuine a priori predictions. These are the gold standard of the scientific method. We could use the observed luminosity and size of the newly discovered dwarfs to predict their velocity dispersions. I tried to do this for both ΛCDM and MOND. I will not discuss the ΛCDM case much, because it can't really be done. But it is worth understanding why this is. In ΛCDM, the velocity dispersion is determined by the dark matter halo. This has only a tenuous connection to the observed stars, so just knowing how big and bright a dwarf is doesn't provide much predictive power about the halo. This can be seen from this figure by Tollerud et al. (2011): Virial mass of the dark matter halo as a function of galaxy luminosity. Dwarf satellites reside in the wide colored band of low luminosities. This graph is obtained by relating the number density of galaxies (an observed quantity) to that of the dark matter halos in which they reside (a theoretical construct). It is highly non-linear, deviating strongly from the one-to-one line we expected early on. There is no reason to expect this particular relation; it is imposed on us by the fact that the observed luminosity function of galaxies is rather flat while the predicted halo mass function is steep. Nowadays, this is usually called the missing satellite problem, but this is a misnomer as it pervades the field.
Addressing the missing satellites problem would be another long post, so let's just accept that the relation between mass and light has to follow something like that illustrated above. If a dwarf galaxy has a luminosity of a million suns, one can read off the graph that it should live in a dark halo with a mass of about 10¹⁰ M☉. One could use this to predict the velocity dispersion, but not very precisely, because there's a big range corresponding to that luminosity (the bands in the figure). It could be as much as 10¹¹ M☉ or as little as 10⁹ M☉. This corresponds to a wide range of velocity dispersions. This wide range is unavoidable because of the difference in the luminosity function and halo mass function. Small variations in one lead to big variations in the other, and some scatter in dark halo properties is unavoidable. Consequently, we only have a vague range of expected velocity dispersions in ΛCDM. In practice, we never make this prediction. Instead, we compare the observed velocity dispersion to the luminosity and say "gee, this galaxy has a lot of dark matter" or "hey, this one doesn't have much dark matter." There's no rigorously testable prior. In MOND, what you see is what you get. The velocity dispersion has to follow from the observed stellar mass. This is straightforward for isolated galaxies: M* ∝ σ⁴ – this is essentially the equivalent of the Tully-Fisher relation for pressure supported systems. If we can estimate the stellar mass from the observed luminosity, the predicted velocity dispersion follows. Many dwarf satellites are not isolated in the MONDian sense: they are subject to the external field effect (EFE) from their giant hosts. The over-under for whether the EFE applies is the point when the internal acceleration from all the stars of the dwarf on each other is equal to the external acceleration from orbiting the giant host.
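The over-under described here can be sketched as a crude classifier. This is my own back-of-the-envelope bookkeeping, not the exact procedure of the paper: the internal acceleration is estimated at the half-light radius (with the deep-MOND boost when the Newtonian value falls below a0), the external field is taken as v_host²/D for a host with a flat rotation curve, and all the numbers in the example calls are hypothetical:

```python
import math

A0 = 1.2e-10      # MOND acceleration scale [m/s^2]
G = 6.674e-11     # [m^3 kg^-1 s^-2]
MSUN = 1.989e30   # [kg]
KPC = 3.086e19    # [m]

def regime(mstar_msun, r_half_kpc, v_host_kms, d_3d_kpc):
    """Crude isolated-vs-EFE classifier (illustrative only).

    Internal acceleration at the half-light radius, with the deep-MOND
    boost sqrt(gN*a0) applied when gN < a0; external acceleration from
    a host with a flat rotation curve.  Note that d_3d_kpc must be the
    3D separation, not the projected distance on the sky.
    """
    r = r_half_kpc * KPC
    gN = G * mstar_msun * MSUN / r**2
    g_in = math.sqrt(gN * A0) if gN < A0 else gN
    g_ex = (v_host_kms * 1e3) ** 2 / (d_3d_kpc * KPC)
    return ("isolated" if g_in > g_ex else "EFE"), g_in, g_ex

# Hypothetical dwarfs around an M31-like host (v_host ~ 230 km/s):
print(regime(4e5, 0.3, 230.0, 370.0))   # distant and compact
print(regime(4e5, 0.7, 230.0, 100.0))   # close in and diffuse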
The amplitude of the discrepancy in MOND depends on how low the total acceleration is relative to the critical scale a0. The external field in effect adds some acceleration that wouldn't otherwise be there, making the discrepancy less than it would be for an isolated object. This means that two otherwise identical dwarfs may be predicted to have different velocity dispersions depending on whether or not they are subject to the EFE. This is a unique prediction of MOND that has no analog in ΛCDM. It is straightforward to derive the equation to predict velocity dispersions in the extreme limits of isolated (aex ≪ ain < a0) or EFE dominated (ain ≪ aex < a0) objects. In reality, there are many objects for which ain ≈ aex, and no simple formula applies. In practice, we apply the formula that more nearly applies, and pray that this approximation is good enough. There are many other assumptions and approximations that must be made in any theory: that an object is spherical, isotropic, and in dynamical equilibrium. All of these must fail at some level, but it is the last one that is the most serious concern. In the case of the EFE, one must also make the approximation that the object is in equilibrium at the current level of the external field. That is never true, as both the amplitude and the vector of the external field vary as a dwarf orbits its host. But it might be an adequate approximation if this variation is slow. In the case of a circular orbit, only the vector varies. In general the orbits are not known, so we make the instantaneous approximation and once again pray that it is good enough. There is a fairly narrow window between where the EFE becomes important and where we slip into the regime of tidal disruption, but let's plow ahead and see how far we can get, bearing in mind that the EFE is a dynamical variable of which we only have a snapshot. To predict the velocity dispersion in the isolated case, all we need to know is the luminosity and a stellar mass-to-light ratio.
Assuming the dwarfs of Andromeda to be old stellar populations, I adopted a V-band mass-to-light ratio of 2 give or take a factor of 2. That usually dominates the uncertainty, though the error in the distance can sometimes impact the luminosity at a level that impacts the prediction. To predict the velocity dispersion in the EFE case, we again need the stellar mass, but now also need to know the size of the stellar system and the intensity of the external field to which it is subject. The latter depends on the mass of the host galaxy and the distance from it to the dwarf. This latter quantity is somewhat fraught: it is straightforward to measure the projected distance on the sky, but we need the 3D distance – how far in front or behind each dwarf is as well as its projected distance from the host. This is often a considerable contributor to the error budget. Indeed, some dwarfs may be inferred to be in the EFE regime for the low end of the range of adopted stellar mass-to-light ratio, and the isolated regime for the high end. In this fashion, we predicted velocity dispersions for the dwarfs of Andromeda. We in this case were Milgrom and myself. I had never collaborated with him before, and prefer to remain independent. But I also wanted to be sure I got the details described above right. Though it wasn't much work to make the predictions once the preliminaries were established, it was time consuming to collect and vet the data. As we were writing the paper, velocity dispersion measurements started to appear. People like Michelle Collins, Erik Tollerud, and Nicolas Martin were making follow-up observations, and publishing velocity dispersion for the objects we were making predictions for. That was great, but they were too good – they were observing and publishing faster than we could write! Nevertheless, we managed to make and publish a priori predictions for 10 dwarfs before any observational measurements were published. 
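The isolated case above has a closed form: writing out the M* ∝ σ⁴ relation, the deep-MOND isolated estimator is σ = (4 G M* a0 / 81)^(1/4). A short sketch, with illustrative luminosity and the M*/L range of 2 give or take a factor of 2 adopted in the text:

```python
import math

A0 = 1.2e-10      # MOND acceleration scale [m/s^2]
G = 6.674e-11     # [m^3 kg^-1 s^-2]
MSUN = 1.989e30   # [kg]

def sigma_isolated(mstar_msun):
    """Predicted line-of-sight velocity dispersion [m/s] for an isolated
    pressure-supported system in the deep-MOND limit:
        sigma^4 = (4/81) * G * M* * a0
    This is the explicit form of the M* ~ sigma^4 relation."""
    return (4.0 * G * mstar_msun * MSUN * A0 / 81.0) ** 0.25

# Illustrative dwarf: L_V ~ 2e5 Lsun (an assumed number, not a specific
# galaxy), with M*/L spanning the adopted range of 2 +/- a factor of 2.
for ml in (1.0, 2.0, 4.0):
    mstar = 2e5 * ml
    print(f"M*/L = {ml:.0f}: sigma = {sigma_isolated(mstar)/1e3:.1f} km/s")
```

Note how forgiving the fourth root is: the full factor-of-4 spread in M*/L moves the predicted dispersion by only 4^(1/4) ≈ 1.4, which is why the mass-to-light ratio "usually dominates the uncertainty" without destroying the prediction.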
We also made blind predictions for the other known dwarfs of Andromeda, and checked the predicted velocity dispersions against all measurements that we could find in the literature. Many of these predictions were quickly tested by on-going programs (i.e., people were out to measure velocity dispersions, whether we predicted them or not). Enough data rolled in that we were soon able to write a follow-up paper testing our predictions. Nailed it. Good data were soon available to test the predictions for 8 of the 10* a priori cases. All 8 were consistent with our predictions. I was particularly struck by the case of And XXVIII, which I had called out as perhaps the best test. It was isolated, so the messiness of the EFE didn't apply, and the uncertainties were low. Moreover, the predicted velocity dispersion was low – a good deal lower than broadly expected in ΛCDM: 4.3 km/s, with an uncertainty just under 1 km/s. Two independent observations were subsequently reported. One found 4.9 ± 1.6 km/s, the other 6.6 ± 2.1 km/s, both in good agreement within the uncertainties. We made further predictions in the second paper as people had continued to discover new dwarfs. These also came true. Here is a summary plot for all of the dwarfs of Andromeda: The velocity dispersions of the dwarf satellites of Andromeda. Each numbered box corresponds to one dwarf (x=1 is for And I and so on). Measured velocity dispersions have a number next to them that is the number of stars on which the measurement is based. MOND predictions are circles: green if isolated, open if the EFE applies. Points appear within each box in the order they appeared in the literature, from left to right. The vast majority of Andromeda's dwarfs are consistent with MOND (large green circles). Two cases are ambiguous (large yellow circles), having velocity dispersions based on only a few stars. Only And V appears to be problematic (large red circle).
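The agreement quoted above for And XXVIII is easy to quantify. Using the numbers from the text (prediction 4.3 km/s with an uncertainty of about 1 km/s), the tension in units of the combined uncertainty is well under 1σ for both measurements:

```python
import math

def tension(pred, pred_err, obs, obs_err):
    """Difference between prediction and measurement in units of the
    quadrature-summed uncertainty."""
    return abs(obs - pred) / math.hypot(pred_err, obs_err)

# And XXVIII, values as quoted in the text (all in km/s):
print(f"vs 4.9 +/- 1.6: {tension(4.3, 1.0, 4.9, 1.6):.2f} sigma")
print(f"vs 6.6 +/- 2.1: {tension(4.3, 1.0, 6.6, 2.1):.2f} sigma")
```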
MOND works well for And I, And II, And III, And VI, And VII, And IX, And X, And XI, And XII, And XIII, And XIV, And XV, And XVI, And XVII, And XVIII, And XIX, And XX, And XXI, And XXII, And XXIII, And XXIV, And XXV, And XXVIII, And XXIX, And XXXI, And XXXII, and And XXXIII. There is one problematic case: And V. I don't know what is going on there, but note that systematic errors frequently happen in astronomy. It'd be strange if there weren't at least one goofy case. Nevertheless, the failure of And V could be construed as a falsification of MOND. It ought to work in every single case. But recall the discussion of assumptions and uncertainties above. Is falsification really the story these data tell? We do have experience with various systematic errors. For example, we predicted that the isolated dwarf spheroidal Cetus should have a velocity dispersion in MOND of 8.2 km/s. There was already a published measurement of 17 ± 2 km/s, so we reported that MOND was wrong in this case by over 3σ. Or at least we started to do so. Right before we submitted that paper, a new measurement appeared: 8.3 ± 1 km/s. This is an example of how the data can sometimes change by rather more than the formal error bars suggest is possible. In this case, I suspect the original observations lacked the spectral resolution to resolve the velocity dispersion. At any rate, the new measurement (8.3 km/s) was somewhat more consistent with our prediction (8.2 km/s). The same predictions cannot even be made in ΛCDM. The velocity data can always be fit once they are in hand. But there is no agreed method to predict the velocity dispersion of a dwarf from its observed luminosity. As discussed above, this should not even be possible: there is too much scatter in the halo mass-stellar mass relation at these low masses. An unsung predictive success of MOND absent from the graph above is And IV.
When And IV was discovered in the general direction of Andromeda, it was assumed to be a new dwarf satellite – hence the name. Milgrom looked at the velocities reported for this object, and said it had to be a background galaxy. No way it could be a dwarf satellite – at least not in MOND. I see no reason why it couldn't have been in ΛCDM. It is absent from the graph above, because it was subsequently confirmed to be much farther away (7.2 Mpc vs. 750 kpc for Andromeda). The box for And XXVII is empty because this system is manifestly out of equilibrium. It is more of a stellar stream than a dwarf, appearing as a smear in the PAndAS image rather than as a self-contained dwarf. I do not recall what the story with the other missing object (And VIII) is. While writing the follow-up paper, I also noticed that there were a number of Andromeda dwarfs that were photometrically indistinguishable: basically the same in terms of size and stellar mass. But some were isolated while others were subject to the EFE. MOND predicts that the EFE cases should have lower velocity dispersion than the isolated equivalents. The velocity dispersions of the dwarfs of Andromeda, highlighting photometrically matched pairs – dwarfs that should be indistinguishable, but aren't because of the EFE. And XXVIII (isolated) has a higher velocity dispersion than its near-twin And XVII (EFE). The same effect might be acting in And XVIII (isolated) and And XXV (EFE). This is clear if we accept the higher velocity dispersion measurement for And XVIII, but an independent measurement begs to differ. The former has more stars, so is probably more reliable, but we should be cautious. The effect is not clear in And XVI (isolated) and And XXI (EFE), but the difference in the prediction is small and the uncertainties are large. An aggressive person might argue that the pairs of dwarfs are a positive detection of the EFE. I don't think the data for the matched pairs warrant that, at least not yet.
On the other hand, the appropriate use of the EFE was essential to all the predictions, not just the matched pairs. The positive detection of the EFE is important, as it is a unique prediction of MOND. I see no way to tune ΛCDM galaxy simulations to mimic this effect. Of course, there was a very recent time when it seemed impossible for them to mimic the isolated predictions of MOND. They claim to have come a long way in that regard. But that's what we're stuck with: tuning ΛCDM to make it look like MOND. This is why a priori predictions are important. There is ample flexibility to explain just about anything with dark matter. What we can't seem to do is predict the same things that MOND successfully predicts… predictions that are both quantitative and very specific. We're not arguing that dwarfs in general live in ~15 or 30 km/s halos, as we must in ΛCDM. In MOND we can say this dwarf will have this velocity dispersion and that dwarf will have that velocity dispersion. We can distinguish between 4.9 and 7.3 km/s. And we can do it over and over and over. I see no way to do the equivalent in ΛCDM, just as I see no way to explain the acoustic power spectrum of the CMB in MOND. This is not to say there are no problematic cases for MOND. Read, Walker, & Steger have recently highlighted the matched pair of Draco and Carina as an issue. And they are – though here I already have reason to suspect Draco is out of equilibrium, which makes it challenging to analyze. Whether it is actually out of equilibrium or not is a separate question. I am not thrilled that we are obliged to invoke non-equilibrium effects in both theories. But there is a difference. Brada & Milgrom provided a quantitative criterion to indicate when this was an issue before I ran into the problem. In ΛCDM, the low velocity dispersions of objects like And XIX, XXI, XXV and Crater 2 came as a complete surprise despite having been predicted by MOND. 
Tidal disruption was only invoked after the fact – and in an ad hoc fashion. There is no way to know in advance which dwarfs are affected, as there is no criterion equivalent to that of Brada. We just say "gee, that's a low velocity dispersion. Must have been disrupted." That might be true, but it gives no explanation for why MOND predicted it in the first place – which is to say, it isn't really an explanation at all. What I still do not understand is why MOND gets any predictions right if ΛCDM is the universe we live in, let alone so many. Shouldn't happen. Makes no sense. If this doesn't confuse you, you are not thinking clearly. *The other two dwarfs were also measured, but with only 4 stars in one and 6 in the other. These are too few for a meaningful velocity dispersion measurement.

By tritonstation in Dwarf satellite galaxies, LCDM, MOND, Philosophy of Science | September 14, 2018 | 2,905 Words | 40 Comments

Dwarf Satellite Galaxies. II. Non-equilibrium effects in ultrafaint dwarfs

I have been wanting to write about dwarf satellites for a while, but there is so much to tell that I didn't think it would fit in one post. I was correct. Indeed, it was worse than I thought, because my own experience with low surface brightness (LSB) galaxies in the field is a necessary part of the context for my perspective on the dwarf satellites of the Local Group. These are very different beasts – satellites are pressure supported, gas poor objects in orbit around giant hosts, while field LSB galaxies are rotating, gas rich galaxies that are among the most isolated known. However, so far as their dynamics are concerned, they are linked by their low surface density. Where we left off with the dwarf satellites, circa 2000, Ursa Minor and Draco remained problematic for MOND, but the formal significance of these problems was not great.
Fornax, which had seemed more problematic, was actually a predictive success: MOND returned a low mass-to-light ratio for Fornax because it was full of young stars. The other known satellites, Carina, Leo I, Leo II, Sculptor, and Sextans, were all consistent with MOND. The Sloan Digital Sky Survey resulted in an explosion in the number of satellite galaxies discovered around the Milky Way. These were both fainter and lower surface brightness than the classical dwarfs named above. Indeed, they were often invisible as objects in their own right, being recognized instead as groupings of individual stars that shared the same position in space and – critically – velocity. They weren't just in the same place, they were orbiting the Milky Way together. To give short shrift to a long story, these came to be known as ultrafaint dwarfs. Ultrafaint dwarf satellites have fewer than 100,000 stars. That's tiny for a stellar system. Sometimes they had only a few hundred. Most of those stars are too faint to see directly. Their existence is inferred from a handful of red giants that are actually observed. Where there are a few red giants orbiting together, there must be a source population of fainter stars. This is a good argument, and it is likely true in most cases. But the statistics we usually rely on become dodgy for such small numbers of stars: some of the ultrafaints that have been reported in the literature are probably false positives. I have no strong opinion on how many that might be, but I'd be really surprised if it were zero. Nevertheless, assuming the ultrafaint dwarfs are self-bound galaxies, we can ask the same questions as before. I was encouraged to do this by Joe Wolf, a clever grad student at UC Irvine. He had a new mass estimator for pressure supported dwarfs that we decided to apply to this problem. We used the Baryonic Tully-Fisher Relation (BTFR) as a reference, and looked at it every which-way.
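Wolf's estimator and the BTFR comparison can be sketched together. Per Wolf et al. (2010), the mass within the 3D half-light radius is M₁/₂ ≈ 4 σ² Re/G (with σ the line-of-sight dispersion and Re the projected half-light radius), and the equivalent circular velocity at that radius is Vc = √3 σ, which is what lets dispersion-supported dwarfs be placed on the same plot as rotating galaxies. The example numbers below are illustrative, not a specific galaxy:

```python
import math

G = 6.674e-11     # [m^3 kg^-1 s^-2]
MSUN = 1.989e30   # [kg]
PC = 3.086e16     # parsec [m]

def wolf_mass(sigma_kms, re_pc):
    """Wolf et al. (2010) estimator: mass within the 3D half-light
    radius, M_1/2 ~ 4 * sigma_los^2 * Re / G, where Re is the
    projected (2D) half-light radius.  Returns solar masses."""
    return 4.0 * (sigma_kms * 1e3) ** 2 * re_pc * PC / G / MSUN

def circular_velocity(sigma_kms):
    """Equivalent circular velocity at the half-light radius,
    Vc = sqrt(3) * sigma_los, used to place dispersion-supported
    dwarfs on the BTFR alongside rotators.  Returns km/s."""
    return math.sqrt(3.0) * sigma_kms

# Illustrative ultrafaint-like numbers: sigma ~ 8 km/s, Re ~ 40 pc
print(f"M_1/2 ~ {wolf_mass(8.0, 40.0):.2e} Msun")
print(f"Vc ~ {circular_velocity(8.0):.1f} km/s")
```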
Most of the text is about conventional effects in the dark matter picture, and I encourage everyone to read the full paper. Here I'm gonna skip to the part about MOND, because that part seems to have been overlooked in more recent commentary on the subject. For starters, we found that the classical dwarfs fall along the extrapolation of the BTFR, but the ultrafaint dwarfs deviate from it. Fig. 1 from McGaugh & Wolf (2010, annotated). The BTFR defined by rotating galaxies (gray points) extrapolates well to the scale of the dwarf satellites of the Local Group (blue points are the classical dwarf satellites of the Milky Way; red points are satellites of Andromeda) but not to the ultrafaint dwarfs (green points). Two of the classical dwarfs also fall off of the BTFR: Draco and Ursa Minor. The deviation is not subtle, at least not in terms of mass. The ultrafaints had characteristic circular velocities typical of systems 100 times their mass! But the BTFR is steep. In terms of velocity, the deviation is the difference between the 8 km/s typically observed, and the ~3 km/s needed to put them on the line. There are a large number of systematic errors that might arise, and all act to inflate the characteristic velocity. See the discussion in the paper if you're curious about such effects; for our purposes here we will assume that the data cannot simply be dismissed as the result of systematic errors, though one should bear in mind that they probably play a role at some level. Taken at face value, the ultrafaint dwarfs are a huge problem for MOND. An isolated system should fall exactly on the BTFR. These are not isolated systems, being very close to the Milky Way, so the external field effect (EFE) can cause deviations from the BTFR. However, these are predicted to make the characteristic internal velocities lower than the isolated case.
This may in fact be relevant for the red points that deviate a bit in the plot above, but we'll return to that at some future point. The ultrafaints all deviate to velocities that are too high, the opposite of what the EFE predicts. The ultrafaints falsify MOND! When I saw this, all my original confirmation bias came flooding back. I had pursued this stupid theory to ever lower surface brightness and luminosity. Finally, I had found where it broke. I felt like Darth Vader in the original Star Wars: I have you now! The first draft of my paper with Joe included a resounding renunciation of MOND. No way could it escape this! I had this nagging feeling I was missing something. Darth should have looked over his shoulder. Should I? Surely I had missed nothing. Many people are unaware of the EFE, just as we had been unaware that Fornax contained young stars. But not me! I knew all that. Surely this was it. Nevertheless, the nagging feeling persisted. One part of it was sociological: if I said MOND was dead, it would be well and truly buried. But did it deserve to be? The scientific part of the nagging feeling was that maybe there had been some paper that addressed this, maybe a decade before… perhaps I'd better double check. Indeed, Brada & Milgrom (2000) had run numerical simulations of dwarf satellites orbiting around giant hosts. MOND is a nonlinear dynamical theory; not everything can be approximated analytically. When a dwarf satellite is close to its giant host, the external acceleration of the dwarf falling towards its host can exceed the internal acceleration of the stars in the dwarf orbiting each other – hence the EFE. But the EFE is not a static thing; it varies as the dwarf orbits about, becoming stronger on closer approach. At some point, this variation becomes too fast for the dwarf to remain in equilibrium. This is important, because the assumption of dynamical equilibrium underpins all these arguments.
Without it, it is hard to know what to expect short of numerically simulating each individual dwarf. There is no reason to expect them to remain on the equilibrium BTFR. Brada & Milgrom suggested a measure to gauge the extent to which a dwarf might be out of equilibrium. It boils down to a matter of timescales. If the stars inside the dwarf have time to adjust to the changing external field, a quasi-static EFE approximation might suffice. So the figure of merit becomes the ratio of internal orbits per external orbit. If the stars inside a dwarf are swarming around many times for every time it completes an orbit around the host, then they have time to adjust. If the orbit of the dwarf around the host is as quick as the internal motions of the stars within the dwarf, not so much. At some point, a satellite becomes a collection of associated stars orbiting the host rather than a self-bound object in its own right. Deviations from the BTFR (left) and the isophotal shape of dwarfs (right) as a function of the number of internal orbits a star at the half-light radius makes for every orbit a dwarf makes around its giant host (Fig. 7 of McGaugh & Wolf 2010). Brada & Milgrom provide the formula to compute the ratio of orbits, shown in the figure above. The smaller the ratio, the less chance an object has to adjust, and the more subject it is to departures from equilibrium. Remarkably, the amplitude of deviation from the BTFR – the problem I could not understand initially – correlates with the ratio of orbits. The more susceptible a dwarf is to disequilibrium effects, the farther it deviates from the BTFR. This completely inverted the MOND interpretation. Instead of falsifying MOND, the data now appeared to corroborate the non-equilibrium prediction of Brada & Milgrom. The stronger the external influence, the more a dwarf deviated from the equilibrium expectation.
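The figure of merit above boils down to comparing two timescales. As a rough back-of-the-envelope sketch (my own illustration with assumed numbers and a simplified circular-orbit estimate, not the exact Brada & Milgrom formula):

```python
import math

def orbits_per_host_orbit(sigma_int_kms, r_half_kpc, v_host_kms, d_host_kpc):
    """Rough figure of merit: how many internal orbits a star at the
    half-light radius completes per orbit of the dwarf around its host.
    Larger values mean the stars have more time to adjust to the changing
    external field (closer to quasi-static equilibrium)."""
    t_internal = 2 * math.pi * r_half_kpc / sigma_int_kms  # internal orbital time
    t_external = 2 * math.pi * d_host_kpc / v_host_kms     # period about the host
    return t_external / t_internal

# Assumed illustrative numbers: sigma ~ 8 km/s at r_half ~ 0.3 kpc, a host
# 100 kpc away orbited at ~200 km/s (the kpc/(km/s) units cancel in the ratio).
print(orbits_per_host_orbit(8.0, 0.3, 200.0, 100.0))  # ~13 internal orbits per external orbit
```

A dwarf moved closer to the host (smaller `d_host_kpc`) completes fewer internal orbits per external orbit, so it has less chance to adjust, which is the sense of the correlation in the figure.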
In conventional terms, it appeared that the ultrafaints were subject to tidal stirring: their internal velocities were being pumped up by external influences. Indeed, the originally problematic cases, Draco and Ursa Minor, fall among the ultrafaint dwarfs in these terms. They can't be in equilibrium in MOND. If the ultrafaints are out of equilibrium, they might show some independent evidence of this. Stars should leak out, distorting the shape of the dwarf and forming tidal streams. Can we see this? A definite maybe: The shapes of some ultrafaint dwarfs. These objects are so diffuse that they are invisible on the sky; their shape is illustrated by contours or heavily smoothed grayscale pseudo-images. The dwarfs that are more subject to external influence tend to be more elliptical in shape. A pressure supported system in equilibrium need not be perfectly round, but one departing from equilibrium will tend to get stretched out. And indeed, many of the ultrafaints look Messed Up. I am not convinced that all this requires MOND. But it certainly doesn't falsify it. Tidal disruption can happen in the dark matter context, but it happens differently. The stars are buried deep inside protective cocoons of dark matter, and do not feel tidal effects much until most of the dark matter is stripped away. There is no reason to expect the MOND measure of external influence to apply (indeed, it should not), much less that it would correlate with indications of tidal disruption as seen above. This seems to have been missed by more recent papers on the subject. Indeed, Fattahi et al. (2018) have reconstructed very much the chain of thought I describe above. The last sentence of their abstract states "In many cases, the resulting velocity dispersions are inconsistent with the predictions from Modified Newtonian Dynamics, a result that poses a possibly insurmountable challenge to that scenario." This is exactly what I thought. (I have you now.) I was wrong. Fattahi et al.
are wrong for the same reasons I was wrong. They are applying equilibrium reasoning to a non-equilibrium situation. Ironically, the main point of their paper is that many systems can't be explained with dark matter, unless they are tidally stripped – i.e., the result of a non-equilibrium process. Oh, come on. If you invoke it in one dynamical theory, you might want to consider it in the other. To quote the last sentence of our abstract from 2010, "We identify a test to distinguish between the ΛCDM and MOND based on the orbits of the dwarf satellites of the Milky Way and how stars are lost from them." In ΛCDM, the sub-halos that contain dwarf satellites are expected to be on very eccentric orbits, with all the damage from tidal interactions with the host accruing during pericenter passage. In MOND, substantial damage may accrue along lower eccentricity orbits, leading to the expectation of more continuous disruption. Gaia is measuring proper motions for stars all over the sky. Some of these stars are in the dwarf satellites. This has made it possible to estimate orbits for the dwarfs, e.g., work by Amina Helmi (et al!) and Josh Simon. So far, the results are definitely mixed. There are more dwarfs on low eccentricity orbits than I had expected in ΛCDM, but there are still plenty that are on high eccentricity orbits, especially among the ultrafaints. Which dwarfs have been tidally affected by interactions with their hosts is far from clear. In short, reality is messy. It is going to take a long time to sort these matters out. These are early days. By tritonstation in Dark Matter, Data Interpretation, Dwarf satellite galaxies, LCDM, MOND. September 12, 2018.
Q: Flutter Error Message Bottom overflowed by 45 pixels I want to create a login screen using Flutter. This is my code so far:

    Future showInformationDialog(BuildContext context) {
      TextEditingController name = TextEditingController();
      TextEditingController deadline = TextEditingController();
      return showDialog(
          context: context,
          barrierDismissible: false,
          builder: (BuildContext context) {
            return AlertDialog(
              title: SingleChildScrollView(
                physics: NeverScrollableScrollPhysics(),
                child: Form(
                  child: Column(
                    children: <Widget>[
                      TextFormField(
                        controller: name,
                        maxLength: 40,
                        textAlign: TextAlign.left,
                        keyboardType: TextInputType.text,
                        autocorrect: false,
                        decoration: InputDecoration(
                          labelText: 'Name der Klausur: ',
                          border: OutlineInputBorder(),
                        ),
                        // The validator receives the text that the user has entered.
                        validator: (value) {
                          if (value.isEmpty) {
                            return 'Gib den Namen der Klausur ein!';
                          }
                          return null;
                        },
                      ),
                      SizedBox(height: 20),
                      TextFormField(
                        controller: deadline,
                        maxLength: 8,
                        textAlign: TextAlign.left,
                        keyboardType: TextInputType.datetime,
                        autocorrect: false,
                        decoration: InputDecoration(
                          labelText: 'Deadline: ',
                          border: OutlineInputBorder(),
                        ),
                        // The validator receives the text that the user has entered.
                        validator: (value) {
                          if (value.isEmpty) {
                            return 'Gib das Datum der Klausur ein!';
                          }
                          return null;
                        },
                      ),
                      SizedBox(height: 20),
                      DropDownFormField(
                        titleText: 'Priorität',
                        hintText: 'Bitte auswählen',
                        value: '',
                        dataSource: [
                          {
                            "display": "Niedrig",
                            "value": "Niedrig",
                          },
                          {
                            "display": "Mittel",
                            "value": "Mittel",
                          },
                          {
                            "display": "Hoch",
                            "value": "Hoch",
                          },
                        ],
                        textField: 'display',
                        valueField: 'value',
                      ),
                      SizedBox(height: 20),
                    ],
                  ),
                ),
              ),
              actions: <Widget>[
                FlatButton(
                  onPressed: () {
                    return showDialog(
                        context: context,
                        builder: (context) {
                          return AlertDialog(
                            content: Text(name.text),
                          );
                        });
                  },
                  child: Text('Save'),
                  color: Colors.blue,
                ),
                FlatButton(
                  onPressed: () {
                    return showDialog(
                        context: context,
                        builder: (context) {
                          return AlertDialog(
                            content: Text(deadline.text),
                          );
                        });
                  },
                  child: Text('Save'),
                  color: Colors.blue,
                ),
              ],
            );
          });
    }

When the keyboard opens, it collides with the textfields -> I get an error: Bottom overflowed by 49 pixels. What could be the issue? I have tried everything but I got stuck here. SingleChildScrollView or resizeToAvoidBottomPadding: false didn't help me. Maybe I don't know how to use them correctly. For any help I would be happy.

A: Is it me, or can't I find the code for your login screen? The error is thrown because there isn't enough place for your widget on the screen. Are you using a ListView or a Column? With a ListView you can scroll, so if there isn't enough room for the content the user can scroll down to see what isn't on the screen.
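One common fix, sketched here as an assumption rather than taken from the thread: `AlertDialog` in recent Flutter releases accepts `scrollable: true`, which lets its title/content scroll when the keyboard reduces the available height. Note also that the question's `SingleChildScrollView` uses `NeverScrollableScrollPhysics`, which disables exactly the scrolling that would avoid the overflow. A trimmed, hypothetical version of the dialog:

```dart
import 'package:flutter/material.dart';

// Hypothetical trimmed-down dialog; only the pieces relevant to the
// overflow are kept. With `scrollable: true`, the dialog body scrolls
// instead of overflowing when the keyboard appears.
Future<void> showInformationDialog(BuildContext context) {
  return showDialog(
    context: context,
    barrierDismissible: false,
    builder: (BuildContext context) {
      return AlertDialog(
        scrollable: true,
        content: Form(
          child: Column(
            mainAxisSize: MainAxisSize.min,
            children: <Widget>[
              TextFormField(
                decoration: InputDecoration(labelText: 'Name der Klausur: '),
              ),
              SizedBox(height: 20),
              TextFormField(
                decoration: InputDecoration(labelText: 'Deadline: '),
              ),
            ],
          ),
        ),
      );
    },
  );
}
```

Alternatively, keeping the `SingleChildScrollView` but removing its `physics: NeverScrollableScrollPhysics()` should have a similar effect.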
All Kia Approved Used cars come with an unrivalled peace-of-mind package as standard which is designed to provide you with as much reassurance as when buying a brand new car. Kia Approved Used cars are hand-picked according to stringent requirements. All are less than 18 months old and have covered less than 18,000 miles. So, rest assured your Kia Approved Used car is still in prime condition. Every Kia Approved Used car comes with a full service history, completed by qualified Kia Technicians and thoroughly valeted to the highest standards, making it look and feel like a brand new car. If for whatever reason you are not happy with your Approved Used Kia, you have up to 60 days or 1,000 miles to exchange your Approved Used Kia for another vehicle (subject to terms and conditions). This is twice as long as most other manufacturers' programmes. Cover is provided by the RAC for 1 year. In the unlikely event of any problems with your Approved Used Kia, you can be reassured with this level of cover which includes recovery at home or abroad, repatriation assistance, and help with onward travel costs. Kia will pay up to £500 per MOT test, to cover the cost of repair, when you return to a Kia franchised Dealer for your MOT test. All Kia Approved Used cars come with a 7 Year warranty from the day you drive away. Only Kia provide a Warranty this Long! Let Kia Insurance Services quote to insure your Approved Used Kia and while you are deciding they will provide free comprehensive Driveway Insurance cover, subject to eligibility. Cover is immediate so you can drive your new Approved Used Kia away there and then. No waiting and no hassle! Every Kia Approved Used car has been certified by Experian to be clear of major insurance claims for accident damage, and to be free of any owed finance – meaning you can buy without worry. All Kia Approved used cars have undertaken a multi-point check and are tested and certificated by a qualified Kia Technician before you drive away.
A 24 hour service in the event of a vehicle accident. Call Kia first on 0330 102 8832 whoever you're insured by. We'll do the rest including liaising with your insurer and guaranteeing a repair at a Kia approved bodyshop.
BuilDatAnalytics and Carsquare Win Big at Ballston LaunchPad Finale Ronald Barba December 5th 2013 1:27 pm At what was initially intended as a pitch competition to decide on the one company to win a $15,000 prize, the audience at the Ballston LaunchPad Challenge Finale was astonished to hear that two startups had been chosen as winners: BuilDatAnalytics and Carsquare. BuilDatAnalytics is a business analytics and information management company specifically for the construction industry. With the company's software, they can customize a construction database for each project that provides up-to-date, real-time information to and from construction crews, solving the issues associated with working from outdated plans. The company is also looking to raise $2.5M in seed funding, with priority on hiring some key people who have already been of great help to the success of BuilDatAnalytics. "When you have a really good idea, people are willing to put in the work to help you," says founder Tiffany Hosey Brown. Carsquare (formerly known as iGrabber and featured at Tech Cocktail's DC Mixer & Startup Showcase in May) is a search platform that aggregates new, used, and leased vehicle listings from multiple auto sites. Basically, it's a Kayak for auto listings, compiling data from online dealer sites and third-party car discovery search engines into one place. Carsquare is raising $2M in seed funding. Whether intentionally aiming at a car-themed statement, founder Khurrum Shakir says earnestly, "It's been a long road, and I'm so thankful for [our] team and Ballston BID for getting us here." The LaunchPad Program provided 14 startup semi-finalists from all industries the opportunity to develop their ideas through education and mentorship from the Ballston Business Improvement District (BID) over the past six months, culminating in last night's final pitch competition. The aim? To find the best ideas to help grow Ballston's influence in the DC tech ecosystem. 
The judges at last night's event included Ted Leonsis, founder, chairman, and CEO of Monumental Sports & Entertainment; Aneesh Chopra, former assistant to the President and CTO of the US; Congressman Gerry Connolly; and Mark Gruhin, co-managing partner of Saul Ewing LLP's Washington Office. In making the announcement last night, Leonsis noted that the judging panel simply couldn't decide on just one winner from the four impressive companies that pitched (as well as jokingly referencing the time pressure of having to make such an important decision within mere minutes). Leonsis – who benevolently offered to contribute the additional $15,000 prize – was so impressed by the four finalists that he offered to meet with all four companies throughout the year to help them along the way. Each startup will receive $15,000, as well as legal services from Saul Ewing, office space from Intelligent Offices, and furniture from Washington Workplace – a prize package totaling approximately $30,000 for each company. Ballston LaunchPad winners, BuilDatAnalytics and Carsquare, take to the stage with the judges. Tech Cocktail is a media partner of the event.
How to Say NO Without Hurting People – Almost all of us face trouble while saying 'no'. And there are reasons for it. In our society, saying 'no' is a kind of insult to others. We are taught from a very young age to never say 'no' to authoritative figures, like our parents, teachers, relatives, bosses, etc. But why so? Often there's a fear of confrontation, fear of rejection, the feeling of guilt for hurting others, or probably you are afraid of being rude. But it is very important to say no when necessary. You have every right to protect your boundaries and maintain your own priorities, and for that, you should say 'No'. Below we have discussed How to Say NO Without Hurting People. Check it out! When your plate is full of other responsibilities, it is natural that you won't be interested in taking on new requests. In such a condition, explain your situation by saying "I would love to do it for you, but…" This will make the other person understand that currently you are engaged with tasks of higher priority, and they will hold off the current request for a while. If someone proposes a new idea to you that meets your interest, but you need to think it over for some time, you can say a temporary 'no' by saying "Let me think about it first and I will get back to you". It is important to consider every possible aspect before saying 'yes' to something new. So take your time, and if the person who put forth the request is serious, he/she will wait for your reply. When someone provides you any opportunity that doesn't meet your need at the moment, gently say 'no' with this statement: "It doesn't meet my requirements now, but sure I will keep you in mind". This won't hurt the other person; instead he/she will understand that what he/she has to offer you is not bad, but you are saying 'no' because you are looking for something else. Saying "I will keep you in mind" will ensure that the door is always open for future opportunities.
What will you do if someone asks you for help and you are afraid that you won't be able to do anything? Simply say "I am afraid that I am not the right person to help you on this. Why don't you try Mr/Ms X?" This will avoid any hard feelings, because even though you are saying that you can't help, you are at the same time letting him/her know who will be able to help. The trick is to say 'no' wisely. Here are some one-liners that will come in handy when you don't want to pronounce the dirty word. I wish I could make it work. I won't be able to dedicate the time I need to it. Unfortunately it's not a good time. I am honored, but can't. I really don't think I would be able to do justice to it. Saying 'no' is difficult, but you cannot always say 'yes' to everything and everybody. Learn How to Say NO Without Hurting People. You will find that it's not as bad as you had imagined it to be. Apart from the listed ways, there are many other ways to say 'no'. If you have your own style, do share it with us in the comment section below.
New Delhi: Vodafone Business Services (VBS), the enterprise arm of Vodafone India, has launched International Toll Free Services (ITFS) to meet the need of Indian enterprises to provide a convenient and free method to their customers, business partners and employees travelling abroad – to communicate with them, through an international toll free number. The noteworthy benefit of this service for Indian enterprises would be that billing is done in Indian rupees instead in foreign currency, which is the case when calls are routed through international operators. Besides, Vodafone will be providing this service to multiple countries by eliminating the need for Enterprises to talk to multiple service providers. Vodafone is all set to tap the potential business opportunities in the INR 70 crore (Source F&S) ITFS market and empower the enterprises across sectors – cater to their international customers, call support centres in India or conference bridges hosted in India. Vodafone will work closely with these enterprises to understand their business needs and provide suitable ITFS number solutions. Vodafone customers can directly avail the Toll Free Number of different countries effortlessly. As an added benefit, customers need not connect with the foreign operators and neither do they have to worry about hauling the call back to India. ITFS is a part of the wireline suite of offering customized solutions for Indian enterprises, over and above other services offered by Vodafone Business Services under wireline portfolio, which include Private Leased Circuits (National and International), MPLS based Virtual Private Network (VPN), Internet Leased Lines, Office Wireline Voice and Audio Conferencing Services. The bouquet of integrated wireline services for businesses aims to ensure uninterrupted and secured communication and will especially empower Indian enterprises to keep pace with the evolution in digital communication. 
It is good to know that Vodafone has this toll free service to cater to the concerns of their clients and customers from India as well as international clients. Having this kind of service will help them address the queries and problems of their clients without requiring them to take the time to go to their office or service center.
using System;
using System.Linq;
using Bytes2you.DataAccess.Data;

namespace Bytes2you.DataAccess.EntityFramework.UnitTests.Testing.Mocks
{
    public class PersonDataEntityMock : IDataEntity<int>
    {
        public int Id { get; set; }
    }
}
The JPK Project has been formed to provide a much needed Supported Living Centre for people with a learning disability in Eastbourne and the surrounding areas. You can make a difference by donating to the Project. By registering with easyfundraising.org.uk and selecting the JPK Project as the cause you would like to support, you can donate as you shop online at no cost to you. There are over 3000 retailers who give easyfundraising.org.uk a commission every time you purchase, which gets turned into a donation and given to the cause you've selected.
HaXe for Mac OS X v.2.07 haXe (pronounced as hex) is an open source programming language. While most other languages are bound to their own platform (Java to the JVM, C# to .Net, ActionScript to the Flash Player), haXe is a multiplatform language. It means that you can use haXe to target the following platforms : Javascript : You can compile a haXe program to a single .js file. You can access the typed browser DOM APIs with autocompletion support, and all the dependencies will be resolved at compilation time. Flash : You can compile a haXe program to a .swf file. haXe is compatible with Flash Players 6 to 10, with either 'old' Flash 8 API or newest AS3/Flash9+ API. haXe offers very good performance and language features to develop Flash content. NekoVM : You can compile a haXe program to NekoVM bytecode. This can be used for server-side programming such as dynamic webpages (using mod_neko for Apache) and also for command-line or desktop applications, since NekoVM can be embedded and extended with some other DLL. PHP : You can compile a haXe program to .php files. This will enable you to use a high level strictly-typed language such as haXe while keeping full compatibility with your existing server platform and libraries. C++ : You can now generate C++ code from your haXe source code, with the required Makefiles. This is very useful for creating native applications, for instance in iPhone development. C# and Java targets are coming soon! (from @cwaneck) The idea behind haXe is to let the developer choose the best platform for a given job. In general, this is not easy to do, because every new platform comes with its own programming language.
What haXe provides you with is: a standardized language with many good features a standard library (including Date, Xml, Math...) that works the same on all platforms platform-specific libraries : the full APIs for a given platform are accessible from haXe · The haXe Syntax is similar to the one used in Java / JavaScript / ActionScript, so it's very easy to learn and get used to haXe. haXe can also integrate easily in your favorite editor or IDE · The haXe Type System is strictly typed, enabling the compiler to detect most errors at compile-time. At the same time, it's very different from classic strictly typed languages since you don't have to write types everywhere in your program, thanks to type inference. It makes you feel like you are programming in a dynamically typed language while you are getting the security of a strictly typed language. The best of both worlds · The haXe Standard Library, including Date, XML, data structures... is specified to behave the same across all platforms. This enables you to share and reuse code logic between different platforms without needing to rewrite the same thing again and again · haXe is easily Extensible : you can add additional libraries and wrappers for platform-specific features. Thanks to conditional compilation you can have different implementation for a given API depending on the platform you want to compile to · haXe has a Remoting library that provides cross platform serialization and RPC, enabling you to directly call methods between platforms. For example calls may be made between the client and the server or between different client modules. Everything is handled transparently · haXe can be used to develop portable Desktop applications by using SWHX
File Name: HaXe for Mac OS X Date: 27 Apr 12 Author: Nicolas Cannasse Runs on: Mac OS X
The process of choosing and buying health insurance is not an easy task because the market is flooded with a variety of insurance offerings. Health insurance is a vital part of a healthy financial life because sudden out-of-pocket medical expenses may cause financial disaster. Before you start purchasing health insurance, identify your need (what kind of coverage you want), know the exclusions and limitations, check the network hospitals for cashless claims, room rent capping, etc. In addition, do not forget to compare different health insurance offerings. Every health insurance plan comes with certain features which make it different from others. You need to be very careful to check which fits your needs. Don't Compare Premiums: Never start the health insurance comparison with the insurance premium. Health insurance is a complex plan tightly coupled with complex services that are necessary to understand. Comparing health insurance products requires deeper insight into the overall insurance contract, over and above price comparison. Either you need to compare the features in detail yourself, or take the help of an unbiased health insurance adviser. Sum assured/Top up plans: The amount of money that an insurance company promises to pay the policyholder is the sum assured. Find out the best sum assured amount for you, as medical costs increase day by day in busy hospitals in India. A cover of 2-3 lakhs has little value in the metros. Top up plans offer you additional coverage at low cost. These plans cover over and above the actual health insurance plan and hence increase the sum assured. How premiums change: there are many factors which define the premium, like your age band, number of covered members, pre-existing diseases, tobacco/alcohol consumption, your lifestyle, etc. We always recommend comparing the products before making a decision.
Cumulative Bonus: As per the IRDA norms, for each claim-free year the policyholder gets a benefit known as a cumulative bonus. The sum assured is increased by a specified percentage every year, but only if the policy is renewed without any break. Ensure that you have a good adviser: Find a good health insurance adviser who gives you unbiased information, without being influenced by any health insurer.
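As a toy illustration of how a cumulative bonus grows the effective sum assured (the 10% rate and 50% cap here are assumptions for illustration only; actual figures vary by insurer and policy):

```python
def sum_assured_with_bonus(base, bonus_rate, claim_free_years, cap_rate):
    """Effective sum assured after a run of claim-free renewals.

    Each claim-free year adds bonus_rate of the base sum assured,
    with the total bonus capped at cap_rate of the base."""
    bonus = min(bonus_rate * claim_free_years, cap_rate) * base
    return base + bonus

# 5 lakh base cover, 10% bonus per claim-free year, capped at 50% of the base:
print(sum_assured_with_bonus(500_000, 0.10, 3, 0.50))  # about 6.5 lakh
```

With enough claim-free years the cap dominates, so the effective cover plateaus at 150% of the base in this example.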
Interesting Facts About LDS Temples in Utah Don't you just love reading interesting facts about LDS temples? In this post you're going to learn every temple fact I could find about each of the 17 LDS temples in Utah. Pretty soon, we'll have new temples in Orem, Taylorsville, Layton, Saratoga Springs, Tooele, and Washington County. Once these have been completed, I'll update this post with any interesting facts. Because this article is extremely long, I've provided you with a table of contents. Clicking on a temple will take you directly to that part of the page that talks about it. LDS Temples in Utah #1 – Bountiful Utah Temple Bountiful LDS Temple Facts Dedicated on 1/8/95 by President Howard W. Hunter. This temple had 28 dedicatory sessions that were attended by 201,655 people. This is the most people that have ever been to a temple dedication. source The Bountiful Utah Temple was the first temple built in Davis County, Utah and the 8th built in Utah. The floor plan was also used for the Mount Timpanogos Temple. It has different spires though. President Howard W. Hunter only dedicated two temples during the short time that he was the president of the Church. Those two temples were the Bountiful Utah Temple and the Orlando Florida Temple. 7,500 people went to the groundbreaking ceremony for the Bountiful Utah Temple. Some people had to sit on the hillside because of how many people were there. Another 2,500 people viewed the ceremony from remote video. At the groundbreaking of the Bountiful Utah Temple, President Gordon B. Hinckley implied that there would be temples coming soon to Hong Kong China, Preston England, and Hartford Connecticut. He said that nothing would be announced officially until they had sites for them though. This temple was open for six weeks for the open house. 870,361 people came and walked through it. 45,000 people volunteered to run the open house. Angel Moroni was hit by lightning twice on May 22, 2016. 
Some of the gold leaf was cracked and pieces of the face and back of the fiberglass were removed. It was replaced on June 1, 2016. source You can see that here. Bethel white granite was used for the exterior facade of the Bountiful Utah Temple. It was from Sharon, Vermont which is where the Prophet Joseph Smith was born. source Brigham City Utah Temple Brigham City LDS Temple Facts Dedicated on 9/23/12 by Boyd K. Packer, who, along with his wife, is a Brigham City native. The walls in the sealing rooms have Brigham City's peach blossoms on them. source Fourteenth temple built in Utah Angel Moroni Statue: 12 feet tall source Cedar City Utah Temple Cedar City LDS Temple Facts Dedicated on 12/10/17 by Henry B. Eyring. 17th temple in Utah. 159th temple worldwide. This temple has Presbyterian architecture, which was a style that the pioneers brought across the plains. An example of this is the cupola bearing angel Moroni. source There are two historic stained glass windows that are from the old Astoria Presbyterian Church in Queens, New York. source Brigham Young sent descendants of pioneer settlers south in the 1850s to mine iron deposits in the hills of what would become Iron County. President Eyring said that this temple is a great tribute to the wonderful pioneers who came here. source Dr. Richard Saunders, Dean of Library Services at SUU, said, "This temple includes design elements that reflect the community's physical setting and culture. For instance, juniper branches are painted into ceiling corners, and the overall style reflects the straight, simple lines, the color schemes, and décor common to early homes in Iron County. The church was careful to design and decorate a building that reflects the heritage of the permanent settlers to this region." source Joel Hill Johnson wrote the temple hymn High on a Mountain Top while he was living in Enoch, Utah which is a suburb of Cedar City, Utah. 
source Related Content: 10 Elegant LDS Temple Dresses You'll Love Wearing Draper Utah Temple Draper LDS Temple Facts Dedicated on 3/20/09 by Thomas S. Monson. The Draper Utah temple has the largest sealing room and can seat up to 80 people. The Utah state flower, the Sego Lily, is a theme in the Draper Utah Temple. They are on the many panes of stained glass windows. Tom Holdman crafted these windows and they survived a fire in his art studio that ruined the entire structure. source Jordan River Utah Temple Jordan River LDS Temple Facts Dedicated by President Marion G. Romney on 11/16/81. 7th temple built in Utah. 20th temple worldwide. Went through extensive renovation in February of 2016. Only temple dedicated by President Marion G. Romney. source There are members who attended this temple's dedication who also attended the dedication of the Salt Lake Temple nearly 100 years earlier. This temple has the tallest angel Moroni statue, which is 20 feet tall. The Jordan River temple is one of five temples to feature an Angel Moroni statue holding the plates. source The Oquirrh Mountain Temple and this temple were the first LDS temples in Utah built in the same city, which is South Jordan. The Jordan River Utah Temple is the highest capacity temple in the Church with six ordinance rooms each seating 122 patrons. source 16 sealing rooms. This temple was originally named the Jordan River Temple, but is now the Jordan River Utah Temple. For a long time the construction and maintenance of this temple was funded solely by donations from the local members. source There is a theme of tear drops which is seen on the spire and fence. source Related Content: A Complete List of Famous Mormon Women Logan Utah Temple Logan LDS Temple Facts Dedicated on 5/17/1884 by President John Taylor. Does not have an angel Moroni. 2nd temple built in Utah. Originally named the Logan Temple. Only temple dedicated by President John Taylor. Interior was completely redone and President Spencer W.
Kimball regretted the need to rebuild the interior because of the loss of pioneer craftsmanship. Dedicated while Utah was a territory, before Statehood was granted, which was 1/4/1896. This temple was the first one to be built where you go room to room for the endowment. source The outside walls used to be a pink/off-white color to cover the rough-hewn, dark limestone. The paint started coming off in the early 1900s and now you can see the bare stone with no paint. There was a fire on December 4, 1917 in the Logan Utah Temple. It was in the southeast staircase and ruined many windows and paintings. It also caused a lot of water damage and smoke damage. The fire started because of some electrical wiring. source There was renovation in 1976 in the ordinance rooms. It got new wallpaper that works for the motion-picture endowment presentation. Heated sidewalks were added in 2009 to melt the snow. "As completion of the temple neared, women in the area were asked to make carpets for the temple, since commercially made carpet could not be bought in Utah at that time. The women spent two months working to hand make 2,144 square yards of carpet." source This temple was built completely by volunteers for 7 years from 1877 to 1884. It has five stories. source Manti Utah Temple Manti LDS Temple Facts Privately dedicated on 5/17/1888 by Wilford Woodruff and publicly dedicated by President Lorenzo Snow on 5/21/1888. 3rd temple built in Utah. Originally named the Manti Temple This temple was built on the Manti Stone Quarry, which was infested with rattlesnakes. source Only temple dedicated by President Lorenzo Snow. Dedicated while Utah was a territory, before Statehood was granted. Minerva Teichert painted the mural in the World Room during a renovation in the 1940's. The Manti Utah Temple is one of two LDS temples in Utah that still employs live acting for presentation of the endowment. (The other is the Salt Lake Temple.) 
Individuals progress through these rooms: Creation Room, Garden Room, World Room, Terrestrial Room, and Celestial Room. source "The two spiral staircases in the Manti Temple are a marvel of design and craftsman ingenuity. It is one of less than a dozen free-standing spiral staircases in the United States. Built in the 1870-80's, by skilled pioneer craftsman, primarily from Scandinavia, the staircases has 151 steps that rise 76 vertical feet with no center support. The spiral staircase on the North circles clockwise and the one on the South circles counterclockwise." source Brigham Young said, "Here is the spot where the Prophet Moroni stood and dedicated this piece of land for a Temple site, and that is the reason why the location is made here, and we can't move it from this spot." source "The Scandinavian background of the workers was obvious in the construction techniques they used. In one part of the temple, a Norwegian boat builder was in charge of designing the ceiling. He had never built a large building before and was not sure how to go about it, so he simply used the design of a boat and turned it upside down." This temple used to be the location for the Holy of Holies before the dedication of the Salt Lake Temple. After that, the room became a sealing room until the late 1970s when it closed. source Until the 1960s, there used to be a tunnel under the east tower. Wagons and cars would go through it. The people of Manti used to joke around and say that the Manti Temple is the only one you don't need a recommend to go through. source In 1928 the east tower was struck by lightning. The fire burned for three hours before it was extinguished. source Monticello Utah Temple Monticello LDS Temple Facts Dedicated on 7/26/98 by Gordon B. Hinckley. Gordon B. Hinckley introduced the idea of a smaller temple and the first one was in Monticello Utah (1998). 
At first it had an all-white Angel Moroni, but it was changed to a gold one because the white was very hard to see. 11th temple built in Utah. 53rd temple worldwide. This was the quickest temple ever built. The construction took eight months and nine days. source This is one of the smallest temples and only has two ordinance rooms and two sealing rooms. Related Content: Teaching Modesty To Children: 14 Latter-Day Saint Moms Weigh In Mount Timpanogos Utah Temple Mount Timpanogos LDS Temple Facts Dedicated on 10/13/96 by Gordon B. Hinckley. Gordon B. Hinckley said that this temple was built to relieve the heavy demands placed on the Provo Utah Temple, which was operating far beyond its designed capacity. At the groundbreaking ceremony, the location of the Madrid Spain Temple was announced. The site where the Mount Timpanogos Utah Temple was built used to be a Church welfare farm. source The floor plan was adapted from the Bountiful, Utah temple plan. Other than the spires, the two LDS temples in Utah are nearly identical. source This temple features keystones, star stones, sun stones, moon stones, and earth stones. Ogden Utah Temple Ogden LDS Temple Facts Dedicated on 1/18/72 by President Joseph Fielding Smith. President Monson rededicated the updated temple on 9/21/14. First temple dedicated in Utah after Utah became a state. First temple built with 6 ordinance rooms. source "At the dedication of the Ogden Utah Temple, President Harold B. Lee finished the remaining one-third of the dedicatory prayer when President Joseph Fielding Smith became too weak from standing so long." source The original Ogden Utah Temple used to look like the Provo Utah Temple before it was redone. See the pictures of the temple before and after it was remodeled here. When this temple was dedicated, it was the 14th temple. Oquirrh Mountain Utah Temple Oquirrh Mountain LDS Temple Facts "Oquirrh" — Goshute Indian word meaning "wooded mountain" or "shining mountains" (pronounced "O-ker"). 
source "It is named Oquirrh Mountain because it was built on the base of the Oquirrh Mountain range. This mountain range is on the west side of the Salt Lake Valley." source "The beautiful centerpiece of the temple's celestial room is a 15-foot-long chandelier made of nearly 20,000 pieces of Swarovski crystal." source From this temple you can also see three other LDS temples in Utah. Those three are Jordan River, Draper, and the Salt Lake Temple. President Monson dedicated the temple on his 82nd birthday and the crowd sang him happy birthday. source Church was canceled in Utah on August 23, 2009 so that members could go to this temple's dedication without having anything get in the way. This was the first time this happened in Utah. source In June 2009, lightning struck the Angel Moroni statue atop the Oquirrh Mountain Utah Temple, damaging Moroni's trumpet, arm and face. A replacement statue was installed on Aug. 11, 2009, 10 days before the dedicatory services began, according to the Deseret News. This temple was the first of the LDS temples in Utah to be built in the same city as another temple, which is the Jordan River Temple in South Jordan, Utah. Payson Utah Temple Payson LDS Temple Facts Dedicated on 6/7/15 by Henry B. Eyring. Geniel Pino, an 81 year old widow, and many other women crocheted and donated beautiful altar cloths for the interior of the temple. source 3rd temple built in Utah County. Payson and other nearby towns were where Ute Indians used to live. source A member donated the land to the church for them to build the temple on. There are flowers on the stained glass windows. At the bottom, they are buds and they turn into full blossoms as you look higher up. There are apple orchards in Payson and they are represented on the Celestial room's furniture and on the stained glass windows. The brides' room has 200 year old chairs from a castle in England. 
source This temple has 5 levels – bottom floor is the baptistry, the 2nd floor has the dressing rooms for endowed members, the 3rd floor has the chapel where you wait before you participate in an endowment session, the 4th floor is where the endowment session takes place and the Celestial room is, and finally, the 5th level has the sealing rooms. Elder Kent F. Richards said, "There is a general significance that we tend to go up towards heaven as we get to the higher ordinances." source In the baptistry, there is a mural that is a copy of the mural found in the Calgary Alberta Temple. This temple has 19 pieces of art that are original. A lot of those were made by local artists. Provo City Center Temple Provo City Center LDS Temple Facts Dedicated on 3/20/16 by Dallin H. Oaks. 16th of the LDS temples in Utah to be built. One of two LDS temples in Utah that does not follow the typical naming convention for temples. For U.S. temples, the name of the temple is the city the temple is located in, followed by the state the temple is in. For temples outside the U.S., it's the same but followed by the country that the temple is in. For example, Calgary Alberta Temple. source This temple is brand new and was built from the ruins of the Provo Tabernacle. When the Provo Tabernacle was being excavated, they found stencil work on the walls beneath the paint. The design of this stenciling was used on the walls of the bride's dressing room. This is the fourth temple that was built from a building that already existed. It was also the second to be built from a tabernacle. The first built from a tabernacle was the Vernal Utah Temple (1997). source More than half the temple is underground. There are three inscriptions of "Holiness to the Lord, the House of the Lord". They are found above the south underground entrance, above the south street level entrance, and above the east door.
This temple fulfills the prophetic words of Elder Holland that "like a phoenix out of the ashes" a new temple would be built "on the ground and out of the loving memory of our beloved tabernacle." source The Gothic Arches, found on the highest floor of the temple in the sealing and instruction rooms, are a unique feature of the Provo City Center Temple. This design is not found in any other LDS temple. This is one of three temples that has four corner towers surrounding the central tower. The other temples that feature this are the Oakland California Temple and the Cochabamba Bolivia Temple. This temple has finials, art glass, and other things that were salvaged from the Provo Tabernacle. same source as above The main entrance of this temple is underground. This is the only temple in the world with that. This temple has a recurring theme of the columbine flower. It is a flower that the Mormon settlers would have seen in the mountains in Utah County as it is native to the Rocky Mountains. "A 4-inch-high, hand-carved piece of the original tabernacle pulpit is now in the temple's chapel. The tabernacle's pulpit was removable and had been moved for a musical performance when the fire occurred in late December 2010." When you go in the ground-level entrance there is a big stained-glass piece of work that has the Savior as a shepherd. It is behind the reception desk. 120 years ago it was in a New York Presbyterian church. A member of the LDS church bought the damaged pieces and donated them to the temple. They were then fixed and restored for use in the Provo City Center Temple. Related Content: 19 Returned Missionaries Give You Their Best Mission Advice Provo Utah Temple Provo LDS Temple Facts Dedicatory prayer written by President Joseph Fielding Smith was read by President Harold B. Lee on 2/9/72. The most used temple in the world, due to BYU and the MTC being within the temple district. 
This temple has more endowment sessions performed every year than any other temple. It has had the most every year since it opened. Originally, this temple did not have an Angel Moroni and the spire was painted gold. This changed when the temple was renovated in 2003 though. The spire was painted white and Angel Moroni was added. source Has 6 ordinance rooms and 12 sealing rooms. This temple was built because the other LDS temples in Utah (Salt Lake, Manti, and Logan) were getting so overcrowded. source Often goes by nicknames such as cupcake and spaceship. This temple was designed to look like and represent the pillar of cloud and pillar of fire that guided the Israelites in the shadow of Mount Sinai. source Salt Lake Temple Salt Lake LDS Temple Facts Dedicated on 4/6/1893 by President Wilford Woodruff. Largest temple in the world. Took 40 years to build. One of three temples dedicated on April 6th (St. George and Palmyra New York are the others). When the Latter-Day Saints first got to the Salt Lake Valley, Brigham Young said this was where they would put the temple after he put his cane in the dry ground. This happened on July 28, 1847. source The construction of the temple was discussed with the Saints at General Conference in 1852. The walls at the base of this temple are nine feet thick and the top walls are six feet thick. This was because Brigham Young wanted to make sure that the temple would last through the Millennium. Building the Salt Lake Temple was fulfilling Isaiah's prophecy: "And it shall come to pass in the last days, that the mountain of the Lord's house shall be established in the top of the mountains, and shall be exalted above the hills; and all nations shall flow unto it" (Isaiah 2:2). source Before getting to the Celestial room, you go through four ordinance rooms. This temple still does live acting for the endowment presentation. This is the first temple to have an Angel Moroni statue where he is standing.
The first plans for this temple had an Angel Moroni on the east central spire and another one on the west side. At the base and meridian on the southeast corner of Temple Square there is a little statue. It was decided on August 3, 1847 that this was the point where the streets were named and numbered. source Sandstone was originally used for the foundation. During the Utah War, the foundation was buried and the lot made to look like a plowed field to prevent unwanted attention from federal troops. After tensions eased in 1858 and work on the temple resumed, it was discovered that many of the foundation stones had cracked, making them unsuitable for use. Not all of the sandstone was replaced, but the inadequate stones were. source The Salt Lake Temple has stone from Little Cottonwood Canyon. It was first brought into Salt Lake by ox, then by railroad. source Wedding parties used to happen behind the sealing rooms inside the temple's walls. If you were visiting the temple for a wedding prior to the 20th century, you would use the east doors, which are no longer used. Brigham Young died before the temple was finished. This temple features a beehive, clasped hands, earthstones, sunstones, moonstones, starstones, cloudstones, shaking hands, an all-seeing eye, towers, squares, circles, and Ursa Major (also known as The Big Dipper). Learn what these symbols mean here. One year before the dedication of this temple, there was a capstone ceremony. It was at this ceremony that the Hosanna shout was first done. It is now a part of all temple dedications. source The temple was damaged by a tornado in 1999, a bombing on 4/10/1910, and another bombing on 11/14/1962. 3-5 million tourists visit this temple each year. source The original cornerstones made of firestone were replaced by granite. There are rooms in the temple that are used for weekly meetings by the First Presidency and the Quorum of the Twelve Apostles.
One of the rooms used for this is the Holy of Holies, which is not seen in other temples. source The temple's granite is from Little Cottonwood Canyon. Truman O. Angell, the architect of the Salt Lake Temple, died before the temple was finished. source Does not follow typical naming convention for temples, otherwise it would have been called the Salt Lake City Utah Temple. Lorenzo Snow gave Joseph Henry Dean, one of the temple's carpenters, the position of temple custodian and said, "we're going to give you this position as custodian of the temple and you can go on and spend your remaining days, nights and life in the temple." source Because the temple could be affected by a future earthquake, 2019 renovations will include installing base isolation systems in an attempt to save it. source On the southeast corner of Temple Square there is a marker that says "United States Meridian Base 1869". United States officials used this spot for an observatory that determined the Mountain Daylight time zone. It also established the standard time throughout the U.S. source This was the first temple to have an Angel Moroni statue. It is a representation of the Book of Mormon. Currently undergoing renovation which should be finished in 2024. Read about the renovation plans here. There are sculpture niches next to 2 doors on the east side. Memorial statues of Joseph Smith and Hyrum Smith were made to go in these niches. They were there for many years, but were moved to the Joseph Smith Memorial Gardens on June 27, 1911. They were originally in the niches because the Jerusalem Temple's entrance was guarded by priests. Cyrus Edwin Dallin sculpted this temple's Angel Moroni statue. He was not a member of the church. source Related Content: 11 Perfect Gift Ideas For LDS Sister Missionaries St. George Utah Temple St. George LDS Temple Facts Dedicated on 4/6/1877 by Daniel H. Wells. The oldest temple used by the Church.
Because the temple sits above underground springs, a solid foundation needed to be built. Workers took lava rock from a nearby mesa and pounded it into the ground with a cannon filled with lead, weighing close to a thousand pounds. The foundation took more than two years to complete. source Women used silk from Utah to make carpets for the hallways and fringe for the pulpits and altars. The men who worked on this temple received half of their pay in cash and the other half in Tithing Office checks. The local members helped one out of every 10 days as tithing labor. source The temple's white exterior symbolizes purity and light. General Conference in 1877 was held at this temple. One of four temples dedicated on April 6th. The others are Salt Lake, Palmyra, and Rio de Janeiro Brazil. source There were two times that Wilford Woodruff was in this temple and had the United States' Founding Fathers come to him and ask why their temple work had not been completed yet. source The temple originally had a "squatty" tower that Brigham Young did not like. Shortly after his death, it was struck by lightning and burned to its base. The tower was rebuilt to be taller and more elegant looking. Has 18 sealing rooms, which is the most out of any temple. source The Founding Fathers came to Wilford Woodruff in the St. George temple and asked that their work be done. From the Journal of Discourses we learn that Wilford Woodruff was baptized by proxy for signers of the Declaration of Independence, and fifty other eminent men, making one hundred in all, including John Wesley, Columbus, and others. Wilford Woodruff baptized Brother McAllister for every President of the United States except Martin Van Buren, James Buchanan and Ulysses S. Grant.
The signers of the Declaration of Independence also had their endowments done for them. source This temple was the last one to be dedicated while Brigham Young was the prophet. It was also the first temple to perform endowments for the dead. source No one died during the temple's construction. Miraculously, John Burt fell 70 feet from the scaffolding and was not expected to live. He did and was back at the building site within two weeks. The temple will undergo renovation beginning November 4, 2019 and is expected to be completed in 2022. See what the renovation plans are here. David Henry Cannon, Jr. said that Brigham Young said, "We will wall it up and leave it here for some future use, but we cannot move the foundation. This spot was dedicated by the Nephites. They could not build the temple, but we can and will build it for them." source Related Content: 15 LDS Modest Fashion Bloggers You'll Love Following Vernal Utah Temple Vernal LDS Temple Facts Dedicated on 11/2/97 by President Gordon B. Hinckley. The Uintah Stake Tabernacle was dedicated in 1907 by Joseph F. Smith. He said that he "would not be surprised if the day would come when a temple would be built in your midst here." This came true when the tabernacle was converted into a temple. Porter Merrill, who was seven years old at the tabernacle's dedication and heard him say that, also attended the temple dedication. source The building that became this temple used to be the Uintah Stake Tabernacle. It was the first time a building was converted into a temple. source In building the temple, 16,000 bricks were taken from the Reader Home to replace cracked and chipped bricks on the temple façade. Granite plaques on the temple show the numbers 1901, 1997, and 1907. The first refers to the year the walls and roof were completed, the second to the year the building was dedicated as a temple, and the third to the year it was dedicated as a tabernacle. source The Angel Moroni statue was first painted gold; later the paint was removed and the statue was given a gold-leaf finish.
Are there any facts that I've missed about any of the LDS temples in Utah? Let me know below in the comments. I'd like to hear from you.
5 Best Psychedelic Horror Games of All Time August 23, 2022 Shina Gaming 5. We Happy Few 4. Layers of Fear 3. The Evil Within 2. Infliction 1. P.T. There's something awfully intriguing about watching a helpless protagonist spiral down a rabbit hole, isn't there? What's better is having said rabbit hole laced with psychedelic elements that knock not only the character for six, but the one tugging on the strings from the comfort of their own homes. And if you're wanting to talk about trippy video games, in general, then you'll be pleased to know that the survival horror genre harbors hallucinatory episodes like they're going out of fashion. It's no surprise that a horror game would only want to trip the player out when concocting its home brew, which is why psychedelics are seen as near-perfect ingredients for most, if not all modern titles. Not only do they keep players on their toes, but they also bring a certain level of unpredictability to the table, which is a sure-fire way of keeping the level of immersion at an all-time high. If that's the sort of thing that whets your appetite, then you should definitely throw yourself at these five entries. We Happy Few – Announce Trailer | PS4 We Happy Few is a psychedelic cocktail of stealth-based survival horror and puzzle-solving, embellished with a mesmerising story-driven campaign. As a game with countless psychedelic components, its players can expect to see an ever-changing world, one that employs a population so merry and kind-hearted, that it's only natural for trespassers to suspect it for being in possession of a much darker reality. We Happy Few revolves around Joy, a drug that tricks its consumer into thinking the world is as peaceful as the townsfolk make it out to be. In reality, though, the junkies that run the town are flesh-eating sociopaths, and it's only after you refuse to take your daily dosage that you begin to see the world for what it really is.
The problem is, people aren't overly happy about you waking up to see the truth, and they're more than happy to do all in their power to put you back on cloud nine, dead or alive. Layers of Fear – Gameplay Trailer | PS4 Layers of Fear brings a fresh lick of paint to a dying art. Literally. As a survival horror game based entirely on the artistic curveballs that budding creators endure, players can witness the heart and soul that goes into a masterpiece, where innovative ideas fuse with the morally bankrupt minds that harness them. In Layers of Fear, players assume the role of an established painter. After losing his dearly beloved to murder, the artist sets out to create the so-called magnum opus, a piece of art that embodies the heart and soul of his late wife. The hollow halls of his former family home, however, tell a different tale — one stoked by betrayal and bloodlust. As players, you must connect the dots and grasp an understanding of the family's history. For every tool you acquire, a new revelation will come to pass, providing you with the muse to paint the truth in black and white. The Evil Within – Launch Trailer The Evil Within is a psychological horror that fuses twisted worlds with the human psyche. As detectives sent to investigate a series of mass killings at a local hospital, players must forage for clues in order to comprehend the fates of those involved. To obtain said clues, you must enter a realm of horror, a place where nightmares come to life in the most horrific ways imaginable. The Evil Within blends its nightmarish world with a classic third-person shooter format. Its setting, which transforms on a constant basis, provides players with a sense of feeling isolated and without guidance. With that, anyone who is brave enough to conquer Beacon Mental Hospital can expect quite the number of curveballs, many of which will push the human mind well beyond its breaking point. 
Infliction: Extended Cut – Console Launch Trailer Infliction is a walking simulator that adds a thick coating of psychological horror to its outer layers. Like other games of its kind, players are tasked with roaming a creepily vacant home with the intention of unmasking a forbidden truth. The truth in question herein revolves around the murder of the protagonist's wife. As it happens, the grieving husband harbors a sinister secret, one that the ghosts of his past wish to extract. It's your role, of course, to aid the looming entities in the twisted fishing expedition for answers. Infliction tells the tale of a shallow partner who's engulfed in the flames of guilt and grievance. With his wife stalking the halls of his former family home, the survivor must learn to atone for his mistakes and, in turn, banish the ghost that only wishes to harm those who wronged her. The problem is, the spirit has the power to not only physically harm her oppressors, but toy with their minds. With that, players can expect one monster of a journey through the rabbit hole. P.T. – Gamescom 2014 Trailer Silent Hills will forever be known as "the one that got away." Due to his conflict with Konami, Hideo Kojima scrapped the would-be survival horror game before it could ever really establish roots in the market. The only thing it left in its wake, of course, was the short but horrifyingly beautiful tech demo, P.T. Although P.T. has since been wiped from storefronts, the experience still holds a permanent residency in players' minds, and forever reminds the community of what could've been. It was a masterclass in building suspense through the power of psychedelia, and something that no other developer in the world could ever dare to replicate, regardless of experience. And to think, this is a status that was achieved from a mere sneak preview of a much larger game.
Say what you will about Kojima, but you can't deny the fact that the guy knows how to utilize the psychedelic elements, even in the smallest doses. So, what's your take? Do you agree with our top five? Are there any psychedelic games we should know about? Let us know in the comment section below!
0 Ricky Canyon Access Road is a $360,000 property on a 36.60 acre lot located in Topaz, NV. Here is your opportunity to obtain your own piece of Nevada wilderness and history! 36.6 acres of rugged beauty surrounded by BLM wilderness area for endless privacy and seclusion. Build your off-grid dream dwelling, use for hunting, wild game, paintball, airsoft or set up your prepper headquarters. With an old cabin on site as well as a patented mine, this property holds endless possibility.
Q: angularjs not caching resource data (even after using cacheFactory) I am trying to cache the response with angularjs but it's not happening. code #1 var app = angular.module("jsonService", ["ngResource"]); app.factory("JsonFactory", function($resource,$cacheFactory) { var cache = $cacheFactory('JsonFactory'); var url = "myurl?domain=:tabUrl"; var data = cache.get(url); if (data==undefined) { var retObj = $resource(url, {}, { list: { method: "GET", cache: true } }); data = retObj; cache.put(url, data); }; return cache.get(url); }); code #2 var app = angular.module("jsonService", ["ngResource"]); app.factory("JsonFactory", function($resource) { var url = "myurl?domain=:tabUrl"; console.log(url); var retObj = $resource(url, {}, { list: { method: "GET", cache: true } }); return retObj; }); After both versions of the code, when looking into dev tools there is always an XHR request in the Network tab. Obviously the data does not change (that's the whole point of caching). A: After reading some of your responses, I think what you are asking is why the network tab shows a 200 response from your server while using angular caching. There are two caches. The first cache is angular's cache. If you see an xhr request in the network tab at all, then that means angular has decided that the url does not exist in its cache, and has asked the browser for a copy of the resource. Furthermore, the browser has looked in its own cache, and decided that the file in its cache does not exist, or is too old. Angular's cache is not an offline cache. Every time you refresh the browser page, angular's caching mechanism is reset to empty. Once you see a request in the network tab, angular has no say in the server response at all. If you're looking for a 304 response from the server, and the server is not providing one, then the problem exists within the server and browser communication, not the client javascript framework.
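That two-layer lookup can be sketched in plain JavaScript. This is an illustration only, not real Angular internals; the names `fetchWithAngularCache` and `doXhr` are hypothetical:

```javascript
// Sketch of why an XHR appears in the Network tab at all: Angular's
// in-memory cache is consulted first, and only on a miss does the
// request reach the browser's networking layer.
function fetchWithAngularCache(cache, url, doXhr) {
  var hit = cache[url];
  if (hit !== undefined) {
    // Served from Angular's cache: no XHR, nothing in the Network tab.
    return { from: "angular-cache", data: hit };
  }
  // This call is the request you see in dev tools.
  var data = doXhr(url);
  // Stored for the lifetime of the page only; a refresh resets this object.
  cache[url] = data;
  return { from: "network", data: data };
}
```

Only the `doXhr` branch is visible in dev tools; hits on the in-memory object never generate a network entry, which matches the behavior described above.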
A 304 response means that the browser has found an old file in its cache and would like the server to validate it. The browser has provided a date, or an etag, and the server has validated the information provided as still valid. A 200 response means that either the client did not provide any information for the server to validate, or that the information provided has failed validation. Also, if you use the refresh button in the browser, the browser will send information to the server that is guaranteed to fail (max-age=0), so you will always get a 200 response on a page refresh. A: According to the documentation for the version of angular that you are using, ngResource does not support caching yet. http://code.angularjs.org/1.0.8/docs/api/ngResource.$resource If you are unable to upgrade your angular version, you may have luck configuring the http service manually before you use $resource. I'm not exactly sure of syntax, but something like this: yourModule.run(function($http) { $http.defaults.cache=true; }); A: $cacheFactory can help you cache the response. 
Try to implement the "JsonFactory" this way: app.factory("JsonFactory",function($resource,$cacheFactory){ var cache=$cacheFactory("JsonFactory"); var url="myurl?domain=:tabUrl"; return{ getResponse:function(tabUrl){ var retObj=$resource(url,{},{list:{method:"GET",cache:true}}); var response=cache.get(tabUrl); //if response is not cached if(!response){ //send GET request to fetch response response=retObj.list({tabUrl:tabUrl}); //add response to cache cache.put(tabUrl,response); } return cache.get(tabUrl); } }; }); And use this service in controller: app.controller("myCtrl",function($scope,$location,JsonFactory){ $scope.clickCount=0; $scope.jsonpTest = function(){ $scope.result = JsonFactory.getResponse("myTab"); $scope.clickCount++; } }); HTML: <script src="//ajax.googleapis.com/ajax/libs/angularjs/1.0.8/angular.min.js"></script> <script src="//ajax.googleapis.com/ajax/libs/angularjs/1.0.8/angular-resource.js"></script> <script src="js/ngResource.js"></script> <body ng-app="app"> <div ng-controller="myCtrl"> <div>Clicked: {{clickCount}}</div> <div>Response: {{result}}</div> <input type="button" ng-click="jsonpTest()" value="JSONP"/> </div> </body> [EDIT] for html5 localStorage solution JSBin Demo .factory("JsonFactory",function($resource){ var url="ur/URL/:tabUrl"; var liveTime=60*1000; //1 min var response = ""; return{ getResponse:function(tabUrl){ var retObj=$resource(url,{},{list:{method:"GET",cache:true}}); if(('localStorage' in window) && window.localStorage !== null){ //no cached data if(!localStorage[tabUrl] || new Date().getTime()>localStorage[tabUrl+"_expires"]) { console.log("no cache"); //send GET request to fetch response response=retObj.list({tabUrl:tabUrl}); //add response to cache localStorage[tabUrl] = response; localStorage[tabUrl+"_expires"] = new Date().getTime()+liveTime; } //console.log(localStorage.tabUrl.expires+"..."+new Date().getTime()); return localStorage[tabUrl]; } //client doesn't support local cache, send request to fetch response 
response=retObj.list({tabUrl:tabUrl}); return response; } }; }); Hope this is helpful for you.
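For what it's worth, the expiry bookkeeping in the localStorage answer above boils down to storing a value together with an expiry timestamp. Here is a minimal, framework-free sketch of that idea — the `TTLCache` name and the explicit `now` parameter are my own, not part of the answer's code:

```javascript
// Minimal time-to-live cache mirroring the localStorage-with-expiry idea above.
// A plain object stands in for localStorage so the sketch runs anywhere.
function TTLCache(liveTimeMs) {
  this.liveTimeMs = liveTimeMs;
  this.store = {};
}

// Store a value with an expiry timestamp, like tabUrl + tabUrl_expires above.
TTLCache.prototype.put = function (key, value, now) {
  this.store[key] = { value: value, expires: now + this.liveTimeMs };
};

// Return the cached value, or undefined when missing or expired.
TTLCache.prototype.get = function (key, now) {
  var entry = this.store[key];
  if (!entry || now > entry.expires) {
    delete this.store[key];
    return undefined; // miss: caller should fetch and put()
  }
  return entry.value;
};

var cache = new TTLCache(60 * 1000); // 1 minute, like the answer's liveTime
cache.put("myTab", "response-body", 0);
console.log(cache.get("myTab", 30 * 1000)); // hit: within the TTL
console.log(cache.get("myTab", 61 * 1000)); // expired -> undefined
```

Passing `now` explicitly keeps the sketch deterministic; real code would use `new Date().getTime()` as the answer does.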
{ "redpajama_set_name": "RedPajamaStackExchange" }
9,452
Bo's daughter succumbed to a rare liver disease in 2003, three years after she graduated from Marquette. So he and his wife, Candy, started the Nicole Ellis Foundation, which is funded mainly through private gifts and the annual Warrior Golf Outing. The foundation also provides an annual scholarship through Marquette's Ethnic Alumni Association and supports the American Liver Foundation. Kevin is an institution. As senior vice president of public and community relations for the Baltimore Ravens, an organization he has been with for 31 years, he oversees an award-winning staff responsible for the team's website and most of the print publications. He also has provided media relations assistance at 24 Super Bowls and two Pro Bowls and was Marquette's sports information director during the final years of Al McGuire's tenure. As a season ticket holder for Marquette men's and women's basketball, Craig and his wife, Wendy, don't miss a game. And as a major supporter of Marquette athletics, Craig doesn't miss an opportunity to help the university's student-athletes and proud athletic tradition continue to soar to greater heights. Playing basketball at Marquette was just the beginning of Tom's athletic connection to the university. He served four terms as M Club president, the longest in history; was a member of the men's basketball alumni reunion committees; is on the Marquette Athletics Board; and, with wife Pat, has committed to establishing endowed scholarships for men's and women's basketball through their estate plans.
{ "redpajama_set_name": "RedPajamaC4" }
971
Reflexology is based on the principle that reflex points on the soles, tops, and sides of the feet correspond to different areas of the body. In this way, the feet can be seen as a 'map' of the body. By applying specialised massage techniques to specific reflex points – using the thumbs, fingers and knuckles – the aim of a reflexology treatment is to help restore balance to the body naturally, and improve the client's general well-being. We start with a full consultation, asking various questions about your health and lifestyle, to ensure reflexology is right for you. For the treatment itself you will remain fully clothed, simply removing your shoes and socks. You'll be invited to relax on a reclining chair or treatment couch, or to put your feet up on a footstool. We will gently clean your feet before applying a fine powder, cream, or oil, to help provide a free-flowing treatment, and then start gently massaging and stretching your feet and ankles. As the treatment progresses, a variety of different reflexology techniques will be used to 'work' the reflex points on each foot, including a caterpillar-like movement called 'thumb walking'. The areas treated and pressure applied will be adapted to suit your individual needs. Treatment generally lasts for 45 minutes to an hour, though shorter reflexology sessions may be more appropriate in some instances.
{ "redpajama_set_name": "RedPajamaC4" }
144
<?php namespace Sturdy\Activity\Meta\Type; use stdClass; /** * Class BooleanType * @package Sturdy\Activity\Meta\Type */ final class BooleanType extends Type { const type = "boolean"; /** * Constructor * * @param string|null $state the objects state */ public function __construct(string $state = null) { } /** * Get descriptor * * @return string */ public function getDescriptor(): string { return self::type; } /** * Set meta properties on object * * @param stdClass $meta * @param array $state */ public function meta(stdClass $meta, array $state): void { $meta->type = self::type; } /** * Filter value * * @param &$value bool the value to filter * @return bool whether the value is valid */ public function filter(&$value): bool { if (is_string($value)) $value = trim($value); $boolean = filter_var($value, FILTER_VALIDATE_BOOLEAN, FILTER_NULL_ON_FAILURE); if ($boolean === null) return false; $value = $boolean; return true; } }
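As context for the `filter()` method above: PHP's `filter_var()` with `FILTER_VALIDATE_BOOLEAN` and `FILTER_NULL_ON_FAILURE` roughly maps "1"/"true"/"on"/"yes" to true, "0"/"false"/"off"/"no" to false, and anything unrecognized to null. A rough JavaScript sketch of that mapping — the `filterBoolean` name is mine, and the token list is an approximation of PHP's behaviour, not an exact port:

```javascript
// Rough JavaScript analogue of PHP's
// filter_var($v, FILTER_VALIDATE_BOOLEAN, FILTER_NULL_ON_FAILURE):
// returns true/false for recognized tokens, null on failure.
function filterBoolean(value) {
  if (typeof value === "boolean") return value; // booleans pass through
  var v = String(value).trim().toLowerCase();   // trim, like the class above
  if (v === "1" || v === "true" || v === "on" || v === "yes") return true;
  if (v === "0" || v === "false" || v === "off" || v === "no") return false;
  return null; // failure, like FILTER_NULL_ON_FAILURE
}

console.log(filterBoolean(" yes ")); // true
console.log(filterBoolean("off"));   // false
console.log(filterBoolean("maybe")); // null
```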
{ "redpajama_set_name": "RedPajamaGithub" }
1,026
How 'Little Women' Star Florence Pugh Paid Tribute to Late Nick Cordero Following His Death By Busayo Ogunjimi https://news.amomama.com/216889-one-nick-corderos-last-performances-vide.html Actor and musician Nick Cordero recently passed away. To honor the death of the Broadway star, Florence Pugh, from "Little Women," shared a clip of one of his last performances. Hollywood mourns the death of 41-year-old Nick Cordero, who suffered complications following his treatment for the coronavirus. Pugh's post featured Cordero on stage, entertaining a cheering audience. Her caption gave an insight into the events surrounding the deceased and his family before his death. Nick Cordero and Amanda Kloots at the 2017 Drama Desk Awards on June 4, 2017. | Photo: Getty Images Pugh first called Cordero one of the "great ones," and a "friend," then she shared how the Broadway star and his wife of three years, Amanda Kloots, had battled the virus since March. Though Kloots tested negative, she had been at her husband's bedside for about 90 days, and through the different stages of his health after he tested positive for the novel coronavirus. The caption under the post revealed that the 41-year-old had no pre-existing condition, and that during the April performance shown in the clip, his wife, who was 7 months pregnant at the time, enjoyed her husband's onstage performance. In conclusion, she said: "It is so shocking and devastating to see one of your own come down as hard as he did...What can WE do to help?… Help the world by continuing to take this virus seriously." Pugh advised her Instagram followers to wear their masks, practice social distancing, and wash their hands regularly. Sources reported that Cordero died at the Cedars-Sinai Medical Center, Los Angeles, on Sunday at 11:40 am. His wife shared an emotional post about her deceased husband, who is also the father of her 1-year-old son, Elvis Eduardo. 
Her post showed that she was grieving on the inside, as she mentioned, but also felt peace, as she believes Cordero is in heaven. Kloots stated that he was a fantastic entertainer and father. She went on to thank those who have supported her family. The mother of one has been consistent in filling the public in on details concerning her husband's health right from day one, when he was admitted to the medical center. As time went on, Cordero's situation in the hospital became critical. He suffered two mini-strokes and had some of his body parts amputated. Kloots' family got the deserved support from fans, and a GoFundMe account was set up to take care of her husband's medical expenses. Around the country, celebrities and friends have shared their pain at the heartbreaking news and have tried to reach out to Cordero's wife. We at news.AmoMama.com do our best to give you the most updated news regarding the COVID-19 pandemic, but the situation is constantly changing. We encourage readers to refer to the online updates from CDC, WHO, or local health departments to stay updated. Take care!
{ "redpajama_set_name": "RedPajamaCommonCrawl" }
4,833
Boeing agrees $200m penalty for misleading investors Transport & Aviation / 23rd September 2022 / Ed McKenna Former Boeing chief executive Dennis A Muilenburg is to pay a penalty of $1m in settlement of charges that he and the company violated US antifraud laws by withholding knowledge of the causes of crashes of the Boeing 737 Max. Boeing itself will have to pay $200m. Both the company and Muilenburg neither admit nor deny the findings of an investigation by the US Securities and Exchange Commission, but agreed to pay the penalties and consented to a cease and desist order. Two fatal crashes by 737 Max jets took 346 people's lives, led to the grounding of the entire fleet, and investigations around the world. The SEC probe was into statements made by the company and its former chief executive which could have affected the company's share price, and the implications for investors in the company. According to the SEC, after the first crash, Boeing and Muilenburg knew that the new airplane's MCAS control system "posed an ongoing airplane safety issue, but nevertheless they assured the public that the 737 Max airplane was 'as safe as any airplane that has ever flown the skies'. "Later, following the second crash,Boeing and Muilenburg assured the public that there were no slips or gaps in the certification process with respect to MCAS, despite being aware of contrary information," the SEC concluded "There are no words to describe the tragic loss of life brought about by these two airplane crashes," said SEC chair Gary Gensler. "In times of crisis and tragedy, it is especially important that public companies and executives provide full, fair, and truthful disclosures to the markets. "The Boeing Company and its former CEO failed in this most basic obligation. They misled investors by providing assurances about the safety of the 737 MAX, despite knowing about serious safety concerns. 
The SEC remains committed to rooting out misconduct when public companies and their executives fail to fulfil their fundamental obligations to the investing public." According to the SEC's order, one month after Lion Air Flight 610 — a 737 MAX — crashed in Indonesia in October 2018, Boeing issued a press release, edited and approved by Muilenburg, that selectively highlighted certain facts from an official report of the Indonesian government suggesting that pilot error and poor aircraft maintenance contributed to the crash. The press release also gave assurances of the airplane's safety, failing to disclose that an internal safety review had determined that MCAS posed an ongoing "airplane safety issue" and that Boeing had already begun redesigning MCAS to address that issue. "Boeing and Muilenburg put profits over people by misleading investors about the safety of the 737 MAX all in an effort to rehabilitate Boeing's image following two tragic accidents that resulted in the loss of 346 lives and incalculable grief to so many families," said Gurbir Grewal, director of the SEC's enforcement division. "Public companies and their executives must provide accurate and complete information when they make disclosures to investors, no matter the circumstances. When they don't, we will hold them accountable, as we did here." The SEC's orders against Boeing and Muilenburg find that they negligently violated the antifraud provisions of federal securities laws. A Fair Fund will be established for the benefit of harmed investors. The US Justice Department fined Boeing $2.5 billion last year, on charges of fraud and conspiracy in connection with the two crashes. Authorities said the company had engaged "in an effort to cover up their deception" and chose "the path of profit over candor by concealing material information" from the Federal Aviation Administration, the aviation regulator.
{ "redpajama_set_name": "RedPajamaCommonCrawl" }
7,237
This E3M one-day seminar is aimed at CEOs of social enterprises, Chairs and non-Executive Directors. Building on E3M's previous work in this area, it explores the key issues for successful governance, for social enterprises seeking to maximise their impact. Combining expert input and case studies, the day will be highly participative, drawing out and sharing the experience and learning of those taking part. The event is being developed with E3M's core partners, BWB and Numbers for Good, and with special support from Big Society Capital and Telos Partners. How happy are you with your impact? What's getting in the way of maximising impact? To what extent do impact considerations drive your decision making? Do we take enough risks? Do we invest enough? Should we invest more? What stops us from investing for impact? How ready are we to invest for impact? How capable are we at assessing and managing risk? Reflections on good governance/board practice – e.g. Clarity of purpose; Organisation (diversity, size, structure, development, succession, processes etc.); Dynamics (relationships, behaviours). Role and Purpose, including e.g. Process and structure, including e.g. What's on the agenda? What SHOULD be on the agenda? How do we make decisions? Consensus? Governance through different phases of organisational development and growth. Opportunity to reflect on key points from the day and identify actions, next steps, what we are going to do differently.
{ "redpajama_set_name": "RedPajamaC4" }
6,993
Where Americans Are Moving The red states may have lost the presidential election, but they are winning new residents, largely at the expense of their politically successful blue counterparts. For all the talk of how the Great Recession has driven people — particularly the "footloose young" — toward dense urban centers, Census data reveal that Americans are still drawn to the same sprawling Sun Belt regions as before. Flocking Elsewhere: The Downtown Growth Story The United States Census Bureau has released a report (Patterns of Metropolitan and Micropolitan Population Change: 2000 to 2010) on metropolitan area growth between 2000 and 2010. The Census Bureau's news release highlighted population growth in downtown areas, which it defines as within two miles of the city hall of the largest municipality in each metropolitan area. A Look at Commuting Using the Latest Census Data Continuing my exploration of the 2011 data from the American Community Survey, I want to look now at some aspects of commuting. Public transit commuting remains overwhelmingly dominated by New York City, with a metro commute mode share for transit of 31.1%. There are an estimated 2,686,406 transit commuters in New York City. All other large metro areas (1M+ population) put together add up to 3,530,932 transit commuters. New York City metro accounts for 39% of all transit commuters in the United States. Facebook's False Promise: STEM's Quieter Side Of Tech Offers More Upside For America Facebook's botched IPO reflects not only the weakness of the stock market, but a systemic misunderstanding of where the true value of technology lies. A website that, due to superior funding and media hype, allows people to do what they were already doing — connecting on the Internet — does not inherently drive broad economic growth, even if it mints a few high-profile billionaires. 
The Best Cities For Tech Jobs With Facebook poised to go public, the attention of the tech world, and Wall Street, is firmly focused on Silicon Valley. Without question, the west side of San Francisco Bay is by far the most prodigious creator of hot companies and has the highest proportion of tech jobs of any region in the country — more than four times the national average. Yet Silicon Valley is far from leading the way in expanding science and technology-related employment in the United States. Smart Growth: The Maryland Example This is Part Two of a two-part series. Evidence that people just don't like Smart Growth is revealed in findings from organizations set up to promote Smart Growth. In 2009, the Washington Post reported, "Scholars at the National Center for Smart Growth Research and Education found that over a decade, smart growth has not made a dent in Maryland's war on sprawl." Smart Growth and The New Newspeak It's a given in our representative system that policies adopted into law should have popular support. However, there is a distinction to be made between adopting a policy consistent with what a majority of people want, and pushing a policy while making dubious claims that it harnesses "the will of the people." The Great Reordering of the Urban Hierarchy A delegation from Chicago is in Brussels this week to sell the city as a tourist destination in advance of the forthcoming NATO Summit. A Phil Rosenthal column explains that the city has a long way to go: The Expanding Wealth Of Washington Throughout the brutal and agonizingly long recession, only one large metropolitan area escaped largely unscathed: Washington, D.C. The city that wreaked economic disasters under two administrations last year grew faster in population than any major region in the country, up a remarkable 2.7 percent. 
The continued steady growth of the Texas cities, which dominated the growth charts over the past decade, pales by comparison. Arlington and Shenzhen: A Tale of Two Cities by Iqbal Ahmed 02/16/2012 Seven thousand miles separate Arlington, Virginia and Shenzhen, China. Two continents apart, these two cities could not be more different. Yet they are similar, geopolitically and globally. The characteristics of today's globalization have united and connected cities like Arlington and Shenzhen.
{ "redpajama_set_name": "RedPajamaCommonCrawl" }
2,516
Cloud Computing and the Hype Cycle We are about 6 months into the hype cycle, with a lot left to go. What will the industry be feeling over the coming months and years? There was an article today at ComputerWeekly, called "The end of the IT Department, is it in the cloud?" In it, Tom Austin of Gartner says he expects uptake of cloud services to increase dramatically by 2013. If it is true that the hype-cycle is cresting, there will be 5 years of trough, which is unlikely. We haven't really gotten started with cloud computing and there is a lot of hype left to be generated. There was a lot more hype around Web 2.0 in my opinion and cloud computing has a lot more substance. Austin agrees saying, cloud computing is "probably the single biggest magnitude wave of change that we've ever seen." The situation now though, is that those on the edge of IT are getting tired of hype in general. It started with the dot com boom, then Web 2.0, Social Networking, and now there are a lot more skeptics out there. Anything getting buzz in the industry is going to get a lot more scrutiny as well -- and faster. There are a lot of parallels between the hype around Web 2.0, Social Networking, and Cloud Computing, but the significant difference is that cloud computing is more difficult for the consumer to understand, because few of them interact directly with cloud computing services. For them, cloud computing is mostly behind the scenes. They use and interact with Web 2.0 websites and social networks, but they know little of the infrastructure behind the systems. For that single reason, the buzz will not reach such dramatic levels as Web 2.0. Still though, "cloud computing" only really got started at the end of 2007, it is premature to be suggesting we've reached the crest of the wave. We have yet to really understand what cloud computing is and all the benefits it will offer society. 
Most conversations about the topic outside conferences are still trying to explain exactly what cloud computing is. Until we understand what it is, how can we comprehend the magnitude of the influence? You could say we don't understand it enough and therefore expect too much, but the opposite is true.
{ "redpajama_set_name": "RedPajamaCommonCrawl" }
1,880
Resource: HRSA 2019-2020 Health Equity Report Materials & News / Health Equity, Resources The Health Resources and Services Administration (HRSA) Office of Health Equity released its 2019-2020 Health Equity Report on October 14. The report has a special feature on housing and health inequalities in the U.S. and shows the impact of housing status and housing conditions – a key social determinant – on population health and health equity. The report indicates that substantial progress has been made nationally for all Americans on life expectancy, cardiovascular disease, cancer, diabetes, and influenza and pneumonia. However, health inequities between population groups and geographic areas persist. Key take-aways from the report include: Life expectancy for all Americans increased from 68.2 years in 1950 to 78.6 years in 2017; however, American Indians/Alaska Natives (AIAN) and Blacks life expectancy was 74.3 and 76.0 years respectively. Homeless patients (28%) are significantly more likely to report serious psychological distress than community health center patients (14%) and public housing primary care center patients (16%). Children living in subsidized housing units are four times more likely to be in fair/poor health than those living in owner-occupied homes (4.4% vs. 1.1%). The report findings will help HRSA and its partners to improve health outcomes and address health disparities through access to quality services, a skilled health workforce and innovative, high-value programs.
{ "redpajama_set_name": "RedPajamaCommonCrawl" }
9,127
Voltage Sequencer for Moog Mother-32 Made from 4040 Counter, 4051 Multiplexer, and a 40106 Oscillator « Adafruit Industries – Makers, hackers, artists, designers and engineers! This is a really simple and straightforward voltage sequencer (that sounds great!). Sebastian from little-scale demonstrates this using his Moog Mother-32 but it's totally applicable for other devices as well. A simple voltage sequence for Moog Mother-32 and other devices. The cost is very low, and the concept is somewhat expandable. No additional power supply is used. Instead, the VC Mix Output is used to power the sequencing circuit, consisting of a 40106 oscillator, a 4040 counter and a 4051 multiplexer.
{ "redpajama_set_name": "RedPajamaC4" }
1,768
<?xml version="1.0" encoding="utf-8"?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN" "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" xmlns:xhtml="http://www.w3.org/1999/xhtml" xmlns:html="http://www.w3.org/1999/xhtml" xml:lang="en"><!--THIS IS A GENERATED FILE. DO NOT EDIT (2)--><head><title> </title><meta name="author" content="Herbert Weir Smyth"/><meta name="generator" content="Text Encoding Initiative Consortium XSLT stylesheets"/><meta name="DC.Title" content="&#xA; &#xA; "/><meta name="DC.Type" content="Text"/><meta name="DC.Format" content="application/xhtml+xml"/><meta http-equiv="Content-Type" content="application/xhtml+xml; charset=utf-8"/><link href="tei.css" rel="stylesheet" type="text/css"/><link href="alph-tei.css" rel="stylesheet" type="text/css"/></head><body id="TOP"><div class="Chapter" id="body.1_div1.4_div2.5"><h2 class="institution"/><h2 class="department"/><h1 class="maintitle"> AGREEMENT: THE CONCORDS </h1><p class="right"><span class="upLink"> Up: </span><a class="navigation" href="body.1_div1.4.html">Part IV: Syntax</a><span class="previousLink"> Previous: </span><a class="navigation" href="body.1_div1.4_div2.4.html"> EXPANSION OF THE SIMPLE SENTENCE </a><span class="nextLink"> Next: </span><a class="navigation" href="body.1_div1.4_div2.6.html"> THE SUBJECT </a></p> <div class="smythp" id="s925"><h4>925</h4> <p>There are three concords in simple sentences: </p> <p>1. A finite verb agrees with its subject in number and person (<a class="link_ref" href="body.1_div1.4_div2.10.html#s949" title="">949</a>). </p> <p>2. A word in apposition with another word agrees with it in case (<a class="link_ref" href="body.1_div1.4_div2.10.html#s976" title="">976</a>). </p> <p>3. An adjective agrees with its substantive in gender, number, and case (<a class="link_ref" href="body.1_div1.4_div2.14.html#s1020" title="">1020</a>). 
</p> <p>(For the concord of relative pronouns, see <a class="link_ref" href="body.1_div1.4_div2.30.html#s2501" title="">2501</a>.) <span id="smyth-pb-d0e172865"/> </p> </div> <div class="smythp" id="s926"><h4>926</h4> <p>Apparent violation of the concords is to be explained either by </p> <p><b>a.</b> <i>Construction according to sense</i>, where the agreement is with the real gender or number (e.g. 949 a, 950-953, 958, 996, 997, 1013, 1044, 1050, 1055 a, 1058 b); or by </p> <p><b>b.</b> <i>Attraction</i>, when a word does not have its natural construction because of the influence of some other word or words in its clause (e.g. 1060 ff., 1239, 1978, 2465, 2502, 2522 ff.). This principle extends to moods and tenses (2183 ff.). </p> </div> <div class="stdfooter"><hr/><div class="footer"/><hr/><address> Herbert Weir Smyth. <br/><div xmlns="" class="rights_info"><p class="cc_rights"><a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/"><img alt="Creative Commons License" style="border-width:0" src="http://creativecommons.org/images/public/somerights20.png"/></a><br/>This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/">Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License</a>. </p></div><!-- Generated from smyth using an XSLT version 1 stylesheet based on http://www.tei-c.org/Stylesheets/teic/tei.xsl processed using SAXON 6.5.5 from Michael Kay on 2008-11-24T11:31:18-05:00--></address></div><div xmlns="" class="perseus-funder">The National Endowment for the Humanities provided support for entering this text.</div><div xmlns="" class="perseus-publication">XML for this text provided by <span class="publisher">Trustees of Tufts University</span><span class="pubPlace">Medford, MA</span><span class="authority">Perseus Project</span></div></div></body></html>
{ "redpajama_set_name": "RedPajamaGithub" }
6,878
1. We additionally plan to keep servers open all of the time between checks, so you may play as much Fortnite as you like. This natural strategy of putting hurdles in the path of the enemy has become quite a widespread technique and has even been replicated in others that followed. Looking back at how blown away I was by Fortnitemares, or even the Survive the Storm element of Save the World, I can't help but be incredibly disillusioned. You can easily find these gaming web sites with the help of search engines at great ease. There are a large number of online pet games for sale - some are for mobile devices, some are for PCs and some can be played online from any system. It is getting them to function which might be the real trick. I purchased the game fairly early on and had a very hard time getting into it; Fortnite Battle Royale happened and, as they say, the rest is history. Getting Fortnite Mobile for PC is actually easy. After all, good sites to download video games online do not come for free. HOW TO GET FREE V-BUCKS IN THE FORTNITE GAME Generally speaking, you can get V-bucks by looking for them in the Microsoft Store. The discussion thread goes on further on ways to manage their game queue, to get the games you want. That was confirmed in a tweet by Nick Chester, PR manager at Fortnite's publisher Epic Games. During a Nintendo Direct today, Epic revealed that Fortnite is coming to Nintendo Switch, and it is going to be available immediately. You will be on the same level with another ninety-nine online players, where they must collect weapons as well as building and defense parts. Here you're going to get very useful info and, most important, cheats and secrets. Get the Fortnite Generator! This gives players a reason to keep playing every day so they can get new clothes or even make a fast buck on the Steam market. That makes using our services quite simple and enjoyable like no other site. 
Dora the Explorer online games can be fun for your little one to play, and they are often an ideal way to spend time together.
{ "redpajama_set_name": "RedPajamaC4" }
624
A few weeks back I asked you, my friends, colleagues, and people from the Scrum community, to support me during the Scrum Alliance Board of Directors elections. I want to use this opportunity to thank you for all the support you gave me and let you know that I'm officially on the board; my term as a member of the Board of Directors of Scrum Alliance officially started January 1, 2017. I'm honored to be elected to the Scrum Alliance Board of Directors, because it's where the strategic direction of the organization is decided. We have responsibilities to the entire global membership, and promoting cooperation and transparency will help us strengthen the adoption and implementation of Agile and Scrum. Increasing our strategic and dynamic range requires leaving our comfort zones, and that's what I teach and practice. My experience training and transforming organizations has shown the importance of respect, integrity, and openness. I'm passionate about creating better communities, with a work and life environment that brings happiness as well as success. Thank you all for the support, and I'll do my best to help Scrum Alliance and the whole Scrum community achieve its mission of transforming the world of work.
Prevents the need for knee replacement surgery with scientific and cost-effective Ayurveda techniques. Low back pain and sciatica. Swelling and pain in joints due to the deposition of uric acid crystals. Gentle healing to improve and restore functionality after any trauma or accident, and fibromyalgia. Obesity, hypertension (high BP), high cholesterol, diabetes. Panchakarma treatments are used especially to prevent the macro- and microvascular complications of diabetes. Asthma, sinusitis, allergic cold/cough, chronic bronchitis, repeated attacks of tonsillitis. Irritable bowel syndrome, hyperacidity and/or gas, peptic ulcer, chronic constipation, and other chronic diseases. Some conditions, however, need to be treated at an early stage for treatment to be effective. And some conditions may be managed in an integrated fashion for the best benefit of the patient. To discuss your problems and the treatment we provide, mail us at ayuskamarishikesh@gmail.com.
Q: procmail to deliver mail to 2 maildir locations

Now I have Postfix --> procmailrc --> maildir --> dovecot, and I need another maildir in a folder inside /root as a backup, while at the same time keeping $HOME/Maildir. Both the first and second locations should receive mail in maildir format. How can I instruct procmail to deliver to a second maildir in another location under /root? If this second location needs to be on another server, how can I send the mail there? My current procmailrc file is at /etc and contains:

:0fw: spamassassin.lock
* < 256000
| /usr/bin/spamassassin

:0
* ^X-Spam-Status: Yes
/home/spammail/Maildir/

LOGFILE="/var/log/procmail.log"
DEFAULT="$HOME/Maildir/"
MAILDIR="$HOME/Maildir/"

Your help expected. Thanks, lasantha

A: You can use the c flag (carbon copy), which delivers a copy and then continues processing. Add these lines before the one with :0:

:0c:
/your/backup/Maildir/
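Putting the answer's pieces together, a full recipe could look like the sketch below. The backup path under /root and the remote address are illustrative placeholders, not values from the question; for the second-server case, procmail's ! action forwards a copy by mail instead of writing to a local maildir:

# Hypothetical procmail recipes -- /root/backupmail and
# backup@backup.example.com are illustrative placeholders.

# Deliver a carbon copy (c flag) to a second local maildir;
# the trailing slash tells procmail to store in maildir format.
:0c:
/root/backupmail/Maildir/

# If the backup must live on another server, forward a copy there
# instead; the ! action hands the message to sendmail.
:0c
! backup@backup.example.com

# Processing continues, so the message still lands in DEFAULT
# ($HOME/Maildir/) as before.

Note that the maildir under /root must be writable by the user procmail runs as; with a system-wide /etc/procmailrc that is usually the delivering user, so permissions are worth checking.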
Wells, Jenny C. and Bryan G. Cook. "Special Education Identification." In The SAGE Encyclopedia of Educational Research, Measurement, and Evaluation, edited by Bruce B. Frey, 1561-1562. Thousand Oaks, CA: SAGE Publications, Inc., 2018. doi: 10.4135/9781506326139.n648. Special education identification refers to the process for determining whether children and youth are eligible to receive services under the Individuals with Disabilities Education Improvement Act, a federal law commonly referred to as IDEA. The primary intent of the law is to ensure that all children with disabilities receive a free and appropriate public education. The IDEA includes a Child Find mandate that requires states to develop and implement a plan to seek, screen, and identify all children and youth between the ages of 3 and 21 years who may have a disability.
Developmental pattern of 3-oxo-Δ4 bile acids in neonatal bile acid metabolism

Toshiro Inoue (a), Akihiko Kimura (a), Kumiko Aoki (b), Masahiko Tohma (c), Hirohisa Kato (a)

(a) Department of Paediatrics and Child Health, Kurume University School of Medicine, Fukuoka, Japan; (b) Research Institute of Medical Mass Spectrometry; (c) Faculty of Pharmaceutical Sciences, Health Sciences University of Hokkaido

Correspondence: Dr Akihiko Kimura, Department of Paediatrics and Child Health, Kurume University School of Medicine, 67 Asahi-machi, Kurume 830, Japan.

AIMS To investigate whether a fetal pathway of bile acid synthesis persists in neonates and infants. METHODS 3-oxo-Δ4 bile acids were determined qualitatively and quantitatively in the urine, meconium, and faeces of healthy neonates and infants, using gas chromatography–mass spectrometry. RESULTS The mean percentage of 3-oxo-Δ4 bile acids in total bile acids in urine at birth was significantly higher than that at 3 or 7 days, and at 1 or 3 months of age. The concentration of this component in meconium was significantly higher than that in faeces at 7 days and at 1 or 3 months of age. CONCLUSIONS The presence of large amounts of urinary 3-oxo-Δ4 bile acids may indicate immaturity in the activity of hepatic 3-oxo-Δ4-steroid 5β-reductase in the first week of postnatal life. Large amounts of this component in meconium may be due to the ingestion of amniotic fluid by the fetus during pregnancy.
Keywords: ketonic bile acid; 3-oxo-Δ4 bile acid; 3-oxo-Δ4-steroid 5β-reductase; meconium; gas chromatography–mass spectrometry. http://dx.doi.org/10.1136/fn.77.1.F52

Infants with severe liver disease have a high urinary excretion of such ketonic bile acids as 3-oxo-Δ4 bile acids (7α, 12α-dihydroxy-3-oxochol-4-en-24-oic acid; Δ4-3-one, 12α-hydroxy-3-oxochola-4,6-dien-24-oic acid; Δ4,6-3-one).1 A newly identified inborn error in bile acid synthesis, 3-oxo-Δ4-steroid 5β-reductase (5β-reductase) deficiency, is associated with idiopathic neonatal hepatitis syndrome.2 Because the amount of ketonic bile acids was suspected to be related to the severity of hepatic damage, the concentrations of bile acids were examined in various biological fluids obtained from patients with liver disease and from normal controls.1-5 Ketonic bile acids were detected in amniotic fluid collected from healthy pregnant women for studies of bile acid metabolism.3 However, little is known about the amounts of ketonic bile acids present in the biological fluids of neonates and young infants. Our objective was to evaluate qualitatively and quantitatively 3-oxo-Δ4 bile acids in urine, meconium, and faeces from normal neonates and young infants using gas chromatography–mass spectrometry (GC–MS) with selected ion monitoring (SIM). These healthy neonates (mean gestational age: 40 weeks, range 39–41 weeks; mean birthweight: 3050 g, range 2580–3655 g) showed normal development. Spot urine samples were collected from 24 infants (8 boys and 16 girls), six in each of six age groups: 0, 3, and 7 days and 1, 2, and 3 months. The concentration of individual bile acids in urine was corrected for creatinine concentration (μmol/mmol of creatinine) in each case. Meconium (five specimens) was obtained from healthy neonates (three boys and two girls). Spot faecal samples were obtained from 13 healthy infants (seven boys and six girls) at the following ages: 7 days (n=6), 1 month (n=2), and 3 months (n=5).
None of the subjects had a history of, or showed signs of, hepatobiliary or gastrointestinal disease. Infants were breastfed until 2 months of age. Thereafter, the diet consisted of breast milk supplemented with formula or baby food. We synthesised the following agents, as described before: 1β, 3α, 7α, 12α-tetrahydroxy-5β-cholan-24-oic acid (CA-1β-ol), 1β, 3α, 7α-trihydroxy-5β-cholan-24-oic acid (CDCA-1β-ol), 3β-hydroxy-chol-5-en-24-oic acid (Δ5-3β-ol), 7α, 12α-dihydroxy-3-oxo-5β-chol-1-en-24-oic acid (Δ1-3-one), Δ4-3-one, Δ4,6-3-one, and 3α, 7α-dihydroxy-24-nor 5β-cholan-23-oic acid (Nor-CDCA) (6-8). 3α, 7α, 12α-trihydroxy-5β-cholan-24-oic acid (cholic acid; CA), 3α, 7α-dihydroxy-5β-cholan-24-oic acid (chenodeoxycholic acid; CDCA), 3α, 6α, 7α-trihydroxy-5β-cholan-24-oic acid (hyocholic acid; HCA), and 3α, 7β-dihydroxy-5β-cholan-24-oic acid (ursodeoxycholic acid; UDCA) were obtained from Sigma Chemical Co. (St Louis, MO, USA). 3α, 12α-dihydroxy-5β-cholan-24-oic acid (deoxycholic acid; DCA), and 3α-hydroxy-5β-cholan-24-oic acid (lithocholic acid; LCA) were obtained from Wako Junyaku (Osaka, Japan) and from Aldrich Chemical Company, Inc. (Milwaukee, WI, USA), respectively. Each bile acid or mixture of bile acids was converted to the methyl ester by incubation with diazomethane ether (1 ml) at room temperature for 10 minutes. After evaporation, the trimethylsilyl ethers were prepared by heating the residue with N-trimethylsilylimidazole (50 μl) (Tokyo Kasei, Tokyo, Japan) in acetonitrile at 38°C for 60 minutes. Excess reagents were evaporated in an N2 stream, and the residue was dissolved in acetone before GC-MS analysis with SIM. An internal standard (Nor-CDCA) was added to the samples of urine (1 ml) and lyophilised meconium (1 mg) or faeces (1 mg). The latter were dissolved in NaOH, 0.1 mol/l (2 ml). All samples were applied to a Bond-Elut C18 cartridge (6 ml) (Analytichem, Harbor City, CA, USA).
The cartridge was washed with 5 ml of water and the bile acid conjugates were eluted with 5 ml 90% ethanol. The solvent was evaporated and the residue was treated with 0.05 M sodium acetate buffer (pH 5.6). Each sample was added to 0.75% (v/v) 2-mercaptoethanol (200 μl) (Sigma), 0.05 M EDTA (200 μl) (Sigma), H2O (100 μl), choloylglycine hydrolase (3 units) (Sigma) and sulphatase (50 units) (Sigma) and incubated at 38°C for 16 hours. The sample was then applied to a Bond-Elut C18 cartridge. The cartridge was washed with 5 ml of water, and the bile acid conjugates were eluted with 5 ml 90% ethanol. Free bile acids were extracted with piperidinohydroxypropyl dextran gel (Shimadzu Corp, Kyoto, Japan), eluted with 5 ml of acetic acid in 90% ethanol (0.1 mol/l), and converted to the methyl ester-trimethylsilyl ethers (50 μl) for GC-MS analysis. After conversion of the methyl ester-trimethylsilyl ethers (50 μl) the sample was added to 50 μl of acetone. Next, 1 μl of the sample was injected into a splitless injection port of the GC-MS system. The mean recovery of unconjugated bile acids was 97.3% (range 78.6–119.8%); the lowest recovery rate was for 3-oxo-Δ4 bile acids (78.6%). After extracting bile acids from urine (using the Bond-Elut C18 cartridge), each sample was applied directly to a column of piperidinohydroxypropyl dextran gel. Steps in the stepwise elution of unconjugated bile acids and taurine-, glycine-, and sulphate-conjugated bile acids were separated by washing with the buffers, eluted with 5 ml of AcOH-AcOK (pH 6.5) in 90% ethanol (0.3 mol/l), of HCOOH in 90% ethanol (0.2 mol/l) and 1% (NH4)2CO3 in 70% ethanol, respectively. After adding the internal standard (Nor-CDCA), hydrolysis of the glycine- and taurine-conjugated fractions was carried out using choloylglycine hydrolase. Sulphate conjugated fractions were solvolysed and hydrolysed using sulphatase and choloylglycine hydrolase.
The respective fractions were converted first to unconjugated bile acids and then to methyl ester-trimethylsilyl ether derivatives, then analysed by GC-MS with SIM. GC-MS was performed using a Hitachi-M-80B instrument equipped with a data processing system (Hitachi M-0101; Hitachi Ltd, Tokyo, Japan). A Megabore DB-1 GC capillary column (25 m by 0.53 mm, internal diameter, glass coil; J and W Scientific, Folsom, CA, USA) was used. The temperature of the column oven was programmed to rise from 200 to 290°C at 18°C/minute; the temperature of both the injection port and the detector was 260°C. The flow rate of helium gas was 25 ml/minute. Ionisation energy was set at 70 eV, multiplier voltage at 1400 V, acceleration voltage at 3000 V, source slit at 400 μm, and collector slit at 300 μm. The GC-MS data for bile acid derivatives and related compounds are summarised in table 1. Figure 1 shows a chromatogram obtained using SIM of the characteristic fragments of the methyl ester-trimethylsilyl ether derivatives of a mixture of reference bile acids.

Table 1: GC-MS data for methyl ester-trimethylsilyl ether derivatives of bile acids
Table 2: Mean (SD) bile acids in urine of healthy infants
Table 3: Mean (SD) bile acids in faeces of healthy infants
Figure 1: Selected ion GC-MS chromatogram of the methyl ester-trimethylsilyl ether derivatives of a mixture of reference bile acids: 1 Nor-CDCA; 2 LCA; 3 DCA; 4 Δ5-3β-ol; 5 CDCA; 6 UDCA; 7 HCA; 8 CA; 9 Δ1-3-one; 10 3-oxo-Δ4 bile acids; 11 CA-1β-ol; 12 CDCA-1β-ol.

We obtained calibration curves for the determination of bile acids by plotting the peak area ratios corresponding to the monitored ions for each bile acid and the corresponding internal standard versus the amount of each bile acid. A linear relation (r > 0.976 to 0.997) was obtained over a range of 1.5 to 10 ng for each bile acid. Because Δ4-3-one and Δ4,6-3-one had the same retention time (22.2 minutes) and the same fragment ions (m/z 267 and 382), we could not tell one from the other.
We therefore mixed them in equal amounts and expressed the result of the analysis as 3-oxo-Δ4 bile acids. These chromatographic responses are appropriate for the assay of bile acids in urine, meconium, and faeces with the addition of adequate amounts of internal standard. Data are reported as mean (SD). One-way ANOVA was used to determine the significance of differences between groups. Comparisons of groups of categorical data were made using Student's t test. A P value of less than 0.05 was accepted as significant.

URINE (TABLE 2)

The highest mean total bile acids (TBA):creatinine ratio (19.5 (24.8) μmol/mmol of creatinine) was observed in the urine of 7-day-old infants. The mean value decreased gradually thereafter. The TBA:creatinine ratio at 3 and 7 days of age significantly exceeded that at 3 months of age (P<0.05) and the TBA:creatinine ratio at 3 and 7 days of age significantly exceeded that on the first day of life (P<0.05). Urinary 3-oxo-Δ4 bile acids (Δ4-3-one, Δ4,6-3-one) were detected in the urine at each test period. The percentage of this component in TBA in neonatal urine at birth was significantly higher than that at 3 and 7 days and at 1 and 3 months of age (P<0.01, P<0.05, P<0.05, and P<0.05, respectively). The mean percentage of 3-oxo-Δ4 bile acids in TBA did not differ significantly after 3 days of age. The concentration of this component in urine did not differ in each period. The mean Δ1-3-one:creatinine increased between days 1 and 3 of postnatal life. The mean Δ1-3-one:creatinine at 3 or 7 days of age significantly exceeded that at 2 or 3 months of age (all P<0.05). The mean percentage of Δ1-3-one to TBA excretion on the first day of life and at 3 days of age significantly exceeded that at 1, 2, or 3 months of age (P<0.001, P<0.001, and P<0.001, respectively). The mean percentage of Δ1-3-one to TBA excretion at 7 days of age significantly exceeded that at 1, 2, or 3 months of age (P<0.05, P<0.05, and P<0.05, respectively).
We analysed such polyhydroxylated bile acids as 1β-hydroxylated bile acids (CA-1β-ol, CDCA-1β-ol) and 6α-hydroxylated bile acid (HCA), and unsaturated bile acid (Δ5-3β-ol). The non-ketonic fetal bile acids (CA-1β-ol, CDCA-1β-ol, HCA, Δ5-3β-ol):creatinine ratio in urine remained increased from 3 days to 1 month, then decreased gradually thereafter (P<0.05 vs 3 months). The ratio of non-ketonic fetal bile acids:creatinine in urine on the first day of life was significantly lower than that at 3 days, 7 days, and 1 month of age (P<0.01, P<0.05, and P<0.01, respectively). The mean percentage of non-ketonic fetal bile acids in TBA was lowest on the first day of life (18.6 (5.9)%) and increased gradually thereafter. These non-ketonic fetal bile acids in urine exhibited a similar percentage at 7 days, 1 month, 2 months, and 3 months of age (7 days, 41.4 (16.3)%; 1 month, 44.7 (11.4)%; 2 months, 46.0 (10.0)%; 3 months, 43.9 (5.4)%). The urinary concentration of the usual bile acids (CA, CDCA, DCA, LCA) was lowest on the first day of life and increased gradually thereafter (P<0.05 vs 3 days, P<0.01 vs 1 month). The mean percentage of the usual bile acids in TBA on the first day of life and at 3 days of age differed from that at 1, 2, or 3 months of age (P<0.001, P<0.05, and P<0.01, respectively). The mean percentage of the usual bile acids in TBA at 7 days of age differed from that at 1 or 3 months of age (P<0.01 and P<0.05, respectively). The mean UDCA:creatinine ratio in urine on the first day of life and at 3 and 7 days of age was significantly lower than that at 1 month of age (P<0.01). The mean UDCA:creatinine ratio on the first day of life and at 3 days of age was significantly lower than that at 3 months of age (P<0.01). The mean UDCA:creatinine ratio on the first day of life was significantly lower than that at 2 months of age (P<0.01). The mean percentage of UDCA in TBA increased gradually after birth.
The mean percentage of UDCA in TBA at 3 days of age was significantly lower than that at 1, 2, and 3 months of age (P<0.01, P<0.01, and P<0.001, respectively). UDCA increased gradually after birth. The percentage of conjugated bile acids in the urine was analysed in the first week after birth. Of the 3-oxo-Δ4 bile acids, DCA, LCA, and UDCA, glycine-conjugated bile acids (73, 27.6, 55.8, and 27.1%, respectively) predominated over the other conjugated bile acids. Of the 1β- (CA-1β-ol, CDCA-1β-ol), 6α-hydroxylated (HCA) bile acid, Δ1-3-one and CA, taurine-conjugated bile acids (49.0, 34.4, 49.1, 45.6, and 41.5%, respectively) predominated over the other conjugated bile acids. The sulphate conjugated bile acids predominated over the other conjugated bile acids in Δ5-3β-ol and CDCA (42.0 and 47.5%, respectively) (fig 2).

Figure 2: Percentage of unconjugated bile acids and of taurine, glycine, and sulphate conjugated bile acids in the urine of neonates in the first week after birth.

MECONIUM AND FAECES (TABLE 3)

The highest value for TBA (mean (SD), 14.2 (6.9) μmol/g) in meconium was observed on the first day of life and the mean value decreased gradually thereafter. The TBA in meconium on the first day of life significantly exceeded that in faeces at 7 days and at 1 or 3 months of age (P<0.01, P<0.05, and P<0.01, respectively). The concentration of 3-oxo-Δ4 bile acids in meconium on the first day of life significantly exceeded that in faeces at 7 days and at 1 or 3 months of age (P<0.05, P<0.05, and P<0.01, respectively). The mean concentration of this component in faeces did not differ significantly after 7 days of age. The concentration of Δ1-3-one did not differ in each period. The mean percentage of Δ1-3-one to TBA excretion at birth was significantly lower than that at 7 days and at 1 or 3 months of age (P<0.05, P<0.01, and P<0.05, respectively).
Non-ketonic fetal bile acids were increased in meconium on the first day of life, but decreased gradually thereafter (P<0.01 vs 7 days). The mean percentage of non-ketonic fetal bile acids in TBA did not differ significantly after 7 days of age. These non-ketonic fetal bile acids remained at about the same level at 7 days, 1 and 3 months of age (7 days, 31.7 (11.6)%; 1 month, 39.7 (9.3)%; 3 months, 38.3 (11.1)%). The concentration of the usual bile acids in meconium was higher on the first day of life (P<0.05 vs 7 days) and remained at about the same level in faeces at 7 days and at 1 and 3 months of age. The mean percentage of the usual bile acids in TBA did not differ in each period. The mean percentage of UDCA in TBA increased gradually after birth. The mean percentage of UDCA in TBA in meconium on the first day of life was significantly lower than that in faeces at 7 days and 3 months (P<0.05 and P<0.05, respectively).

The serum concentration of TBA in healthy neonates significantly exceeds that in children over 1 year of age, a condition called physiological cholestasis.9 The urinary TBA:creatinine ratio was raised in the first week after birth, then decreased gradually. The high concentration of TBA in urine may be attributable to either an enhanced stimulation of the enterohepatic circulation of bile acids or an impaired hepatic clearance or excretion.10 The highest value for TBA in meconium was in neonates. This value is greatly influenced by events or conditions during pregnancy, such as the presence of biliary bile in the fetal duodenum or the ingestion of amniotic fluid by the fetus.10 11 Ketonic bile acids are usually considered to result from the bacterial oxidation of primary bile acids.12 In this study we detected ketonic bile acids early in life.
The intestine may be colonised by bacterial flora during the first week.13 A high concentration of 3-oxo-Δ4 bile acids in serum or urine has been associated with a deficiency in, or a reduction of, 5β-reductase activity, the enzyme which catalyses the conversion of 3-oxo-Δ4 C27 sterol intermediates to 3-oxo-5β products in the normal pathway for primary bile acid synthesis.2 In this study the urinary TBA:creatinine ratio was increased in the first week after birth. The mean percentage of urinary 3-oxo-Δ4 bile acids in TBA was significantly higher on the first day after birth than at any other age. This condition reflects the normal development of bile acid metabolism, including the initial immaturity of hepatic enzymes, such as 5β-reductase. After 3 months of age, the urinary TBA:creatinine ratio stabilises as a result of the maturation of liver function and of the enterohepatic circulation.10 At 3 months of age, we detected a small amount of 3-oxo-Δ4 bile acids in urine, probably associated with intestinal bacterial flora.13 Healthy infants show small amounts of 3-oxo-Δ4 bile acids in urine, because 3-oxo-Δ4 bile acids absorbed from the intestine are metabolised by hepatic 5β-reductase. Two possible sources of 3-oxo-Δ4 bile acids in urine are bacterial metabolism of cholic acid and side chain oxidation of intermediates in bile acid synthesis. The immaturity of hepatic enzymes such as 5β-reductase is particularly important in early life. Thus our findings of large amounts of unsaturated ketonic bile acids in urine support the results of previous studies.4 5 12 The concentration of 3-oxo-Δ4 bile acids in faeces on the first day of life was significantly higher than at any other age. A high mean percentage of 3-oxo-Δ4 bile acids in TBA was also present in the meconium of the neonates and in faeces at 7 days of age.
The 3-oxo-Δ4 bile acids found in meconium may originate from amniotic fluid ingested by the fetus during pregnancy.3 Large amounts of 3-oxo-Δ4-steroid intermediate may be produced in the intestine by conversion of primary bile acids into secondary bile acids during the neonatal period.14 3-oxo-Δ4 bile acids may be absorbed from the intestine less easily than other bile acids. Analysis of meconium and faeces is difficult because of disintegration of bile acids, such as 3-oxo-Δ4 bile acids, and/or a low recovery rate, when lyophilised meconium or faeces was dissolved in 0.1 mol/l NaOH in this study. Despite this, we detected large amounts of 3-oxo-Δ4 bile acids in the meconium and faeces. Using our method, Δ4,6-3-one is produced from Δ4-3-one during the preparation of the sample. We therefore suggest that 3-oxo-bile acids are derivatised to methoximes after the addition of an internal standard to the sample. Large amounts of 3-oxo-Δ1 bile acid were excreted in the urine on the first day of life, and again at 3 and 7 days of age. Elimination of water from the 1β-hydroxy structure may be the mechanism for formation of Δ1 bile acids.12 It seems likely that 3-oxo-Δ1 bile acids are more readily excreted in the form of hydroxylates. In urine, the percentage of 1β-hydroxylated bile acids in TBA gradually increased after a decrease in the amount of Δ1-3-one. The latter may be converted to CA-1β-ol. The usual bile acids and fetal bile acids showed the same pattern of excretion as we had found before.10 We detected small amounts of DCA and LCA in meconium and urine on the first day of life. These bile acids seemed to be mostly of maternal origin, entering the fetus via placental transfer.15 Interestingly, we detected UDCA in both urine and faeces. This finding may be related to the use of food supplements.4 Their formation is probably linked to mechanisms for bile salt excretion in infants with physiological cholestasis.
Bile acids have a pronounced hepatotoxic effect in the fetus unless they are metabolised into more polar compounds. CA and CDCA are transformed by polyhydroxylation into 1β-hydroxylated bile acids.16 The process of 6α-hydroxylation is probably also important in detoxification.17 The urinary data in the present study showed that most of the conjugated bile acids were glycine conjugates (40% of TBA, 73% of 3-oxo-Δ4 bile acids), whereas the fetal bile acids were predominantly taurine and sulphate conjugates. Taurine conjugates constituted 17.4 to 49.1% of each fetal bile acid, and sulphate conjugates constituted 31.1 to 42% of each fetal bile acid. Conjugation increases the polarity of the molecule, thereby facilitating its renal excretion and minimising the membrane damaging potential of the more hydrophobic unconjugated species.18 In conclusion, this study has shown that healthy newborn infants excrete large amounts of 3-oxo-Δ4 bile acids in urine and faeces. The presence of large amounts of this urinary component may indicate an immaturity of hepatic 5β-reductase activity in the first week after birth. After 1 month of age, this urinary component is thought to be derived from 3-oxo-Δ4-steroid intermediates in the intestine. The presence of large amounts of this component in meconium may be due to the ingestion of amniotic fluid by the fetus during pregnancy. This study was supported in part by Grant-in-Aid 07670924 for Scientific Research from the Ministry of Education, Science, Sports and Culture of Japan.

1 Clayton PT, Patel E, Lawson AM, Carruthers RA, Tanner MS, Strandvik B (1988) 3-Oxo-Δ4 bile acids in liver disease. Lancet i:1283–1284.
2 Setchell KDR, Suchy FJ, Welsh MB, Zimmer-Nechemias L, Heubi J, Balistreri WF (1988) Δ4-3-Oxosteroid 5β-reductase deficiency described in identical twins with neonatal hepatitis: a new inborn error in bile acid synthesis. J Clin Invest 82:2148–2157.
3 Nakagawa M, Setchell KDR (1990) Bile acid metabolism in early life: studies of amniotic fluid. J Lipid Res 31:1089–1098.
4 Wahlén E, Strandvik B (1993) Effects of different formula feeds on the developmental pattern of urinary bile acid excretion in infants. J Pediatr Gastroenterol Nutr 18:9–19.
5 Wikström S (1994) The urinary bile acid excretion in healthy premature and full-term infants during the neonatal period. Scand J Clin Lab Invest 54:1–10.
6 Leppik RA (1983) Improved synthesis of 3-keto, 4-ene-3-keto, and 4,6-diene-3-keto bile acids. Steroids 41:475–484.
7 Tohma M, Mahara R, Takeshita H, Kurosawa T, Ikegawa S (1986) Synthesis of the 1β-hydroxylated bile acids, unusual bile acids in human biological fluids. Chem Pharmacol Bull (Tokyo) 34:2890–2899.
8 Kurosawa T (1987) Determination of 3β, 12α-dihydroxy-5-cholen-24-oic acid and related bile acids in human serum by gas chromatography-mass spectrometry. J Chromatogr 421:9–19.
9 Balistreri WF, Heubi JE, Searcy JE, Levin RS (1981) Physiologic cholestasis: elevation of the primary serum bile acid concentration in normal infants. Gastroenterology 80:1037–1041.
10 Kimura A, Yamakawa R, Ushijima K, Fujisawa T, Kuriya N, Kato H (1994) Fetal bile acid metabolism during infancy: analysis of 1β-hydroxylated bile acids in urine, meconium and feces. Hepatology 20:819–824.
11 Inokuchi T (1992) Origin of bile acids in meconium: analysis of bile acids in biliary bile, amniotic fluid and meconium [In Japanese]. J Clin Pediatr 40:59–62.
12 Egestad B, Sjövall J (1989) Ketonic bile acids in urine of infants during the neonatal period. J Lipid Res 30:1847–1857.
13 Rotimi VD, Duerden BI (1981) The development of the bacterial flora in normal neonates. J Med Microbiol 14:51–62.
14 Coleman JP, White WB, Sjövall J, Hylemon PB (1987) Biosynthesis of a novel bile acid nucleotide and mechanism of 7α-dehydroxylation by an intestinal Eubacterium species. J Biol Chem 262:4701–4707.
15 Colombo C, Roda A, Roda E, Buscaglia M, Alberto Dell'Agnola C, Filippetti P (1985) Correlation between fetal and maternal serum bile acid concentration. Pediatr Res 19:227–231.
16 Yuge K, Ono E (1989) Unusual 1β-hydroxylated bile acids in children with a paucity of interlobular bile ducts. Clin Chim Acta 185:215–218.
17 Shoda J, Tanaka N, Osuga T, Matsuura K, Miyazaki H (1990) Altered bile acid metabolism in liver disease: concurrent occurrence of C-1 and C-6 hydroxylated bile acid metabolites and their preferential excretion into urine. J Lipid Res 31:249–259.
18 Scholmerich J, Becher MS, Schmidt K, Schubert R, Kremer B, Feldhaus S (1984) Influence of hydroxylation and conjugation of bile salts on their membrane-damaging properties: studies on isolated hepatocytes and lipid membrane vesicles. Hepatology 4:661–666.
Avengers 4 Trailer: Ever since Thanos snapped his fingers with the Infinity Gauntlet on and killed half the living things in the universe, there has been chaos in the minds of fans, who can't stop speculating about how the sequel is going to be. Now the makers have released the much-awaited trailer of Avengers 4. Yeah, all the rumblings seem to be true. A new Captain Marvel trailer is coming Monday night, and hold on to your seats, because on Wednesday more Marvel goodness should be hitting: the Avengers 4 trailer on Wednesday morning. We're so close. Just a little longer. Meanwhile, Captain Marvel's second trailer was released today. Captain Marvel will hit the screens in March next year, while the untitled Avengers movie (Avengers 4) will be released in May.
As a Customer Care Representative II, you will provide a high level of customer service to our Business-to-Business (B2B) Customers. You will process and manage routine customer inquiries and orders via email and phone following established protocols. You will maintain business relationships with internal functions and an assigned pool of Customers. You will escalate customer issues appropriately within the organization and maintain responsibility for follow-through to ensure satisfactory resolution. A highly motivated team player who strives to consistently provide a superior customer experience is required in this position. Requires the ability to navigate computerized systems for tracking, information gathering, and/or troubleshooting. Requires proficient knowledge of the organization, products, and services. The Company is an Equal Opportunity Employer. Offers of employment are contingent upon satisfactory completion of a reference check, background check, drug/alcohol test, and documented proof of work authorization. In addition, some roles require a pre-employment medical examination to determine your ability to perform the essential duties of the job.
Q: how to write a function that calculates the age category

I need to write a function that calculates the age category. This is the function:

def age_category(dob_years):
    if dob_years < 0 or pd.isna(dob_years):
        return 'NA'
    elif dob_years < 20:
        return '10-19'
    elif dob_years < 30:
        return '20-29'
    elif dob_years < 40:
        return '30-39'
    elif dob_years < 50:
        return '40-49'
    elif dob_years < 60:
        return '50-59'
    elif dob_years < 70:
        return '60-69'
    else:
        return '70+'

I checked the function and it works, but when I try to create a new column:

credit_scoring['age_group'] = credit_scoring.apply(age_category, axis=1)

I get this error:

TypeError: '<' not supported between instances of 'str' and 'int'

Actually, I am new to Python and I don't know what to do. Please help: what is wrong with the code? Thanks for your time :)

A:

def age_category(dob_years):
    if not isinstance(dob_years, (float, int)):
        try:
            dob_years = int(dob_years)
        except ValueError:
            return 'NA'
    if dob_years < 0:
        return 'NA'
    # min(70, ...) clamps ages of 80 and over into the '70+' bucket
    return {
        0: '0-9',
        10: '10-19',
        20: '20-29',
        30: '30-39',
        40: '40-49',
        50: '50-59',
        60: '60-69',
        70: '70+',
    }[min(70, 10 * int(dob_years // 10))]

A: You can achieve your goal more easily using pd.cut. First of all, the sample data:

>>> df = pd.DataFrame([0, 18, -3, 73, 17, 88, 60, 1, 20, 14], columns=["age"])
>>> df
   age
0    0
1   18
2   -3
3   73
4   17
5   88
6   60
7    1
8   20
9   14

Then you need to prepare the bins and their labels:

>>> from math import inf
>>> bins = list(range(0, 80, 10))
>>> bins.append(inf)
>>> bins
[0, 10, 20, 30, 40, 50, 60, 70, inf]
>>> labels = [f"{i}-{i + 9}" for i in bins[:-2]]
>>> labels.append(f"{bins[-2]}+")
>>> labels
['0-9', '10-19', '20-29', '30-39', '40-49', '50-59', '60-69', '70+']

Once you have them, use pd.cut with right=False so each bin is closed on the left and a value like 20 falls in '20-29', matching your example:

>>> df["age group"] = pd.cut(df["age"], bins=bins, labels=labels, right=False)
>>> df
   age age group
0    0       0-9
1   18     10-19
2   -3       NaN
3   73       70+
4   17     10-19
5   88       70+
6   60     60-69
7    1       0-9
8   20     20-29
9   14     10-19
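The TypeError in the question most likely comes from `DataFrame.apply(func, axis=1)`, which passes each *row* (a Series that may contain strings from other columns) to the function, so `dob_years < 20` ends up comparing mixed types. Applying the function to the single column passes plain scalars instead. A minimal, pandas-free sketch of the same bucketing logic, with `math.isnan` standing in for `pd.isna` and the column name `dob_years` assumed from the question's parameter name:

```python
import math

def age_category(dob_years):
    # math.isnan stands in for pd.isna in this pandas-free sketch
    if isinstance(dob_years, float) and math.isnan(dob_years):
        return 'NA'
    if dob_years < 0:
        return 'NA'
    if dob_years >= 70:
        return '70+'
    # max(10, ...) mirrors the question's choice of lumping 0-19 into '10-19'
    decade = max(10, 10 * (int(dob_years) // 10))
    return f'{decade}-{decade + 9}'

# With a real DataFrame the fix would be to apply to the column, not the frame:
#   credit_scoring['age_group'] = credit_scoring['dob_years'].apply(age_category)
# (the column name 'dob_years' is assumed from the question's function.)

print(age_category(18))            # 10-19
print(age_category(42))            # 40-49
print(age_category(85))            # 70+
print(age_category(-3))            # NA
print(age_category(float('nan')))  # NA
```

The key point is the scalar-vs-row distinction: `Series.apply` calls the function once per value, which is exactly what a per-value categoriser expects.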
{ "redpajama_set_name": "RedPajamaStackExchange" }
7,966
Quality comes first, and "customer supreme" is our guideline for offering top service to our customers. We are striving to be among the top exporters in our industry and to meet customers' growing demand for our Next Coin Purse and other coin purses. Good credit assures cooperation, and we keep our motto in mind: customers first. We support our customers with premium-quality products and a high level of service. As a specialist manufacturer in this sector, we have gained rich practical experience in producing and managing the Next Coin Purse and related coin purse lines. We will supply better products with diversified designs and expert services, and we sincerely welcome friends from all over the world to visit our company and cooperate with us on the basis of long-term and mutual benefits.
{ "redpajama_set_name": "RedPajamaC4" }
5,931
Prior to the Tax Cuts and Jobs Act (TCJA), taxpayers were able to deduct fifty percent of the cost of meal expenses and entertainment expenses provided certain requirements were met. As a result of the TCJA, entertainment expenses are no longer deductible. The issue presented for taxpayers has now become whether the provision of food and beverages might constitute entertainment.

For most taxpayers, the 2017 Tax Cuts and Jobs Act (TCJA) should provide a welcome reduction to their tax bills. Taxpayers should see the act's immediate impact on their tax bills as they work to finalize and file their 2018 tax returns.

Attorneys Robert T. Smith and Blake D. Lewis presented at the Arkansas Society of CPAs' 57th Annual Arkansas Federal Tax Institute. Additionally, Blake served as co-chair of the two-day event. Robert spoke about reasonable compensation, and Blake's presentation focused on the centralized partnership audit regime.
{ "redpajama_set_name": "RedPajamaC4" }
1,578
After Tom Clancy's passing this week, fans of his biggest character, Jack Ryan, have been introduced to a new phase of the movie incarnations of his geopolitical thriller novels with this weekend's release of the first trailer for Jack Ryan: Shadow Recruit. Last December we at borg.com listed Jack Ryan as one of the ten characters to watch in 2013, and we included Jack Ryan, the movie, as one of the 24 films we predicted would be worth seeing in 2013. Since last year's announcement of Chris Pine taking on the lead, the title was changed to add the subtitle Shadow Recruit, replacing the prior subtitle Shadow One (we think Hollywood really needs to work on their subtitles). The role of Jack Ryan was, of course, first played by Alec Baldwin in The Hunt for Red October in 1990, followed by Harrison Ford in Patriot Games in 1992 and Clear and Present Danger in 1994. Ben Affleck then played a younger Jack in the 2002 prequel film The Sum of All Fears. All four of these movies were based on bestselling Clancy novels, The Hunt for Red October often being listed as one of the best thrillers of all time. Likewise, The Hunt for Red October is one of the best, and most exciting, movies of all time, with Alec Baldwin's performance still the standard for future Ryans to be measured against.
{ "redpajama_set_name": "RedPajamaC4" }
4,087
Irish Montrealers push for memorial to 6,000 immigrants who died of typhus

Many of the people taking the Victoria Bridge in or out of Montreal may not realize they're driving over a mass graveyard.

Photo caption: Francis Braddeley of the Erin Sports Association sings the Irish national anthem during the annual walk to the Black Rock at the foot of the Victoria Bridge in Montreal in 2016. Peter McCabe / Montreal Gazette

A 10-foot-tall engraved stone, placed on a median between the lanes of traffic, announces that the site is the resting place of some 6,000 Irish immigrants who died of typhus in "fever sheds" along the riverbank in 1847-48 after fleeing famine in overcrowded ships. The stone, stained black from exhaust fumes, sits in a little-visited industrial zone near the foot of the bridge, and some members of Montreal's Irish community say the city needs to do a better job of honouring the chapter of Canadian history it represents. "This is the largest single burial site of the Great Hunger in the world outside of Ireland itself," said Victor Boyle, one of the directors of the Montreal Irish Memorial Park Foundation. "It's also the first memorial to that event outside of Ireland." But he says that while cities like Toronto have prominent memorials to their Irish ship fever victims, Montreal's much-larger number of dead are going unrecognized. On Sunday, about 100 members of the Irish community took part in an annual walk to the site. The ceremony, led by the Ancient Order of Hibernians, has taken place in some form or other since 1865 — six years after the stone was erected by mostly-Irish Victoria bridge construction workers who stumbled across the graves.
Laying the monumental stone, marking the graves of 6,000 immigrants near Victoria Bridge: The Irish immigrants who settled in Griffintown in the 19th century were a source of cheap labour, so much so that many of them were hired to work on large-scale construction projects such as the digging of the Lachine Canal, which opened in 1825, and the construction of the Victoria Bridge, inaugurated in 1860. Still, life in Griffintown was difficult as a result of frequent floods, major fires, unsanitary housing conditions and pollution from the surrounding factories. Moreover, the quarter was struck by several epidemics. In 1846-1847, 6,000 British subjects, for the most part Irish immigrants living in Griffintown, died of typhus. In 1860, workers on the Victoria Bridge erected a monument in their memory. 1860 – Ink on paper – Wood engraving

Now, Boyle's foundation is trying to get permission to transform a parking lot adjacent to the site into a memorial park in time for Montreal's 375th anniversary in 2017. Boyle says the park would honour not only the Irish victims but also the Montrealers who risked their health and safety to help them, ranging from clergy members to British soldiers to Montreal's mayor, John Easton Mills, who contracted typhus and died in 1847 after visiting the fever sheds. He also wants to salute the many Québécois families who adopted Irish orphans into their families. He says Montreal's mayor, Denis Coderre, has met with the park foundation on two occasions and expressed support for the project. The group will also meet in the coming days with the federal Crown corporation that oversees the vacant lot they're hoping to transform. Boyle said the group won't rest until there's a "meaningful" homage in place. "All these decades later, and we're still having ceremonies here," he said. "That shows we're never going to forget."
By Morgan Lowrie
{ "redpajama_set_name": "RedPajamaCommonCrawl" }
4,091
Posted on February 5, 2019 by Robert Sullwold

Tonight, Council will hold a special meeting at which the four finalists vying to be selected to develop the "West Midway" site at Alameda Point will make presentations. Slideshows of proposed designs are banned. Instead, expect to hear the presenters – and those on Council – blather on about how any development on the site should be "vibrant," "robust," "sustainable," etc. But what the Merry-Go-Round will be listening for – and what we were looking for as we read the written responses to the request for qualifications – is evidence of the developer's ability to deliver, on schedule and on budget, a project whose economics are challenging, to say the least. The economics are daunting primarily because the developer will be required to construct not just the backbone infrastructure for the 22.8 acres on which it will build its own 291-unit project but all of the backbone infrastructure for the entire 34-acre area south of Midway Avenue between Pan Am Way and Main Street. This is necessary to make it feasible for a non-profit developer to build a 267-unit supportive and affordable housing facility on the 9.7 acres next to the West Midway site. Constructing this much infrastructure will not be easy – or cheap. As the RFQ disclosed, all of the utilities within the 32 acres "are old and deteriorated and will need to be replaced." At the City's request, a civil engineering firm prepared a "preliminary cost estimate" for the area-wide backbone infrastructure work, which was attached to the RFQ. The total tab came to $56.9 million. But the RFQ went on to caution that this estimate was "for informational purposes only" and "not a guarantee." In addition, the City has imposed a couple of other requirements that adversely affect the economics for a developer, one on the expense side and the other on the revenue side of the ledger.
First, the developer will be required either to adopt the City's "Project Stabilization Agreement" for public-works contracts or enter into its own project-labor agreement with the construction trades unions. By whatever name, this type of agreement makes it difficult to employ non-union workers on a project, which results in an increase in labor costs of between 5 and 15 percent (to use the range cited by former City Manager John Russo). Second, the RFQ specified that the 291 new housing units must include 31 "moderate-income" units (defined as housing affordable by households earning between 80 and 120 percent of area median income) and a minimum of 26 "workforce" units (defined as housing affordable by households earning between 120 and 180 percent of AMI). The former requirement is derived from the settlement agreement between the City and Renewed Hope governing all new residential development at Alameda Point. The latter arises from the "specific plan" for the Main Street Neighborhood (of which the West Midway site is a part) adopted by Council in March 2017. Taken together, these requirements will ensure that the West Midway project will not become an exclusive enclave for the well-off. But they also mean that the developer will not be able to charge top dollar for 57 of the 291 units, thereby diminishing project revenue and decreasing market value. From our perspective, these economic challenges make it especially important for the decision-makers to scrutinize, very carefully, the track records of the four finalists. Among other things, we would want to know not just how many multi-family residential projects the developer has built, but how many of these projects included below-market-rate housing (and how much) or were saddled with project labor agreements. Most importantly, we'd want to know whether the developer ever has been responsible for completely replacing the backbone infrastructure in the area in which its project was located. 
If so, was the work done on schedule and on budget? Of the four finalists, the one about which Alamedans already know the most is Alameda Point Partners, the developer for the 800-unit Site A project for which ground was broken last May and infrastructure construction now is under way. For better or worse, this means that APP's track record is right out front for all to examine. Both the configuration and the timing of the Site A project have changed since APP got the deal. For example, the original proposal called for 72 moderate-income units to be interspersed among the market-rate units. But in July 2017, APP told Council that this plan was no longer "feasible," and it requested an amendment to the DDA allowing it to move 70 of the 72 moderate-income units out of the market-rate buildings and into a separate building, not part of the original scheme, that would be reserved for housing for teachers and other employees of the Alameda Unified School District. Council approved the amendment, but it is worth asking what this change signifies about APP's capacity to deliver the moderate-income units in the West Midway project. Moreover, under the original DDA, Phase 1 of Site A was scheduled to close in December 2016, at which time the City would transfer title to the land to APP. The backbone infrastructure work was supposed to begin 30 days after closing, but APP asked for, and received, three extensions of the closing date, the final one to April 9, 2018. For each extension, APP offered good reasons – or at least reasons deemed satisfactory by Council – but, again, it is worth asking how APP intends to prevent a similar delay in commencement of the infrastructure work if it gets the West Midway contract. We don't mean to pick on APP, and maybe the events we've identified could be regarded as a "learning experience" that cuts in APP's favor. 
In any case, only one of the other finalists appears to have performed infrastructure work akin to that which will be required of the West Midway developer: Catellus, which developed the Bayport and Alameda Landing projects on the site of the former Fleet Industrial Supply Center on the northern waterfront in Alameda and, according to its response to the RFQ, is installing infrastructure at Mission Bay in San Francisco that includes new stormwater and sewer systems. (Another finalist, Brookfield Residential, cites in its response to the RFQ a project involving the relocation of a U.S. Army Reserve base in Dublin and construction of a "master-planned community" on the vacated site, but the scope of the infrastructure work is unclear.) We'd like to hear more on this subject from both of these firms. Given the economic challenges facing the developer of the West Midway site, we think it is also vital to pin down the finalists on their ability to finance the project from pre-development through construction to certificate of occupancy. The last thing the City – and advocates for supportive and affordable housing – need is for Council to approve a developer based on a presentation that pushes all the right buttons – and then see nothing happen on the site because the chosen firm can't come up with the dough to start, much less finish, the job. One Del Monte warehouse project is quite enough. As it happens, all of the finalists for the West Midway project have solid financials: APP is backed by Trammell Crow, which, according to APP's response to the RFQ, has assets under management exceeding $15 billion across the United States. Catellus is owned by TPG Capital, which has more than $70 billion of assets under management around the world. The Jamestown/Cypress Equity Investments partnership is made up of Jamestown, which has more than $10 billion of assets under management, and Cypress Equity Investments, which has amassed a real-estate portfolio worth $5 billion.
Brookfield Residential is – relatively speaking – the piker in the group. Its total assets at the end of 2017 amounted to a mere $4.2 billion. Financial credentials like these may earn the four finalists the enmity of the City of Alameda Democratic Club ideologues eager to rail against "global conglomerates," but they show that each of the developers has the resources to put its own cash into the West Midway project. It would be useful to know, however, just how much of an equity stake each finalist intends to take, since the more a developer is willing to put at risk, the less likely it is to walk away. (We note that, by the time ground was broken at Site A, APP had sunk $15 million in equity funds into the project.) But equity is only one element of project financing, and each finalist also needs to be asked: Where's the rest of the money going to come from? We'd be particularly wary of any developer who proposes to rely on public, rather than private, financing for the West Midway project. One way to pay for infrastructure is to create a community facilities district, which then issues bonds to be repaid by taxes assessed against subsequent property owners. The problem is that this approach, by increasing property taxes on new housing units and commercial buildings, makes them more expensive – and thus less desirable – to potential buyers or renters, which in turn may compromise the marketability of the project. Indeed, the RFQ cautions that the City will authorize formation of a CFD designed to pay infrastructure costs only if it "determines that the resulting total annual tax burden on the property . . . is not unduly burdensome." Of the four finalists, Catellus appears to have depended most heavily on public sources of funds. Its response to the RFQ stated that, for both Alameda Landing and Mission Bay, infrastructure costs were "reimbursed through public financing."
For another project located in Mueller, Texas, a special district similar to a CFD was an "essential tool in funding the development." By contrast, the only CFD created – so far – for Site A is the one established to shoulder the annual costs of the transportation demand management program and related items. But private financing is not always easy to get. Indeed, when APP sought extensions of the Site A closing date, it usually cited the need for additional time to "secure" and "finalize" funding commitments from third parties. What funds are available from what sources, and at what price, for the West Midway project depends on a host of factors affecting real-estate investments in particular and the economy in general. In their responses to the RFQ, none of the developers delved into the financing for any of their prior projects, but we'd be interested in getting more details about where the developer got the money – and whether it thinks it can go back to the same well again. It surely would be nice – although highly improbable – if one of the finalists was able to offer the same assurance Carmel Partners did when it was pitching the North Housing project to Council: "Our money is already in place!" We have to confess that we are not confident that all, or perhaps any, of our questions will get asked tonight. The staff report directs Council's attention to three "important considerations" in selection of a developer: "diversity of developer base," "commercial vision," and "job creation/training." None of these has any bearing on the developer's ability to deliver a project on schedule and on budget. Moreover, at least a couple of our Council members probably consider the developer's financial resources and management skills less important than its willingness to build a LEED-certified transit-oriented development with an ample supply of bike racks. But if the practical issues are ignored entirely, the City will be proceeding at its peril.
Site A: 2017-01-17 staff report re amendment to Site A DDA; 2018-03-06 staff report re Site A DDA amendment
Main St. Neighborhood: 2017-10-24 Ex. 1 to staff report to P.B. – Main Street Neighborhood Specific Plan
West Midway project: 2019-02-05 staff report; 2019-02-05 Ex. 1 to staff report – Request for Qualifications; 2019-02-05 Ex. 4 to staff report – Statement of Qualifications

This entry was posted in Alameda Point, City Hall, Development, Housing and tagged Alameda Point Partners, Brookfield Residential, Catellus, Cypress Equity Investments, Infrastructure, Jamestown, Main St. Neighborhood, Trammell Crow Residential, West Midway.

4 Responses to Battle of Midway

NYBORN2012 says: Solid observations and information here. Council members are about gaining popularity and attention from Sacramento politicians, thus your questions are far too relevant for the council to ask. If Frank were still on Council, he would ask them. Maybe Tony will this time? Catellus has supported the Alameda community in many ways over the past few years and has invested in many programs and non-profit organizations here. All things considered, I'd lean in their favor.

Steve Gerstle says: You raise some interesting questions, Robert; however, I disagree with this conclusion: "But if the practical issues are ignored entirely, the City will be proceeding at its peril." What would be the greater peril for their political careers, approving a project that meets political standards that does not get built, or approving a project that does not meet political standards but does get built? To rephrase Calvin Coolidge, "The business of America is politics."

Steve, I don't think he means the Council, but the City itself. The Del Monte has been tied up for decades by developers who cannot seem to develop it.
Mike McMahon says: The last iteration of developing the Del Monte project has so many pre-conditions that the developer ran out of money to develop the actual project, which I believe is the point of this current post.
{ "redpajama_set_name": "RedPajamaCommonCrawl" }
1,254
GARDEN GROVE, Calif.– C3 International, a biopharmaceutical company that has played a leadership role in the emerging cannabinoid therapeutics health sector, officially announced the launch of its flagship Idrasil pain management pill. Idrasil is the first standardized form of medical cannabis. It offers all of the medicinal analgesic and therapeutic benefits of cannabis, but is a superior alternative to opiates and life-threatening narcotics because physicians and caregivers can provide patients with safe, non-addictive, measurable dosages. Conditions treated with Idrasil include, but are not limited to, AIDS; anorexia; arthritis; autism; anxiety/depression; cancer; chronic pain; glaucoma; migraines; persistent muscle spasms; Parkinson's; seizures; severe nausea; Tourette's Syndrome, and any other chronic or a persistent medical symptoms that substantially limit major life activities as defined in the Americans with Disabilities Act of 1990. Idrasil also offers relief for patients struggling with clinical endocannabinoid deficiencies (CECD). CECD is a serious disorder that has a proven link to various disorders such as migraines, Fibromyalgia, Irritable Bowel Syndrome (IBS), and other treatment-resistant syndromes. Patients that want natural relief from CECD-related ailments, seek to reduce or stop existing pharmaceutical or over-the-counter dependencies, and want to curb the negative side effects or addictions of opiates and narcotics can benefit from Idrasil. Idrasil consists of a proprietary blend of concentrated cannabis extract that is 100% natural and organic. C3's proprietary process isolates all of the cannabinoids from the cloned cannabis plant, resulting in pure natural extraction in pill form to eliminate the unwanted euphoria and social risks associated with smoking marijuana and unpredictable dosages of edible confections. Idrasil is a natural product that looks like any pill on the market. 
Additional ingredients include calcium carbonate, magnesium stearate, dicalcium phosphate, microcrystalline cellulose and silicon dioxide, which are common tableting additives in both foods and pharmaceuticals, and are gluten free.
{ "redpajama_set_name": "RedPajamaC4" }
9,701
When you ban somebody (also known as "blocking"), you will not see their posts or comments anywhere, and they will not see your posts anywhere. They will still see your comments on other people's posts (but you won't see their replies). The list of people you've banned is available on the Notifications page.

If you're a group admin and you ban somebody, you will still see their posts and comments in that group. If you ban a group admin, they will still see your posts and comments in their group.

When you ban somebody, they are quietly unsubscribed from your feed, and you are unsubscribed from theirs. They cannot subscribe back and cannot request a subscription to your feed. On your "Settings" page, you can specify a YouTube video or a link to an image file, and it will be shown to everyone you have banned.

When you hide somebody, you will not see their posts in your home feed or on aggregate pages such as "Best of" and "Everything", but you will still see their comments. The list of people you've hidden is available on the Notifications page.

When you hide a post, it is not visible anywhere except via a direct link to the post. You can see the list of posts you have hidden on the "Hidden entries" page.
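The ban rules above can be summarized as two small predicates. This is only a toy model of the behavior described in the text, not the platform's actual code; all names are my own, and the group-admin exceptions are left out:

```python
def can_see_post(viewer, author, bans):
    """bans is a set of (who, whom) pairs, meaning `who` banned `whom`.
    Per the rules above, a ban hides posts in both directions."""
    return (viewer, author) not in bans and (author, viewer) not in bans

def can_see_comment(viewer, commenter, bans):
    """You never see comments by someone you banned, but a person you
    banned still sees your comments on other people's posts."""
    return (viewer, commenter) not in bans

bans = {("alice", "bob")}  # alice banned bob
print(can_see_post("alice", "bob", bans))     # False: alice doesn't see bob's posts
print(can_see_post("bob", "alice", bans))     # False: bob doesn't see alice's posts
print(can_see_comment("bob", "alice", bans))  # True: bob sees alice's comments elsewhere
print(can_see_comment("alice", "bob", bans))  # False: alice doesn't see bob's replies
```

Note the asymmetry: post visibility is cut in both directions, while comment visibility depends only on whether the viewer banned the commenter.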
{ "redpajama_set_name": "RedPajamaC4" }
8,564
Starting today, Dec. 4th and until Saturday, Dec. 8th, the first installment in the New Age Lamians trilogy by amazing author Didi Oviatt will be on promotion … for free on Amazon. But before I entice you with the blurb and an excerpt, let me extend Didi a (very late) congratulations on her (fairly) new contract with Creativia Publishers. Congratulations, Didi, it's very well deserved!

Dr. Brooklyn abruptly stopped at a solid metal door and turned to face me. "Now Jack. I know you're scared but I assure you everything is for the greater good and will turn out just fine in the long run. I'm going to need you to do everything I tell you to without question. You'll be restrained and it's for your own good. Please do not fight it, Jack. That could be a very large and painful mistake on your part." I stared at her wide eyed and cautious. She leaned in and gave me a quick unforgettable kiss on the cheek before shoving the door open. My eyes wandered from one end of the room to the other. I'm not sure exactly what I expected, but I do know that this was not it. The center of the room contained a large see through tank of some clear liquid. Surrounding it was screen after screen of some technology that I was yet to be familiarized with. With a person standing next to each machine. I instantly recognized the fact that they'd been waiting for me. Dr. Brooklyn pointed around the room at all of the people. As she spoke, two men walked up to me and started stripping me of the comfort of my gray sweatsuit. My muscles clenched and I flinched in embarrassment. No one besides my father had seen me naked, and even when I changed in front of him he would look away. Not since I was a baby being dressed by my mother had I been in the buff around a woman. She passed away when I was 5, God rest her soul. I looked down in shame remembering the warning not to fight. I wanted more than anything to grab my clothes and run, but not before knocking a few of these Company wimps to the ground.
I held in the notion and wearily looked back up at Dr. Brooklyn. She explained to me that I would be lowered into the liquid. She told me to breath just the same once I was emerged as I do on a regular basis. As she gave me instructions, the same two men that had stripped me of my clothes, fastened tight cuffs around my wrists, ankles, and waist. The cuffs were secured to a couple of bars to my sides and above my head by thin yet obviously indestructible cables. Before I knew it I was being hoisted up into the air. The metal bars were strong, holding me in place with my feet shoulder width apart and my arms extending out to my sides. The doctor no longer made any explanations. She turned her back to me and started spitting out orders to the rest of the room. I watched from above as the distance between myself and the floor exceeded. I was lowered into the tank feet first. The liquid was a thick, warm gel. I wiggled my toes feeling the slime slip and then mold into place between them. By the time I was waist deep I could no longer move my feet. I became motionless and without feeling as I descended to my fate. Further and further down I was engulfed in the shocking warm fluid. Past my stomach and to my neck. I took in shallow breaths as the gel touched my chin. By leaning my head back I tried to keep my mouth out for as long as I could. The wet warmth engulfed the remainder of my flesh. I held my breath and opened my eyes. Everything was a blur, I could see the silhouette of each body in the room yet I could not tell apart who was man and who was woman. I peered out of the tank in fear searching for Dr. Brooklyn, but with no success. I remembered her words to breath the liquid as if I was breathing regular air. I was now completely motionless from head to toe. I couldn't feel a thing aside from the weakening pound of my heart. The lack of oxygen was dulling my mind as I continued to hold my breath. After a few more seconds I could no longer take it. 
I felt as if my chest was going to explode leaving my body dead in this disgusting paralyzing gel. I couldn't take it any longer. Unable to part my lips I let the air out through my nostrils and took in a giant breath. Rather than air filling my lungs I felt the slime slowly making its way down my nasal cavity and into my chest. Strangely I felt as if I had been breathing it all my life. Oxygen was flowing through my body and into my brain as if I was breathing the natural fresh air I had taken for granted on an otherwise normal day. No sooner than my body began to relax the pain struck. The first needles were injected into the backs of my knees. I could feel each one, long and sharp as they pushed through my skin and into the joints of my legs. I tried with all my strength to move away from the pain but I was rendered motionless. The molded gel held me in place as the same pain was now being shoved into the insides of my elbows. Followed by my neck and lower back. I tried to move away and escape the torture, but I could not. The mold held me in place as all six needles made their way further and further painfully into my flesh. The excruciating pain of the serum was then pushed into my body though these burning needle tips. Thick like hardening sap on a pine tree. I felt it burn its way into my veins. Engulfing my body in shock and fire. Every molecule hurt as the fibers attached themselves throughout me. It spread like a slow plague infecting my body with smoldering hot lava. Though motionless on the outside I was screaming and thrashing on the inside. With every beat of my heart the burning pressure was pushed from one end of my body to the other. Reaching my brain last. I felt as if my head was exploding into a thousand pieces. Like it was being ripped apart by rabid wolves. I screamed at the top of my lungs yet was heard by no one, as the sound was masked by the disgustingly thick gel I was being held in. 
My mind darkened as it clouded over by a deep gray followed by an ultimate darkness. Welcoming the illumination, my mind slipped into a much needed unconscious state.

Sounds wonderful, Jina, and congrats to Didi!

Thank you so much Jina! And, congrats to you with the recent contract too! What an exciting year for so many of us. May 2019 continue to offer as wonderful of all things bookish for both of us! Cheers, my friend!
{ "redpajama_set_name": "RedPajamaC4" }
7,532
J.J. Abrams Returns To Write And Direct 'Star Wars: Episode IX'

09/12/2017 11:11 am ET Updated Sep 12, 2017

The return of the J.J.

By Bill Bradley, Entertainment Editor, HuffPost Movies

UPDATE: 4:00 p.m. ET — In addition to the director news, "Star Wars" announced that the premiere date for "Episode IX" will be Dec. 20, 2019.

"Star Wars: Episode IX is scheduled for release on December 20, 2019. pic.twitter.com/rDBqmuHX89" — Star Wars (@starwars) September 12, 2017

The Force was with J.J. Abrams when he launched the new set of "Star Wars" films with "The Force Awakens," so now Disney is bringing him back. As Deadline reported on Tuesday, and according to a press release on StarWars.com, Abrams will return to write and direct "Star Wars: Episode IX." (The statement was shared in a post by Star Wars (@starwars) on Sep 12, 2017 at 7:28am PDT.)

After Disney unexpectedly parted ways with former "Episode IX" director Colin Trevorrow earlier this month, rumors surfaced that Rian Johnson, who is directing "Star Wars: The Last Jedi," would take over. But Deadline reports that Johnson decided not to take the offer to direct. On Abrams' hiring, Lucasfilm President Kathleen Kennedy said, "With 'The Force Awakens,' J.J. delivered everything we could have possibly hoped for, and I am so excited that he is coming back to close out this trilogy." After what we saw in "Force Awakens," we're pretty excited about it, too. We just hope they call it "The Return of the J.J."
Aalst-Erembodegem, November 7, 2018 - Ontex Group NV (Euronext Brussels: ONTEX; 'Ontex,' 'the Group' or 'the Company') today announced its results for the nine-month period ended September 30, 2018.

Hygiene markets remained broadly stable compared to a year ago. In the third quarter, babycare category value decreased at a similar rate to the first half, though with a modest improvement in market pricing while volumes were lower. The proportion of retailer brands in our markets was at similar levels versus last year, highlighting their inherent attractiveness to retail customers as well as consumers. On a year-on-year basis, most commodity raw material indices were higher, in some cases materially, in Q3 2018. Based on current information this is expected to continue in Q4 2018. The majority of currencies we do business in were lower versus the euro during Q3 2018, continuing the trend seen in the first half of the year.

Based on our year-to-date revenue performance, we remain on track to record low single-digit LFL growth for the full year 2018 in broadly flat hygiene markets. Actions to enhance price/mix and cost saving initiatives will continue and further value-creating actions will be considered as required, in light of sustained input cost pressures and FX volatility. We expect Q4 2018 Adjusted EBITDA margin to be broadly similar to that in 9M 2018.

Babycare category revenue saw a slight decrease of 0.5% in Q3 2018. Baby pants revenue accelerated further in Q3, up double digits supported by growth in all main markets. Baby diapers had competitive growth in several of our markets outside Brazil. Revenue in the Femcare category in Q3 2018 was 0.8% lower, with growth of organic cotton tampon sales remaining high. In Q3 2018, Adult Inco category revenue was up 3.3%, with sales in retail channels 8% higher and stable in institutional channels. Sales of Adult Pants and Light Inco products continued to drive our revenue in this category.
Against a high prior year comparable, Q3 2018 Mature Markets Retail Division revenue was 3.9% lower. Following strong, volume-led revenue growth in 2017 and a continuous and material rise in Group input costs, we increased our efforts to improve the contribution from price and mix in 2018. Year-on-year pricing was positive in Q3, reflecting the broad-based price increases designed to partly offset significant Group-wide input cost growth, particularly in raw materials and energy. As expected, sales volumes were negatively affected by these price increases, in an environment where the international branded diaper competitor has maintained an aggressive pricing policy and promotional activity. Product mix was positively impacted by ongoing volume gains in Baby Pants and Adult Incontinence, as new production capacity to support our customers continued to ramp up. We also continued rolling out our new channel core diapers to leading retailer customers, helping them to attract consumers to their stores.

The Growth Markets Division delivered a solid revenue performance in Q3 2018, increasing 8.9%. Higher volumes and positive price/mix underpinned the revenue growth, which continued to be broad-based across most categories and geographies. In spite of ongoing rising Group input costs and weaker currency, the competitive environment remains challenging with all Babycare suppliers maintaining a high level of pricing and promotional activity to gain share. Our focus to develop local production capabilities and leverage the Company's deep experience in retailer brands supported market outperformance in the Division.

Q3 2018 Healthcare Division revenue was essentially unchanged versus last year, down 0.1%. The sales mix benefited from the growth of Adult Pants and Light Inco products in most markets, which offset a slight decrease in volumes across other products. In light of sustained input cost inflation we will continue to implement efficiency measures and actively manage our customer and product portfolio.

Revenue in our Middle East and North Africa Division grew 16.2% in Q3 2018, demonstrating that our actions are delivering results. Despite heightened competition in Babycare, Division sales rose due to both higher volumes and positive price/mix. In Turkey, we benefited from our recently launched Canbebe baby diaper innovation supported by increased in-store activation and digital engagement with consumers, and further growth in Adult Inco, together with pricing actions in light of local currency weakness. We also had good growth in other markets including export sales.

Americas Retail Division had 1.9% revenue growth in Q3 2018, on top of a solid comparable revenue performance last year. Outside Brazil, revenue was up 14.2%, as our portfolio of local Babycare and Adult Inco brands in Mexico continued to perform well, and sales in the US accelerated due to strong Babycare and Femcare sales. In Brazil, we relaunched an enhanced Pom Pom diaper, with positive initial consumer feedback, following the relaunch of our Cremer diaper brand a few months earlier. Our team also successfully finalized the consolidation of production onto one site in Brazil, previously announced in May 2018, which includes new proprietary technology that will support our future growth plans. In addition to these operational milestones, we delivered further sequential revenue growth in Q3 2018 of 5% excluding FX versus Q2 2018, and positive adjusted EBITDA. The progress to date confirms that our comprehensive turnaround plan in Brazil is slowly but surely taking effect.

Q3 2018 revenue was driven by an increase outside of Europe, despite significant FX headwinds. The majority of Group reported revenue was from outside of Western Europe, in markets which are forecast to continue growing in the mid- to long-term.
The above press release and related financial information of Ontex Group NV for the nine months ended September 30, 2018 was authorized for issue in accordance with a resolution of the Board of Directors on November 6, 2018.

The following alternative performance measures (non-GAAP) have been included in this trading update since management believes that they are widely used by certain investors, securities analysts and other interested parties as supplemental measures of performance and liquidity. The alternative performance measures may not be comparable to similarly titled measures of other companies and have limitations as analytical tools and should not be considered in isolation or as a substitute for analysis of our operating results, our performance or our liquidity under IFRS.

Like-for-like revenue is defined as revenue at constant currency excluding change in perimeter of consolidation or M&A.

EBITDA is defined as earnings before net finance cost, income taxes, depreciation and amortisation. Adjusted EBITDA is defined as EBITDA plus non-recurring income and expenses. EBITDA and Adjusted EBITDA margins are EBITDA and Adjusted EBITDA divided by revenue.

Net financial debt is calculated by adding short-term and long-term debt and deducting cash and cash equivalents. LTM adjusted EBITDA is defined as EBITDA plus non-recurring income and expenses for the last twelve months (LTM).
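The alternative performance measures defined above are simple arithmetic on reported figures. A minimal sketch of those definitions follows; the input figures are hypothetical round numbers for illustration, not Ontex's reported results:

```python
def ebitda(operating_profit, depreciation, amortisation):
    """EBITDA: earnings before net finance cost, income taxes, depreciation
    and amortisation, built here by adding D&A back to operating profit."""
    return operating_profit + depreciation + amortisation

def adjusted_ebitda(ebitda_value, non_recurring):
    """Adjusted EBITDA: EBITDA plus non-recurring income and expenses."""
    return ebitda_value + non_recurring

def net_financial_debt(short_term_debt, long_term_debt, cash):
    """Short-term plus long-term debt, less cash and cash equivalents."""
    return short_term_debt + long_term_debt - cash

def margin(measure, revenue):
    """EBITDA or Adjusted EBITDA divided by revenue."""
    return measure / revenue

# Illustrative figures in EUR millions (not from the release):
e = ebitda(150.0, 40.0, 10.0)                    # 200.0
adj = adjusted_ebitda(e, 12.0)                   # 212.0
debt = net_financial_debt(100.0, 600.0, 180.0)   # 520.0
print(e, adj, debt, margin(adj, 1700.0))
```

LTM adjusted EBITDA would apply the same `adjusted_ebitda` arithmetic over a trailing twelve-month window of figures.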
I was showing property on Wednesday afternoon, and several other agents in my office were showing houses to their clients on Wednesday evening (Thanksgiving eve). We began negotiating a contract on this affordable, downtown property via fax, e-mail and telephone on Wednesday, and negotiations continued through Thanksgiving, towards what should be a ratified contract by this evening.

Erickson Avenue to Stone Spring Road Connector close to moving forward!

Among several other large road projects in the works in Harrisonburg, a new road connecting Erickson Avenue and Stone Spring Road is getting closer to becoming a reality. The City of Harrisonburg received 10 construction bids for this project, which will hopefully start in the spring, and take 18 months to complete. One significant change will be the removal of the train trestle at the intersection of South High Street (Route 42) and Erickson Avenue. Read more in today's Daily News Record.

Where has your money been? What has it been doing?

My aforementioned clients have decided to proceed with building at Preston Lake, and they closed on their lot purchase this week. We're aiming for completion by next May, and both they and I are excited to see the building process get underway. It's been a busy few weeks, making selections at Ferguson's and designing the kitchen with Shenandoah Millwork. More details will be forthcoming as we move through the process. If you have questions about buying or building at Preston Lake, feel free to call me (540-578-0102) or e-mail me (scott@cbfunkhouser.com).

I suppose plumbers may have been installing thermal expansion tanks (pictured above) for years, but I first started seeing them a few years ago, and wondered what they were and why they were being installed. A thermal expansion tank's purpose is to relieve the pressure created by thermal expansion when the water in a closed system is heated.
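To get a rough sense of how much extra volume an expansion tank has to absorb, mass conservation gives V_hot = V_cold × ρ_cold / ρ_hot. The heater size and the water densities below are assumed round numbers for illustration, not measured values:

```python
# Approximate density of water (kg/m^3) at a cold inlet and a typical
# heated temperature -- illustrative values, not lab measurements.
RHO_COLD_10C = 999.7   # ~10 degrees C
RHO_HOT_60C = 983.2    # ~60 degrees C

def expansion_volume_liters(heater_liters, rho_cold, rho_hot):
    """Extra volume created when a heater's worth of water warms up.
    Mass is conserved, so V_hot = V_cold * rho_cold / rho_hot."""
    return heater_liters * (rho_cold / rho_hot - 1.0)

# A common 50-gallon (~189 L) residential heater:
extra = expansion_volume_liters(189.0, RHO_COLD_10C, RHO_HOT_60C)
print(f"{extra:.2f} L of expansion")
```

A few liters does not sound like much, but water is nearly incompressible, so without a tank that extra volume shows up as a sharp pressure spike at the relief valve.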
Thus, if your hot water heater does not have an expansion tank, and is leaking or dripping from the pressure-relief valve, it would be wise to install a tank. The other aspect of these expansion tanks that mystified me for quite some time is that I assumed that they would be installed on the hot water line coming out of the hot water heater, but most that I observed were installed on the cold water line. As it turns out, the expansion tank can be installed either on the cold water line between the main valve and the water heater, or on the hot-water line within 3 to 5 feet of the water heater.

Do granite counter tops emit dangerous radon gas?

Well, in what may put your mind at ease, the Marble Institute of America released a report a few days ago (Nov 17, 2008) indicating among other things that "not one stone slab contributed to radon levels that even reached the average U.S. outdoor radon concentration of 0.4 picocuries per liter." If you're eyeing your granite counter tops warily from across the room, and need to be further reassured, read more from Nation's Building News.

Median residential property values went up 2-3% this year in Harrisonburg. One third of Harrisonburg's 12,000 properties did not see a change in value. A 3% increase on the median sales price of a home in Harrisonburg would only equate to a $41 annual increase in taxes. Comments have already begun on the DNR site, several from property owners who are in doubt of the value increases the City of Harrisonburg attributes to their homes. So -- what say you? Did your assessment go up or down? Do you think your new assessed value is accurate?

Buyers are lining up to buy a house before the holidays!

While shopping at Martins for half an hour this weekend I received 4 phone calls each related to a buyer (or their Realtor) wanting to view properties immediately.
Tom was answering general phone inquiries for 2 hours this morning at our office, and received 3 calls from first time buyers who are itching to buy -- 2 of them are looking at properties later today. Web traffic so far this month (Nov 1 - 23) on our company and agent web sites has surpassed full month traffic numbers for July, August, September and October. It will be interesting to review the market report numbers at the end of November and December to see whether all of this buyer excitement turns into closed real estate sales!

Shall we have lunch in New York today?

Exciting news from The Hook via RealCentralVA . . . Starting next fall, Charlottesville will have round-trip service to New York City, after passing through our nation's capital. With Charlottesville just over the mountain, this may open up some exciting transportation options for us here in the Shenandoah Valley! The real news is that this program (a three year pilot program) will be using state funds to reduce the cost of this new inter-city rail transportation option. Of note -- the funding must still be approved (on December 17) by the Commonwealth Transportation Board. Read more here.

The great mystery of showings - when and why do they happen?

Especially in our current market, it is very challenging to predict the "popularity" of a house -- in other words, how many prospective buyers will come to view the house. This comes to mind today, because over the weekend two new prospective buyers came on the scene for a very affordable home on East Rock Street. The house has had many showings, and even an offer --- but we haven't had any showings in the past few weeks. So it was a bit surprising to receive two calls within a few hours of each other with interest in viewing the house. After weeks of inactivity, showing activity will sometimes pick up quickly and with unknown cause.
On a different note, I have seen several listings lately that have been priced appropriately (in my view), marketed well, and have seen few if any showings in the first few months they are on the market. Sometimes even the houses you are sure will sell quickly, won't even show quickly. Could it be the evil flyer syndrome? And then there are the houses where buyers dutifully tour in and out several times a week, with no results to speak of. I know of several homes on the market right now that have had over 50 showings, and still have not received any offers. Just as it is difficult to predict the pace of a sale in today's market, it is very difficult to predict the pace of showings!

New loan guidelines - the underwriter must approve inspection results!?

Here is some interesting news that Jon Ischinger (Wells Fargo, Harrisonburg) shared with me last week... If a sales contract is contingent upon a well inspection, septic inspection or termite inspection, the loan underwriter must be privy to any issues, and must approve how the buyer and seller choose to address those issues. And this is on all new FHA and conventional loans!

It is my understanding from talking to Jon that this won't change the loan process drastically, but it will make things slower, and may make things more problematic if there are any issues with the termite, well or septic inspections. As Jon notes, "In the past you could have worked something out with the borrower/seller and it would happen very quickly, outside of the lender's scope - now, if there is any inspection mentioned in the purchase contract, we have to see it and send it to underwriting. If any maintenance is required, it will need to be addressed and re-inspected and then that documentation will have to be reviewed." I'm curious to see how this plays out in a real transaction --- but I'm not too surprised to see yet another tightening on the loan process as lenders become more conservative in their lending practices.
After missing the 7pm announcement of the JMU Football Playoff Schedule, I was delighted to have such a fast response when asking about the results on Facebook. The good news --- JMU could be at home for all three playoff games leading up to the National Championship!

Books-A-Million moving to the Valley Mall!?

Simon Property Group, interior renovations to existing space to create retail space for Books-A-Million, 1925-400A E. Market St., $1,275,000. It will be interesting to see who takes over the space on East Market Street where Books-A-Million currently exists.

The Virginia Resources Authority sold an unprecedented $215 million in infrastructure revenue bonds to raise funds for projects around Virginia. This was, in fact, the largest transaction in the pooled financing program in VRA history --- in what is otherwise a challenging economic time. "Bricks and mortar projects mean jobs and income in Virginia communities," said Sheryl Bailey, Executive Director of the Virginia Resources Authority. "We can't over-emphasize the importance of such projects in stimulating the local and state economy. Infrastructure is a key to America's economic recovery." According to the Daily Press, the projects to be financed will include "upgrades to bridges and wastewater treatment plants, replacement of water and sewer lines, and construction of a firehouse, a library, and a public safety academy."

Affordable Housing with a LOW interest rate!

The current VHDA interest rate is already low, at 5.875%. But for first time buyers looking for affordable housing, it can be tough to finance a purchase even at that low rate. So....how about 4.875%? If you're buying at Covenant Heights, a neighborhood being developed by Hope Community Builders (a non-profit group), depending on your income levels, you may be able to have the current VHDA rate lowered by an entire percentage point!
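A full percentage point is worth quantifying with the standard fixed-rate amortization formula. The $150,000 loan amount and 30-year term below are assumptions for illustration, not VHDA program terms:

```python
def monthly_payment(principal, annual_rate, years=30):
    """Standard fixed-rate mortgage payment: P * r / (1 - (1 + r)^-n),
    where r is the monthly rate and n the number of monthly payments."""
    r = annual_rate / 12.0
    n = years * 12
    return principal * r / (1.0 - (1.0 + r) ** -n)

base = monthly_payment(150_000, 0.05875)     # at the current VHDA rate
reduced = monthly_payment(150_000, 0.04875)  # with the one-point reduction
print(f"{base:.2f} vs {reduced:.2f}, saving {base - reduced:.2f}/month")
```

On those assumed terms the one-point cut saves a buyer roughly $90 or more every month for the life of the loan.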
And these are nice properties we're talking about here -- duplexes and townhomes with three bedrooms, all of which are pre-inspected and built to EarthCraft and EnergyStar standards. If you have a friend or co-worker who is seeking affordable housing, do them a favor and tell them about Covenant Heights!

Exciting web site changes are in the works!

I'm getting my hands dirty --- working on some exciting additions to the property search section of my web site --- to let you learn even more about each property for sale. There are at least 5 upgrades I hope to roll out in the next few weeks --- stay tuned, and if you have any suggestions for improvements to my web site, please let me know!

How close are buyers coming to the asking price?

One issue that often seems to be on a buyer's mind these days is the question of how much they should be able to negotiate off of an asking price for a house. Let's take a look at closings from the past 30 days to provide some insight into what buyers are actually accomplishing right now in the market. The average sale price to list price ratio is 97.39%. The median sale price to list price ratio is 97.87%. If you're considering buying in the near future, you should realize that this is certainly a buyer's market, but you won't necessarily have the ability to negotiate more than 10% off the asking price.

Have you eaten at Taste of Thai? The owners will soon be opening a new restaurant downtown. Learn more about Taste of Thai, Prasert, and more at "the state," a local blog that plans to interview many restaurant owners to provide an inside look at dining in Harrisonburg.
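The sale-to-list ratios quoted a few paragraphs up are straightforward to compute from closed-sale data. The price pairs below are made up for illustration, not actual MLS closings:

```python
from statistics import mean, median

# (sale_price, list_price) pairs -- hypothetical closings, not MLS data
closings = [
    (190_000, 200_000),
    (245_000, 250_000),
    (300_000, 300_000),
    (158_000, 165_000),
]

# Each ratio is what the buyer actually paid as a share of the asking price.
ratios = [sale / asking for sale, asking in closings]
print(f"average: {mean(ratios):.2%}")
print(f"median:  {median(ratios):.2%}")
```

The same two lines of arithmetic, run over a real 30-day window of closings, produce the 97.39% average and 97.87% median cited above.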
An extraordinary 79% of 2-16 year olds taking antipsychotics in 2009/10 were prescribed risperidone, a drug at the centre of multi-billion dollar lawsuits in the US and capable of causing irreversible damage to the nervous system and life-threatening diabetes. The number of children taking antipsychotics overall increased 69% between 2007/08 (5,727) and 2009/10 (9,683), 759 of them in 2010 aged 6 or under.

The Citizens Committee on Human Rights (CCHR) said an investigation is needed into the potential conflicts of interest between pharmaceutical companies and psychiatrists writing treatment guidelines and the influence of these on prescription trends. Concerns have escalated with federal government plans for GPs to screen 3-year-olds for highly subjective mental disorders, when GPs have already been heavily marketed to by the pharmaceutical-psychiatric consortium and prescribe 72% of antipsychotics, 86% of antidepressants and 94% of sedative-hypnotics.1

CCHR says Medicines Australia's proposed Code of Conduct disclosing conflicts of interest with doctors and consumer groups is a whitewash, needing federal legislation similar to that in the US which names all doctors receiving more than $100 from pharmaceutical companies. CCHR national executive director Shelley Wilkins says the pharmaceutical-psychiatric industry needs to disclose any funding of psychiatric training manuals, treatment guidelines, and research.

In 1998 in Australia, Janssen-Cilag sponsored the development of a Resource Kit for General Practitioners aimed at integrating GPs into outpatient services for young people who were given psychiatric treatment for early psychosis. The treatment algorithm in the Training Pack and subsequent Early Psychosis Treatment Guidelines is similar to that exposed in the recent Texas case against Janssen, which was developed by psychiatrists funded by Janssen around the same time as the Australian training pack and early psychosis guidelines were produced.
Rothman exposed how the company's phrase "unrestricted educational grant" was misleading, aimed really at expanding Risperdal's market. Treatment guidelines were turned into a powerful marketing tool, he said.2 Medical education and research were "thinly disguised marketing activities" and "funding of these activities created conflicts of interest that subverted scientific objectivity and professional medical integrity."3 Psychiatrists were paid to develop treatment guidelines with a $65,000 bonus for completing on time. Government officials were given kickbacks to make Risperdal a preferred drug in insurance rebates.

Like the US Schizophrenia Guidelines, the Australian Early Psychosis Training Pack, Australian Clinical Guidelines for Early Psychosis and The Diagnosis and Management of Psychosis: A Booklet for General Practitioners recommended risperidone, quetiapine and olanzapine as treatment for early psychosis/first episode psychosis.

PBS expenditure on antipsychotics has soared 3,700% since 1992/93. Expenditure on just three brand-name antipsychotics, Risperdal (risperidone), Zyprexa (olanzapine) and Seroquel (quetiapine), for the year ending 30 June 2011 reached $343,121,352; total costs with co-payments were $362,728,613.4 Between April 2010 and March 2011, Janssen in Australia spent over $1 million on 563 events attended by psychiatrists, representing 36% of all their sponsored medical events.

One J&J employee, Rob Kraner, explained J&J's approach to colleagues: 'One of the reasons Janssen committed substantial funding was to develop treatment guidelines/algorithms for schizophrenia that positioned atypicals as the first line agents (at the time atypicals were usually positioned after conventionals) and test it in a real world setting. The rationale was to develop this approach in Texas, find out the most effective way to roll it out, and then other states could replicate [it] with minimal investment.' (Italics added) [Page 18].
2010, October: The Australian Early Psychosis Treatment Guidelines, produced by Orygen Youth Health and edited by McGorry and others, again recommended risperidone, olanzapine, quetiapine and three other atypical antipsychotics.

Individual or Class Action lawsuits may be the only way for Australians to learn the truth about all adverse drug events and potential collusion between pharmaceutical companies and psychiatrists conducting industry-funded studies and/or writing treatment guidelines that could influence prescription sales in the country. Mental Health Minister Mark Butler does not appear to monitor conflicts of interest within the mental health system. An investigation should be conducted into pharmaceutical company funding of antipsychotic research and treatment guidelines, and any potential influence over government preference for prescription psychiatric drugs.

1. How does the Minister account for the 69% increase in 2-16 year olds prescribed antipsychotics between 2007/08 and 2009/10?
2. How does the Minister account for the fact that of 2-16 year olds taking antipsychotics in 2009/10 (9,683), 79% were prescribed risperidone, a drug which is the subject of lawsuits in the U.S. against one of its manufacturers, Janssen-Cilag?
3. What statistics are collected annually on the number of children/teens prescribed antipsychotics, antidepressants and psychostimulants through federally-funded youth mental health programs, such as headspace and Orygen Youth Mental Health?
4. If statistics in (3) are not collected, what will the Minister do to obtain such information and make it publicly accessible and accountable?
5. What investigation(s) have been initiated or will be initiated to determine what, if any, conflicts of interest may be influencing government treatment guidelines for psychosis, schizophrenia and depression in children, youths and the elderly?
6. How does the Minister explain the over-representation of risperidone use in the elderly [of all elderly aged 72 and above in 2009/10 taking antipsychotics (87,911), 48% were taking risperidone (42,478), 24% were taking Zyprexa (21,099), and 11% Seroquel (10,203)]?
7. Is the Minister aware of the recent lawsuits in the U.S. against Janssen-Cilag, the maker of Risperdal (risperidone), resulting in $1.9 billion in fines and settlements over misleading promotion and concealment of the adverse effects of risperidone?
8. Does the government have guidelines or any policy that makes Risperdal (risperidone), Seroquel (quetiapine) or Zyprexa (olanzapine) "preferred" drugs to be prescribed in mental health facilities?
9. Does the Minister intend implementing a regulation/law whereby any government-subsidised healthcare facility, organisation, and/or researcher and doctor must disclose complete past and present pharmaceutical company or medical device financial support or affiliation?
10. Can the Minister provide a current breakdown of the number of Australians prescribed antipsychotics, antidepressants and psychostimulants, by age, gender, state and by itemized drug in these classes?

Of the total number of 2-16 year olds taking antipsychotics in 2009/10 (9,683), 79% were prescribed risperidone, towering over quetiapine at 9% and olanzapine at 7%.

NB: The total number of people PBS reports take clozapine is low: 894 in 2007/08, 952 in 2008/09 and 1,003 in 2009/10, compared to olanzapine for the same period: 101,012; 105,337 and 104,407 respectively.
That means the risk of death could be 87,000 or greater for those taking Clozapine.

In Australia, obtaining statistics on psychiatric drug usage, especially by age group and drug, is a lengthy and costly affair—upwards of $4,000 per request. Medicare has informed those seeking such statistics that they now must provide a detailed explanation for wanting the information and how it will be used. This is information that should be provided annually in health care statistics as a matter of transparency.

References:
- Early Psychosis Training Pack, Module 4, p. 6.
- Early Psychosis Training Pack, Module 4, pp. 15-16.
- Richard Gosden, Ph.D., Chapter 10, Schismatic Mind: Early Psychosis, pp. 298-299.
- "Johnson & Johnson fined $1.1bn in Risperdal case", Herald Sun, 12 April 2012.
- "Clozapine monitoring least stringent in Australia", Medical Observer, 28 Feb 2012.
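Several of the headline percentages in this piece follow directly from the raw counts quoted. A quick arithmetic check, using only numbers that appear above:

```python
def pct(part, whole):
    """Percentage of the whole, rounded to the nearest whole point."""
    return round(100.0 * part / whole)

# 2-16 year olds on antipsychotics: 5,727 (2007/08) -> 9,683 (2009/10)
increase = pct(9_683 - 5_727, 5_727)

# Elderly (72+) on antipsychotics in 2009/10: 87,911 total
risperidone_share = pct(42_478, 87_911)
zyprexa_share = pct(21_099, 87_911)

print(increase, risperidone_share, zyprexa_share)  # 69 48 24
```

These reproduce the 69% increase and the 48%/24% elderly risperidone and Zyprexa shares cited in the text (the article's 11% Seroquel figure is the same calculation truncated rather than rounded).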
Top of the Ticket

Opinion: Rev. Jeremiah Wright to speak in Detroit at month's end

The one piece of good news for Barack Obama in Rev. Jeremiah Wright's latest plan for a return to the pulpit (of a sort) is that it will occur five days after Pennsylvania's April 22 primary (though in plenty of time to have a potential impact on the May 6 primaries in North Carolina and Indiana). The Associated Press reports that Obama's retired minister, whose occasionally inflammatory exhortations on race in particular and the United States in general created major political angst for his one-time parishioner, will deliver the keynote address on April 27 at a dinner in Detroit sponsored by that city's branch of the NAACP.

LaToya Henry, the chapter's communications coordinator, expanded on the invite in an e-mail to Times reporter Ben DuBose. Wright, she said, 'has challenged the nation, challenged our comfort zone and stimulated nation-wide discussion on the issues of how we must move forward together as both a nation and a people. We look forward to his participation' at the NAACP gathering.

The dinner's theme, according to Henry, is 'A Change is Going to Come.' The nation, she said, 'is at a crossroads as it relates to the ultimate direction we will take politically, socially and in race relations.... As we prepare for the challenges that we face not only today but as we seek to come together and understand each other as a people for tomorrow, we believe ... that this is a key moment in history to take it to the next level.' The Detroit NAACP branch, she added, 'continues to be bold, progressive and far-reaching in its efforts to both challenge its own membership and the community itself.'

It is certainly living up to that pledge with the decision to put Wright into the spotlight.
After the furor surrounding his past remarks erupted in mid-March, Wright had been scheduled later that month to deliver a series of sermons in the Tampa, Fla., area -- a trip he'd been making for years. But those appearances were cancelled amid concerns about the media swarm the black churches hosting Wright would have confronted. -- Don Frederick
Spotlight On a Murder (Blu-ray) Arrow
Categories: Blu-ray, Drama, Movies, Mystery, Thriller
Tags: Arrow, Australia-compatible Blu-ray, Blu-ray

When the terminally ill Count Hervé de Kerloquen (Pierre Brasseur, Goto, Isle of Love) vanishes without trace, his heirs are told that they have to wait five years before he can be declared legally dead, forcing them to devise ways of paying for the upkeep of the vast family château in the meantime. While they set about transforming the place into an elaborate son et lumière tourist attraction, they are beset by a series of tragic accidents – if that's really what they are…

The little-known third feature by the great French maverick Georges Franju (Eyes Without a Face, Judex) is a delightfully playful romp through Agatha Christie territory, whose script (written by Pierre Boileau and Thomas Narcejac of Les Diaboliques and Vertigo fame) is mischievously aware of the hoariest old murder-mystery clichés and gleefully exploits as many of them as possible. They're equally aware of the detective story's antecedents in the Gothic novel, a connection that Franju is only too happy to emphasise visually at every opportunity thanks to his magnificent main location. A young Jean-Louis Trintignant (The Conformist, Amour) is amongst the Kerloquen heirs.

Special features:
- High Definition Blu-ray (1080p) and Standard Definition DVD presentations of the feature, restored by Gaumont
- Uncompressed French Mono 1.0 PCM Audio
- Vintage production featurette from 1960, shot on location and including interviews with Georges Franju and actors Pascale Audret, Pierre Brasseur, Marianne Koch, Dany Saval and Jean-Louis Trintignant
- Original theatrical trailer
- Reversible sleeve with original and newly commissioned artwork by Peter Strain
#include <cassert>
#include <iostream>

#include <QApplication>
#include <QFileIconProvider>
#include <QFileInfo>
#include <QSet>
#include <QString>
#include <QStyle>
#include <QUrl>
#include <QVariant>

#include <libtransmission/transmission.h>
#include <libtransmission/bencode.h>
#include <libtransmission/utils.h> /* tr_new0, tr_strdup */

#include "app.h"
#include "prefs.h"
#include "torrent.h"
#include "utils.h"

Torrent :: Torrent( Prefs& prefs, int id ):
    magnetTorrent( false ),
    myPrefs( prefs )
{
    for( int i=0; i<PROPERTY_COUNT; ++i )
        assert( myProperties[i].id == i );

    setInt( ID, id );
    setIcon( MIME_ICON, QApplication::style()->standardIcon( QStyle::SP_FileIcon ) );
}

Torrent :: ~Torrent( )
{
}

/***
****
***/

Torrent :: Property
Torrent :: myProperties[] =
{
    { ID, "id", QVariant::Int, INFO },
    { UPLOAD_SPEED, "rateUpload", QVariant::ULongLong, STAT }, /* Bps */
    { DOWNLOAD_SPEED, "rateDownload", QVariant::ULongLong, STAT }, /* Bps */
    { DOWNLOAD_DIR, "downloadDir", QVariant::String, STAT },
    { ACTIVITY, "status", QVariant::Int, STAT },
    { NAME, "name", QVariant::String, INFO },
    { ERROR, "error", QVariant::Int, STAT },
    { ERROR_STRING, "errorString", QVariant::String, STAT },
    { SIZE_WHEN_DONE, "sizeWhenDone", QVariant::ULongLong, STAT },
    { LEFT_UNTIL_DONE, "leftUntilDone", QVariant::ULongLong, STAT },
    { HAVE_UNCHECKED, "haveUnchecked", QVariant::ULongLong, STAT },
    { HAVE_VERIFIED, "haveValid", QVariant::ULongLong, STAT },
    { DESIRED_AVAILABLE, "desiredAvailable", QVariant::ULongLong, STAT },
    { TOTAL_SIZE, "totalSize", QVariant::ULongLong, INFO },
    { PIECE_SIZE, "pieceSize", QVariant::ULongLong, INFO },
    { PIECE_COUNT, "pieceCount", QVariant::Int, INFO },
    { PEERS_GETTING_FROM_US, "peersGettingFromUs", QVariant::Int, STAT },
    { PEERS_SENDING_TO_US, "peersSendingToUs", QVariant::Int, STAT },
    { WEBSEEDS_SENDING_TO_US, "webseedsSendingToUs", QVariant::Int, STAT_EXTRA },
    { PERCENT_DONE, "percentDone", QVariant::Double, STAT },
    { METADATA_PERCENT_DONE, "metadataPercentComplete", QVariant::Double, STAT },
    { PERCENT_VERIFIED, "recheckProgress", QVariant::Double, STAT },
    { DATE_ACTIVITY, "activityDate", QVariant::DateTime, STAT_EXTRA },
    { DATE_ADDED, "addedDate", QVariant::DateTime, INFO },
    { DATE_STARTED, "startDate", QVariant::DateTime, STAT_EXTRA },
    { DATE_CREATED, "dateCreated", QVariant::DateTime, INFO },
    { PEERS_CONNECTED, "peersConnected", QVariant::Int, STAT },
    { ETA, "eta", QVariant::Int, STAT },
    { RATIO, "uploadRatio", QVariant::Double, STAT },
    { DOWNLOADED_EVER, "downloadedEver", QVariant::ULongLong, STAT },
    { UPLOADED_EVER, "uploadedEver", QVariant::ULongLong, STAT },
    { FAILED_EVER, "corruptEver", QVariant::ULongLong, STAT_EXTRA },
    { TRACKERS, "trackers", QVariant::StringList, STAT },
    { TRACKERSTATS, "trackerStats", TrTypes::TrackerStatsList, STAT_EXTRA },
    { MIME_ICON, "ccc", QVariant::Icon, DERIVED },
    { SEED_RATIO_LIMIT, "seedRatioLimit", QVariant::Double, STAT },
    { SEED_RATIO_MODE, "seedRatioMode", QVariant::Int, STAT },
    { SEED_IDLE_LIMIT, "seedIdleLimit", QVariant::Int, STAT_EXTRA },
    { SEED_IDLE_MODE, "seedIdleMode", QVariant::Int, STAT_EXTRA },
    { DOWN_LIMIT, "downloadLimit", QVariant::Int, STAT_EXTRA }, /* KB/s */
    { DOWN_LIMITED, "downloadLimited", QVariant::Bool, STAT_EXTRA },
    { UP_LIMIT, "uploadLimit", QVariant::Int, STAT_EXTRA }, /* KB/s */
    { UP_LIMITED, "uploadLimited", QVariant::Bool, STAT_EXTRA },
    { HONORS_SESSION_LIMITS, "honorsSessionLimits", QVariant::Bool, STAT_EXTRA },
    { PEER_LIMIT, "peer-limit", QVariant::Int, STAT_EXTRA },
    { HASH_STRING, "hashString", QVariant::String, INFO },
    { IS_FINISHED, "isFinished", QVariant::Bool, STAT },
    { IS_PRIVATE, "isPrivate", QVariant::Bool, INFO },
    { IS_STALLED, "isStalled", QVariant::Bool, STAT },
    { COMMENT, "comment", QVariant::String, INFO },
    { CREATOR, "creator", QVariant::String, INFO },
    { MANUAL_ANNOUNCE_TIME, "manualAnnounceTime", QVariant::DateTime, STAT_EXTRA },
    { PEERS, "peers", TrTypes::PeerList, STAT_EXTRA },
    { TORRENT_FILE, "torrentFile", QVariant::String, STAT_EXTRA },
{ BANDWIDTH_PRIORITY, "bandwidthPriority", QVariant::Int, STAT_EXTRA }, { QUEUE_POSITION, "queuePosition", QVariant::Int, STAT }, }; Torrent :: KeyList Torrent :: buildKeyList( Group group ) { KeyList keys; if( keys.empty( ) ) for( int i=0; i<PROPERTY_COUNT; ++i ) if( myProperties[i].id==ID || myProperties[i].group==group ) keys << myProperties[i].key; return keys; } const Torrent :: KeyList& Torrent :: getInfoKeys( ) { static KeyList keys; if( keys.isEmpty( ) ) keys << buildKeyList( INFO ) << "files"; return keys; } const Torrent :: KeyList& Torrent :: getStatKeys( ) { static KeyList keys( buildKeyList( STAT ) ); return keys; } const Torrent :: KeyList& Torrent :: getExtraStatKeys( ) { static KeyList keys; if( keys.isEmpty( ) ) keys << buildKeyList( STAT_EXTRA ) << "fileStats"; return keys; } bool Torrent :: setInt( int i, int value ) { bool changed = false; assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::Int ); if( myValues[i].isNull() || myValues[i].toInt()!=value ) { myValues[i].setValue( value ); changed = true; } return changed; } bool Torrent :: setBool( int i, bool value ) { bool changed = false; assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::Bool ); if( myValues[i].isNull() || myValues[i].toBool()!=value ) { myValues[i].setValue( value ); changed = true; } return changed; } bool Torrent :: setDouble( int i, double value ) { bool changed = false; assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::Double ); if( myValues[i].isNull() || myValues[i].toDouble()!=value ) { myValues[i].setValue( value ); changed = true; } return changed; } bool Torrent :: setDateTime( int i, const QDateTime& value ) { bool changed = false; assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::DateTime ); if( myValues[i].isNull() || myValues[i].toDateTime()!=value ) { myValues[i].setValue( value ); changed = true; } return changed; } bool Torrent :: setSize( int i, 
qulonglong value ) { bool changed = false; assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::ULongLong ); if( myValues[i].isNull() || myValues[i].toULongLong()!=value ) { myValues[i].setValue( value ); changed = true; } return changed; } bool Torrent :: setString( int i, const char * value ) { bool changed = false; assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::String ); if( myValues[i].isNull() || myValues[i].toString()!=value ) { myValues[i].setValue( QString::fromUtf8( value ) ); changed = true; } return changed; } bool Torrent :: setIcon( int i, const QIcon& value ) { assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::Icon ); myValues[i].setValue( value ); return true; } int Torrent :: getInt( int i ) const { assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::Int ); return myValues[i].toInt( ); } QDateTime Torrent :: getDateTime( int i ) const { assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::DateTime ); return myValues[i].toDateTime( ); } bool Torrent :: getBool( int i ) const { assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::Bool ); return myValues[i].toBool( ); } qulonglong Torrent :: getSize( int i ) const { assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::ULongLong ); return myValues[i].toULongLong( ); } double Torrent :: getDouble( int i ) const { assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::Double ); return myValues[i].toDouble( ); } QString Torrent :: getString( int i ) const { assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::String ); return myValues[i].toString( ); } QIcon Torrent :: getIcon( int i ) const { assert( 0<=i && i<PROPERTY_COUNT ); assert( myProperties[i].type == QVariant::Icon ); return myValues[i].value<QIcon>(); } /*** **** ***/ bool Torrent :: getSeedRatio( double& ratio ) 
const { bool isLimited; switch( seedRatioMode( ) ) { case TR_RATIOLIMIT_SINGLE: isLimited = true; ratio = seedRatioLimit( ); break; case TR_RATIOLIMIT_GLOBAL: if(( isLimited = myPrefs.getBool( Prefs :: RATIO_ENABLED ))) ratio = myPrefs.getDouble( Prefs :: RATIO ); break; default: // TR_RATIOLIMIT_UNLIMITED: isLimited = false; break; } return isLimited; } bool Torrent :: hasFileSubstring( const QString& substr ) const { foreach( const TrFile file, myFiles ) if( file.filename.contains( substr, Qt::CaseInsensitive ) ) return true; return false; } bool Torrent :: hasTrackerSubstring( const QString& substr ) const { foreach( QString s, myValues[TRACKERS].toStringList() ) if( s.contains( substr, Qt::CaseInsensitive ) ) return true; return false; } int Torrent :: compareSeedRatio( const Torrent& that ) const { double a; double b; const bool has_a = getSeedRatio( a ); const bool has_b = that.getSeedRatio( b ); if( !has_a && !has_b ) return 0; if( !has_a || !has_b ) return has_a ? -1 : 1; if( a < b ) return -1; if( a > b ) return 1; return 0; } int Torrent :: compareRatio( const Torrent& that ) const { const double a = ratio( ); const double b = that.ratio( ); if( (int)a == TR_RATIO_INF && (int)b == TR_RATIO_INF ) return 0; if( (int)a == TR_RATIO_INF ) return 1; if( (int)b == TR_RATIO_INF ) return -1; if( a < b ) return -1; if( a > b ) return 1; return 0; } int Torrent :: compareETA( const Torrent& that ) const { const bool haveA( hasETA( ) ); const bool haveB( that.hasETA( ) ); if( haveA && haveB ) return getETA() - that.getETA(); if( haveA ) return 1; if( haveB ) return -1; return 0; } int Torrent :: compareTracker( const Torrent& that ) const { Q_UNUSED( that ); // FIXME return 0; } /*** **** ***/ void Torrent :: updateMimeIcon( ) { const FileList& files( myFiles ); QIcon icon; if( files.size( ) > 1 ) icon = QFileIconProvider().icon( QFileIconProvider::Folder ); else if( files.size( ) == 1 ) icon = Utils :: guessMimeIcon( files.at(0).filename ); else icon = QIcon( ); 
setIcon( MIME_ICON, icon ); } /*** **** ***/ void Torrent :: notifyComplete( ) const { // if someone wants to implement notification, here's the hook. } /*** **** ***/ void Torrent :: update( tr_benc * d ) { bool changed = false; const bool was_seed = isSeed( ); const uint64_t old_verified_size = haveVerified( ); for( int i=0; i<PROPERTY_COUNT; ++i ) { tr_benc * child = tr_bencDictFind( d, myProperties[i].key ); if( !child ) continue; switch( myProperties[i].type ) { case QVariant :: Int: { int64_t val; if( tr_bencGetInt( child, &val ) ) changed |= setInt( i, val ); break; } case QVariant :: Bool: { bool val; if( tr_bencGetBool( child, &val ) ) changed |= setBool( i, val ); break; } case QVariant :: String: { const char * val; if( tr_bencGetStr( child, &val ) ) changed |= setString( i, val ); break; } case QVariant :: ULongLong: { int64_t val; if( tr_bencGetInt( child, &val ) ) changed |= setSize( i, val ); break; } case QVariant :: Double: { double val; if( tr_bencGetReal( child, &val ) ) changed |= setDouble( i, val ); break; } case QVariant :: DateTime: { int64_t val; if( tr_bencGetInt( child, &val ) && val ) changed |= setDateTime( i, QDateTime :: fromTime_t( val ) ); break; } case QVariant :: StringList: case TrTypes :: PeerList: break; default: assert( 0 && "unhandled type" ); } } tr_benc * files; if( tr_bencDictFindList( d, "files", &files ) ) { const char * str; int64_t intVal; int i = 0; myFiles.clear( ); tr_benc * child; while(( child = tr_bencListChild( files, i ))) { TrFile file; file.index = i++; if( tr_bencDictFindStr( child, "name", &str ) ) file.filename = QString::fromUtf8( str ); if( tr_bencDictFindInt( child, "length", &intVal ) ) file.size = intVal; myFiles.append( file ); } updateMimeIcon( ); changed = true; } if( tr_bencDictFindList( d, "fileStats", &files ) ) { const int n = tr_bencListSize( files ); for( int i=0; i<n && i<myFiles.size(); ++i ) { int64_t intVal; bool boolVal; tr_benc * child = tr_bencListChild( files, i ); TrFile& file( 
myFiles[i] ); if( tr_bencDictFindInt( child, "bytesCompleted", &intVal ) ) file.have = intVal; if( tr_bencDictFindBool( child, "wanted", &boolVal ) ) file.wanted = boolVal; if( tr_bencDictFindInt( child, "priority", &intVal ) ) file.priority = intVal; } changed = true; } tr_benc * trackers; if( tr_bencDictFindList( d, "trackers", &trackers ) ) { const char * str; int i = 0; QStringList list; tr_benc * child; while(( child = tr_bencListChild( trackers, i++ ))) { if( tr_bencDictFindStr( child, "announce", &str )) { dynamic_cast<MyApp*>(QApplication::instance())->favicons.add( QUrl(QString::fromUtf8(str)) ); list.append( QString::fromUtf8( str ) ); } } if( myValues[TRACKERS] != list ) { myValues[TRACKERS].setValue( list ); changed = true; } } tr_benc * trackerStats; if( tr_bencDictFindList( d, "trackerStats", &trackerStats ) ) { tr_benc * child; TrackerStatsList trackerStatsList; int childNum = 0; while(( child = tr_bencListChild( trackerStats, childNum++ ))) { bool b; int64_t i; const char * str; TrackerStat trackerStat; if( tr_bencDictFindStr( child, "announce", &str ) ) { trackerStat.announce = QString::fromUtf8( str ); dynamic_cast<MyApp*>(QApplication::instance())->favicons.add( QUrl( trackerStat.announce ) ); } if( tr_bencDictFindInt( child, "announceState", &i ) ) trackerStat.announceState = i; if( tr_bencDictFindInt( child, "downloadCount", &i ) ) trackerStat.downloadCount = i; if( tr_bencDictFindBool( child, "hasAnnounced", &b ) ) trackerStat.hasAnnounced = b; if( tr_bencDictFindBool( child, "hasScraped", &b ) ) trackerStat.hasScraped = b; if( tr_bencDictFindStr( child, "host", &str ) ) trackerStat.host = QString::fromUtf8( str ); if( tr_bencDictFindInt( child, "id", &i ) ) trackerStat.id = i; if( tr_bencDictFindBool( child, "isBackup", &b ) ) trackerStat.isBackup = b; if( tr_bencDictFindInt( child, "lastAnnouncePeerCount", &i ) ) trackerStat.lastAnnouncePeerCount = i; if( tr_bencDictFindStr( child, "lastAnnounceResult", &str ) ) 
trackerStat.lastAnnounceResult = QString::fromUtf8(str); if( tr_bencDictFindInt( child, "lastAnnounceStartTime", &i ) ) trackerStat.lastAnnounceStartTime = i; if( tr_bencDictFindBool( child, "lastAnnounceSucceeded", &b ) ) trackerStat.lastAnnounceSucceeded = b; if( tr_bencDictFindInt( child, "lastAnnounceTime", &i ) ) trackerStat.lastAnnounceTime = i; if( tr_bencDictFindBool( child, "lastAnnounceTimedOut", &b ) ) trackerStat.lastAnnounceTimedOut = b; if( tr_bencDictFindStr( child, "lastScrapeResult", &str ) ) trackerStat.lastScrapeResult = QString::fromUtf8( str ); if( tr_bencDictFindInt( child, "lastScrapeStartTime", &i ) ) trackerStat.lastScrapeStartTime = i; if( tr_bencDictFindBool( child, "lastScrapeSucceeded", &b ) ) trackerStat.lastScrapeSucceeded = b; if( tr_bencDictFindInt( child, "lastScrapeTime", &i ) ) trackerStat.lastScrapeTime = i; if( tr_bencDictFindBool( child, "lastScrapeTimedOut", &b ) ) trackerStat.lastScrapeTimedOut = b; if( tr_bencDictFindInt( child, "leecherCount", &i ) ) trackerStat.leecherCount = i; if( tr_bencDictFindInt( child, "nextAnnounceTime", &i ) ) trackerStat.nextAnnounceTime = i; if( tr_bencDictFindInt( child, "nextScrapeTime", &i ) ) trackerStat.nextScrapeTime = i; if( tr_bencDictFindInt( child, "scrapeState", &i ) ) trackerStat.scrapeState = i; if( tr_bencDictFindInt( child, "seederCount", &i ) ) trackerStat.seederCount = i; if( tr_bencDictFindInt( child, "tier", &i ) ) trackerStat.tier = i; trackerStatsList << trackerStat; } myValues[TRACKERSTATS].setValue( trackerStatsList ); changed = true; } tr_benc * peers; if( tr_bencDictFindList( d, "peers", &peers ) ) { tr_benc * child; PeerList peerList; int childNum = 0; while(( child = tr_bencListChild( peers, childNum++ ))) { double d; bool b; int64_t i; const char * str; Peer peer; if( tr_bencDictFindStr( child, "address", &str ) ) peer.address = QString::fromUtf8( str ); if( tr_bencDictFindStr( child, "clientName", &str ) ) peer.clientName = QString::fromUtf8( str ); if( 
tr_bencDictFindBool( child, "clientIsChoked", &b ) ) peer.clientIsChoked = b; if( tr_bencDictFindBool( child, "clientIsInterested", &b ) ) peer.clientIsInterested = b; if( tr_bencDictFindStr( child, "flagStr", &str ) ) peer.flagStr = QString::fromUtf8( str ); if( tr_bencDictFindBool( child, "isDownloadingFrom", &b ) ) peer.isDownloadingFrom = b; if( tr_bencDictFindBool( child, "isEncrypted", &b ) ) peer.isEncrypted = b; if( tr_bencDictFindBool( child, "isIncoming", &b ) ) peer.isIncoming = b; if( tr_bencDictFindBool( child, "isUploadingTo", &b ) ) peer.isUploadingTo = b; if( tr_bencDictFindBool( child, "peerIsChoked", &b ) ) peer.peerIsChoked = b; if( tr_bencDictFindBool( child, "peerIsInterested", &b ) ) peer.peerIsInterested = b; if( tr_bencDictFindInt( child, "port", &i ) ) peer.port = i; if( tr_bencDictFindReal( child, "progress", &d ) ) peer.progress = d; if( tr_bencDictFindInt( child, "rateToClient", &i ) ) peer.rateToClient = Speed::fromBps( i ); if( tr_bencDictFindInt( child, "rateToPeer", &i ) ) peer.rateToPeer = Speed::fromBps( i ); peerList << peer; } myValues[PEERS].setValue( peerList ); changed = true; } if( changed ) emit torrentChanged( id( ) ); if( !was_seed && isSeed() && (old_verified_size>0) ) emit torrentCompleted( id( ) ); } QString Torrent :: activityString( ) const { QString str; switch( getActivity( ) ) { case TR_STATUS_STOPPED: str = isFinished() ? 
tr( "Finished" ): tr( "Paused" ); break; case TR_STATUS_CHECK_WAIT: str = tr( "Queued for verification" ); break; case TR_STATUS_CHECK: str = tr( "Verifying local data" ); break; case TR_STATUS_DOWNLOAD_WAIT: str = tr( "Queued for download" ); break; case TR_STATUS_DOWNLOAD: str = tr( "Downloading" ); break; case TR_STATUS_SEED_WAIT: str = tr( "Queued for seeding" ); break; case TR_STATUS_SEED: str = tr( "Seeding" ); break; } return str; } QString Torrent :: getError( ) const { QString s = getString( ERROR_STRING ); switch( getInt( ERROR ) ) { case TR_STAT_TRACKER_WARNING: s = tr( "Tracker gave a warning: %1" ).arg( s ); break; case TR_STAT_TRACKER_ERROR: s = tr( "Tracker gave an error: %1" ).arg( s ); break; case TR_STAT_LOCAL_ERROR: s = tr( "Error: %1" ).arg( s ); break; default: s.clear(); break; } return s; } QPixmap TrackerStat :: getFavicon( ) const { MyApp * myApp = dynamic_cast<MyApp*>(QApplication::instance()); return myApp->favicons.find( QUrl( announce ) ); }
{ "redpajama_set_name": "RedPajamaGithub" }
Open House Dublin is the biggest celebration of architecture in Ireland, presented by the Irish Architecture Foundation. Over the weekend, more than 100 events will allow visitors to explore the architecture of the city, with special tours by hundreds of professionals and enthusiasts, completely free. The festival showcases the wealth and breadth of Irish architecture: buildings of all types and periods open their doors, from the splendour of Georgian Dublin to breathtaking contemporary spaces and places. The event showcases the most iconic buildings in the city as well as the smallest, most beautiful interventions. Some of Heritage Island's Dublin members participate.
{ "redpajama_set_name": "RedPajamaC4" }
Fishermen setting out for the day early in the morning. They will probably catch Snoek, and maybe a little Yellow Tail or Cape Salmon. Muizenberg has always had a very active fishing community, and one of the benefits of living in the southern part of Cape Town is having a good supply of fresh fish. You can often see the line fishermen fishing on the beach, especially early in the morning and in the late afternoon. Sometimes you see the boats coming in, the smaller boats often landing on the beach, offloading their cargo directly onto the back of bakkies to go off for resale. Driving to work on Strandfontein beach at about 8:00am, I spotted these fishermen taking their small rowing boat out. The boat reminded me of a bygone era of fishing, the days before motors, marine radios and GPS. The sepia effect takes it back in time.

You remind me of my childhood days. I grew up in Muizenberg; our family home used to be at No. 1 Baden Powell Drive. My dad Victor Collop worked for the Cape City Council as an electrician. My three older brothers were fishermen in Muizenberg. In those days they had bigger boats: four men rowing, one casting the net, and the "skipper". A fishing crew would be 15 to 20 men. I still remember the owners of the boat, the Van Der Pool family, and the skipper was Mr. Ross from Vrygrond. As a boy of 11 years I witnessed how the boat capsized and how Mr. Ross and two other fishermen drowned. On that day it should have been my brothers drowning. One of my brothers, Stephen, risked his life and saved two men. The following day in the Cape Argus the story appeared with a photo of a white man as the hero. Friend, we were forcefully removed from Muizenberg in 1968 and dumped in Parkwood Estate. What a sad day it was.

Robert, that is a remarkable and very sad story. A little piece of history. Thank you for sharing it with me.
{ "redpajama_set_name": "RedPajamaC4" }
Insight: GTU Highlights for August 2016

The August 2016 edition of the GTU's Insight email newsletter is out, featuring short stories about:

The Sylvia Ludins Art Exhibition at the GTU Library
A Remembering of DSPT's Father Michael Morris
A Cover Feature about the GTU in In Trust Magazine
BJRT's Special Issue honoring Arthur Holder and Judith Berling
Naomi Seidman's New Book, The Marriage Plot

Read the August 2016 Insight here. Each month, Insight draws attention to top news from the GTU and its member schools, academic centers, and affiliates. Use the form to the right to have each monthly edition emailed directly...

Center for Jewish Studies Welcomes Deena Aranoff as Director

For immediate release, January 28, 2016. The Richard S. Dinner Center for Jewish Studies (CJS) at the Graduate Theological Union is pleased to welcome Dr. Deena Aranoff as the Center's new director. Dr. Aranoff, who has worked with CJS and served on the GTU faculty since 2006, takes over the role from Dr. Naomi Seidman, who served as CJS director for the previous 16 years. Dr. Seidman will continue to work with CJS and its students as the Graduate Theological Union's Koret Professor of Jewish Literature. Dr. Aranoff, whose teaching specialties include rabbinic literature and medieval Jewish...

Naomi Seidman Is Awarded Guggenheim Fellowship

For immediate release, April 7, 2016. The John Simon Guggenheim Memorial Foundation has announced that Dr. Naomi Seidman, Koret Professor of Jewish Culture at the Graduate Theological Union, has received a Guggenheim Fellowship for 2016. Dr. Seidman is among a diverse group of 178 scholars, artists, and scientists selected to receive the prestigious award this year; Fellows for 2016 were chosen from a field of nearly 3,000 applicants. The Fellowship, granted for Dr. Seidman's work in the field of literary criticism, will support development of her upcoming book, tentatively titled The Navel of...

From Mosques to Museums

Sarlo Award-winner Dr. Munir Jiwa reflects on art and Muslim identity, interreligious dialogue, and the growth of the GTU's Center for Islamic Studies. From the Fall 2015 issue of Currents. What does it mean to be Muslim, and how does Islamic tradition find expression in contemporary life? Some might expect to find the answer to such questions by visiting a mosque or speaking with an imam. But throughout his academic career, Dr. Munir Jiwa has sought to address these questions more expansively. "Being Muslim is not just a theological commitment," says Jiwa, "it can also be a cultural or...

CJS at 50: Remembering Our Beginnings and Becomings

From the Spring 2018 issue of SKYLIGHT. In 1964, just two years after the founding of the Graduate Theological Union as a partnership of Christian seminaries, the school's dean, John Dillenberger, approached the Conservative and Reform Movements to share his interest in establishing Jewish Studies on campus "to stand in its own right in relation to other studies, and not just as an adjunct to Protestant studies." The radical vision of the early GTU is well reflected in its desire to establish a home for Jewish studies supported rather than constrained by its...
{ "redpajama_set_name": "RedPajamaCommonCrawl" }
Have you seen couples who are together but seldom stay at peace? Always criticizing and complaining about each other, they seem to end up in an argument over every small and big thing. Every time you see them you wonder why they are together, or whether maybe they'd be better off alone. Or are you in one such pair where things are just getting more and more toxic with each passing day? Well, I've been a part of those gigs, and sometimes situations are such that you can't help but burst out. And other times, I had just chosen someone wrong for myself. Gladly, I can now say that a string of associations has made an image clear in my head. Studying other successful relationships around me, I found some common factors required for something to work long term.

Oh boy! Haven't you heard this one before? There are different angles of looking at the same thing, and what most of us do is try to force-feed our ideologies onto others. For once, just put down your opinion in a way which is less like the start of an argument.

Quick Tip – Watch that voice tone. It's funny how fast the topic gets dissolved when one suddenly starts yelling to make their point seem valid.

The communication gap has been a devil since the beginning of time. The battle is won when we are able to surpass it and put across our thoughts effectively. Even after a battle of words, make sure that the both of you sit to understand what created the issue and how it could have been handled better.

Quick Tip – Make the first move, no matter who's to blame.

Trust them even when you aren't sure what they're up to. Place your hands on their shoulders and give them the assurance. Sometimes all a person needs is to know that they're not alone. Be a listener first and, if required, a speaker. Even when they prioritize other avenues like family and friends before you, don't leave their back.

I feel like the biggest sign of whether a relationship is healthy or not is whether the two people have any positive effects on each other. We tend to develop the habits of people we spend most of our time with. I have seen people grow after they found their perfect partner, and after a time their present self took a full 180° turn for the good.

Quick Tip – Develop good habits together; do small things which will benefit both of you. If they ask you to try something for your benefit, it's about time you listen.

Most partners don't have similar routines. It's obvious that there will be moments when you crave them, but they won't be there. As much as you respect and value your time, please do the same for them. Don't keep them waiting for a call or meeting, knowing that you might not be able to make it.

Quick Tip – I know you don't wanna disappoint them, but don't lead them down the wrong path.

In most of my past relationships I've spent a considerable amount of my time losing sleep over the future. There's no time for flings; please voice your feelings. If you are still unsure, tell them, and once you feel that you've found a match, be honest. It's not wise to keep someone hanging.

Quick Tip – No double games, please. Stop messing with their heads; it's not even funny.

When transitioning from friends to partners and then spouses, we forget how pure and important it is to still remain friends. Sometimes all you need is a friend. A light-hearted conversation and a good laugh. Grab some drinks and binge-watch some series. Lighten up!

Quick Tip – Remember the times you had when you were friends; relive those. Go out and do something fun for a change, instead of the usual dates.

I believe that life is all about balance; excess or deficiency of anything creates imbalance and thus leads to further problems. When you decide to share your life with someone, there will be times you have to step out of your comfort zone. And that's completely normal. But knowing how to say no and maintaining your personal boundaries are equally important.

Quick Tip – Please respect their boundaries. If you are cool doing something, it doesn't mean they would feel the same way.

Everyone has a past, and before we move on to something new, it's needed that all of our past issues are laid on the table. Little by little, let them know what went wrong with your previous partners and, step by step, give them an insight. What you shouldn't do is bombard them with questions and ask for clarifications for actions which don't involve you.

Quick Tip – Stop judging. Comfort them and make them feel positive. Take mental notes of the things that bug them and do not make the same mistakes their exes did.

A typical human behaviour is to run after things in order to achieve them, and once you've got the fish in your net, to start neglecting it. A person changes and evolves every single day. Hence, there will always be new things to learn about them. Always make it a point to let them know that they're special. Little things have quite an impact.

Quick Tip – Prove yourself every single day. Win them over every single day. May the chase never end.

All of the above-mentioned points have been tried and tested over the years and are still a work in progress. Our relationships form an integral part of our life and success; any problem, no matter how minute it may be, creates a ruckus in our daily routine. I hope the above points help you to have and maintain a healthy relationship!
{ "redpajama_set_name": "RedPajamaC4" }
the poem and even you. In her second collection, even the overtly science-themed poems – 'Einstein's Overcoat', 'Co-evolution', 'Sky Given' and 'Evolution by Engulfment' – serve to remind us that the results of scientific experiment and invention, though integral to our daily existence, have been absorbed 'like English engulfing foreign words.' By re-introducing specialist language to quotidian enquiry, Jenkins expands the poetic vocabulary with which she is able to illuminate previously neglected concepts. Unfamiliar phonemes and unusual rhythms exercise the reader's imagination and provoke speculation: 'we too shape shift by phagocytosis/ imbibing and embedding/ a welterweight of virus […] Why, then I ponder, when plants/ got chloroplasts, did we stop/ at mitochondria?'

I indulge the idea that partial comprehension leads to unexpected reverberations: I am reminded of childhood museum visits, and old-fashioned natural history displays with Latin names inscribed on yellowed cards; a time when everything, from the microscopic to the astronomic, excited my curiosity, before my attention was diverted by the myriad demands of adulthood. My point is that rolling these unfamiliar words around on the tongue for the way they sound and taste offsets the minor inconvenience of opening a dictionary app.

In the world of algebra, geometry and statistics, all is not light and joy. Jenkins deploys mathematical metaphors in poems like 'Exit Speed', 'Set Pieces' and 'Unit of Measurement' to represent disintegrating relationships, damage and survival, and sets up dichotomies to expose the subtle differences between expectation and reality. 'Zero –vs– Nothing' attempts to balance the positive and negative aspects of the man in the equation. Zero is space, or the spaciousness before the man enters the picture. Nothing is the void remaining after he has left, 'a cold space that nothing can illuminate'.
'Climate is what you expect; weather is what you get' charts the challenges and changing meteorology of a marriage: he knows only sunshine; she has raincoat years; arguments are 'a squall of words.' By the time we arrive at 'orographic ascent' it is impossible to decide whether '[a] marriage is what you expect,/ a husband is what you get' is a lament or a relief.

until the state of us curdled.

In 'Surrender comes with twenty different speeds' the poet acknowledges that winning an argument 'is merely another form of losing.' Concessions have to be made and compromises arrived at. Some relationships survive the demands of domesticity; others do not. In hindsight, we are left wondering if reality was not so bad.

you fly on into all atmospheres.
{ "redpajama_set_name": "RedPajamaC4" }
PCG is proud to launch our new website. We hope that this website will allow everyone to access information and interact with our programs more easily. We look forward to updating our pages and posting blogs daily, so check back often. Also, be sure to like our Facebook page and follow us on Twitter.
{ "redpajama_set_name": "RedPajamaC4" }
How Insane Is Mark Mardell?

By David Preiser (USA) | September 14, 2011 - 2:57 am | October 12, 2012

This insane:

    The president had waved a copy of his hefty American Jobs Act and told them the USA had to catch up with the likes of China and North Korea in spending on high-speed rail and education.

The President of the most successful, most prosperous country in the history of the world says we need to catch up with North Korea, and the BBC US President editor is fine with it. Doesn't bat an eyelash.

Okay, I admit I'm being mean. Actually, this is a mad typo. There will probably be a stealth edit tomorrow once somebody points it out to him, so I've taken a screenshot. If/when this gets fixed, I'll post it. Be honest: you thought for a moment that the President actually did say that, right?

The President actually said we should emulate South Korea by adding more teachers. Yay, more government spending. I guess it's difficult to churn this stuff out, especially when one has to go out amongst the great unwashed in flyover country in search of the elusive Obamessiah supporters, so I'll be charitable here and shrug off this nutty typo.

Anyways, there's something about China's high-speed rail that he doesn't want you to know about. China's high-speed rail project has so far killed 11 people, and injured a further 89 people. And it's losing money hand over fist. And Mardell thinks it's a good idea to emulate? What is it with Beeboids covering the US and China's autocratic ways?

Mardell sees nothing fishy – or curiously neglects to point it out – in the President's seeking out the most sympathetic white audience He ever had, back before the 2008 election, when He was still the world's sweetheart: students.
The president was talking at Fort Hayes art and design college and one pupil, 18-year-old Mel Dodge, told reporters he was an aspiring lyricist and admires Mr Obama's skills. "He chooses his words so beautifully," said the teenager. "That's why I came out here today, just to hear that in person." That's just the kind of spiritual boost Mardell needed. Actually, it's a high school in US parlance. Which means some of them will be voting next year – like Mel Dodge – and the rest are potential noisemakers on His behalf. I know this is just a language barrier thing and not an attempt to mislead. But there's something else fishy here. As a high school it's part of the state/city-funded school system in Ohio. His Jobs Plan For Us has a couple lines sending over $350 million specifically to Ohio's largest school districts (not colleges). Including the Columbus area, which covers this school. Totally targeted pandering. Oops, Mardell forgot to mention that bit. There's something else about Ohio he doesn't want you to know: Ohio is just about the most key swing state there is when it comes to national elections. The President has spent half of His domestic traveling while in office to swing states. He's visited Ohio fourteen times. Mardell didn't want to inform you of that lest you start thinking too much about how this was an election campaign stop, geared in part towards unions. But now there must be a semblance of non-partisanship: He wasn't so certain about the politics, unsure that the president's jobs plan would work. He wanted to look at the Republican field as well before he decides how to cast his first vote. Yeah, sure. Mardell also has a bridge downtown to sell you. His blogpost is just past its midway point, so it's time for a party political advertisement. If Mr Dodge is not convinced, it won't be through lack of White House effort. Senior advisor David Plouffe is just the latest to offer to answer questions by tweet. 
This advertisement was brought to you by the Campaign to Re-Elect the President. And your license fee. I've just got a detailed White House email on the beneficial impact of the act on Montana. Why Montana, I know not, but I am sure 49 similar missives will soon be in my inbox. Yes, Mark, we know you're well-connected and on the Democrat mailing list. You don't need to remind us. Oh, my apologies, I'm being rude. The campaign ad is still going. The president will, I guess, be on the road until this is done or dead. "I guess." That's very silly, and pretty disingenuous. We all know that's what's going to happen, because Jay Carney already told everyone last week that the President will "travel all over the country; we'll be to a lot of different places." As if he doesn't know. Hell, the President's travel schedule is given out to the press, and it shows that He's going all over the damn place now, mostly to those magical swing states. Just how stupid does Mardell think his audience is? And it's not partisan at all, no sir. No way you're going to hear from the US President editor that the only job the President is concerned about creating is His own second term. That'd be a bit too much analysis. Instead, Mardell gives us one of the more obvious signs of his personal political bias: He is portraying what is a series of pretty partisan, controversial proposals as plain common sense, that no-one of goodwill could resist. No one of good will, eh? That's a purely emotional phrase. Mardell is clearly giving an ideological position, supporting the President's message. Anyone who doesn't agree with throwing another half trillion dollars down the rabbit hole would resist, based on an entirely different definition of goodwill, but he doesn't see it that way, and tells you so. So now it's time to provide "balance". In fact, there is intense debate about his ideas. There you go. One sentence, with a once-in-a-blue-moon link to a known right-wing pundit, Cal Thomas. 
If this was from elsewhere, I'd say that might remotely be enough to balance out Mardell's preceding statement that this "debate" obviously means that there are some who are resisting, and therefore have no good will. But as this is a Mardell post, there's more coming to support the President. He got backing on Tuesday for more spending from the Congressional Budget Office's director, who warned cuts could damage recovery. He got backing, sort of. But the CBO's "backing" isn't quite how Mardell presents it. In fact, the CBO boss says that "changes in taxes and spending that would widen the deficit now but narrow it later in the decade." Which is pretty much exactly what the SuperCommittee is going to do. Just like in Britain (not including union bosses and UK-Uncut and their fellow travelers at the BBC), nobody really thinks severe cuts are happening this instant or tomorrow. For the US, it's all about 2013 and beyond, and remember, only in a best-case scenario will there be barely $1.5 trillion cut over the next few years, which is a fraction of the actual deficit we need to cut. That's why nobody on the fiscally responsible side was really happy about the debt agreement. Hello?!!? Short-term memory, anyone? The CBO isn't backing the President in the way that Mardell insinuates, nor are they really repudiating the Tea Party idea and instead siding with Ed Balls and Stephanie Flanders. In fact, what the CBO really says is this (does Mardell think nobody clicks through his links, or does he actually not understand what the following bit means?): "Attaining a sustainable budget for the federal government will require the United States to deviate from the policies of the past 40 years in at least one of the following ways," he said. 
"Raise federal revenues significantly above their average share of GDP; make major changes to the sorts of benefits provided for Americans when they become older; or substantially reduce the role of the rest of the federal government relative to the size of the economy." Raising revenues doesn't necessarily mean draconian taxes only. Growth in industry and consumption raises revenues as well, since that's all taxed to the eyeballs. So "raising revenues" means a lot more than just taxing the rich even more. And what's that about changing benefits for seniors? Oh dear, oh, dear. Sounds like austerity and cuts affecting the most vulnerable to me, and as Social Security and Medicaid are going to be just about the biggest government expense in the near and long-term future (aging Baby Boomers joining the rolls, longer-lived population in general), it's pretty major. Again, this is way more in line with Tea Party ideals than with Krugman/Balls/Flanders. And that last sentence about reducing the role of the federal government speaks for itself, you betcha. Remember: the CBO boss said "at least" one of these three methods. It's pretty dishonest to put it as just cuts are bad, m'kay, full stop. So is that what Mardell thinks supports the President? One could just as easily say that the CBO statement is more about what the SuperCommittee is going to do than about whether or not we should add another half-trillion to the deficit. Oh, hang on: the CBO boss actually was saying this to their faces. The link Mardell provides is about the CBO boss's first appearance in front of their first official hearing. Nothing to do with supporting yet another Stimulus package at all! Wow. Let's just shake our heads and move on. 
But the president's plan is ideologically objectionable to most Republicans, even more so now that he has revealed how it would be paid for: by taxing what they would describe as "wealth creators" and what Obama would call the rich, oil companies and corporate-jet owners. It's ideologically objectionable to Republicans, but not to anyone of "good will". How biased can you get? Actually, the President has already caved on the corporate jet issue (perhaps Oprah had a word in his saucer-like?), but never mind. Mardell must have missed that memo. I think a less ideologically blinkered person would mention small businesses as well, as they provide the vast majority of jobs in the country. This is bound to get messy. The White House has confirmed that they will accept parts of the bill being passed. What's that last bit supposed to mean? I thought compromise and bi-partisanship were supposed to be the new American dream? Why is he warning about compromise and bi-partisanship? Weird. Unless one would be unhappy with the President giving in one iota to the nasty Republicans. The danger for Obama is a loss of his simple message. What simple message? Where did we see a simple message? Mardell sounds like he drank the Kool-Aid here. He could get drawn into the wrangling and the less attractive aspects of compromise. He needs all the clarity his lyricism and beautiful words can conjure. So compromise is bad now. Curiously, only a few weeks ago it was the one thing that would have saved us from a credit downgrade. And just the other day Mardell was telling us about how ashamed someone was of Congress for their failure to work together. Now he thinks it's best for the President if He convinces everyone to do things His way, without stooping to compromise. You know, that's just what the other concerned voter in that post said. Funny, that. It's almost as if there's an agenda here.
By the way, that post of his saw Mardell visiting Democrats in Indiana, and he kind of forgot to mention that it's another important election swing state. In the end – wacky typos aside – this is all typical biased reporting. And sloppy and dishonest at that, one of Mardell's worst. What's the emoticon for putting one's head in one's hands and weeping quietly? 29 Responses to How Insane Is Mark Mardell? My Site (click to edit) says: 'There will probably be a stealth edit tomorrow once somebody points it out to him' Possibly sooner than you think…. Not FoundError 404 Sorry, link fixed now. Too blinded by tears of laughter. Interesting that it still seems to be there, after a bunch of advisories and time. So much money, so many staff, so little care and attention. Maybe tell SKY and the BBC will copy in an hour? Even the cherry vultures must be asleep on their perches. Quick, on the intranet! Before tub of Mardell is embarrassed further. 29. Derek_Watson "Sarah Palin makes gaffe, saying North Korea is US ally" Now Mark Mardell gets his Koreas mixed up!: "The president…told them the USA had to catch up with the likes of China and North Korea in spending…" Hopefully, Mark will be reading the comments and correct the error, which is just a typo (rather like Sarah Palin's "gaffe".) N. Korea train system may need a bit of support mind. http://news.bbc.co.uk/1/hi/3649655.stm The defense from defenders of the indefensible will be that Palin is a Presidential candidate or whatever and must be held to a far higher standard, while Mardell is merely a reporter and can't be expected to be error-free every time when he churns this stuff out on deadline. Except one of them posts on his own time on a computer and has an editor, while the other speaks ad lib with no script and no warning time to prepare a remark. Bit geeky, but I just had another gander and it is still up. 
So, either the £4Bpa news empire that is the BBC fires and forgets and only reads what it writes (hence, even excluding corrective blog posts not inc.), or tub of Mardell really sees N. Korea as a socio-economic powerhouse to emulate. hippiepooter says: I remember the BBC making a big thing out of Sarah Palin getting her Koreas mixed up a while ago. Happens to BBC Correspondents as well? Who'd'a thunk?! Maybe just a Freudian slip of Mardell's? Wanting to show America up against a Communist State, like they try to do with Cuba over health? Interesting to note that those pointing out the error are being showered with negative ratings. Seems the bubble doesn't like to be confronted with its own delusion. As the BBC steams ahead with its 'newer, faster, better' blog system, I see these ratings being used, especially via 'Editor's Picks', more and more to justify quote choices. Another piece of rigging to the wholesale corruption that is the BBC's claim of speaking for the public. New Headline : "Mark Mardell No Smarter Than Palin – Official". Asuka Langley Soryu says: 'But the president's plan is ideologically objectionable to most Republicans, even more so now that he has revealed how it would be paid for: by taxing what they would describe as "wealth creators" and what Obama would call the rich, oil companies and corporate-jet owners.' Why scare quotes around what the Republicans say but not around what Obama says? Never mind, I'll answer my own question. What the Republicans say is bullshit, and what Obama says is The Truth. Thanks for playing, Mark. pounce_uk says: Just want to say that actually the death toll from the above crash was 40 and that over the past 7 years over 120 people have been killed by the Chinese train system. I'm sure that in some way it will be proved to be our fault. Millie Tant says: Revealing post about His travels / electioneering an' all.
However, on one point, I read it that Mardell's saying that He (Pres) is portraying them as proposals that no one of good will could resist. Exactly. No one of good will can resist the President's Plans; ergo, Republicans who find them "ideologically objectionable" cannot be of good will. And Mardell doesn't say the President's spending plans are ideological. This is among the most blatantly biased things Mardell has ever said. And he'll continue to get away with it because BBC management thinks the same way. wild says: "wealth creators" i.e. the opposite of Mark Mardell D B says: I wonder if Susanna Reid or Robyn Bresnahan will have anything to say about Mardell's South/North Korea slip. Carruthers says: Why do socialists just love trains so much? Is it the top-down planning, requiring teams of bureaucrats and consultants and impact assessments? The fact that people have to obey timetables like so many compliant hive workers? The deeply unionised workforces that inherently go with railways? Simply their hatred of the car, which per miles travelled can really be no more polluting than massive steel railways, diesel engines and empty carriages being heaved to and fro? The destruction and forced purchase of anyone's property that is in the way of new tracks? The necessity of massive taxpayer subsidies for the rest of time, because the real price of fares is completely unaffordable? High speed intercity rail is considered by the bien-pensants of so many countries to be some sort of ingenious route to the future and prosperity, but in truth it's 19th century technology really isn't it? Routes and timetables are completely inflexible. It's overwhelmingly used by the middle class going to meetings in city centres near the terminus. Working class people tend to work locally to their homes, or need to use the bus/car to get, you know, to the actual place they want to go to.
Actually I like train travel, the problem is ours is so crap, partly because of the Unions and partly because the companies who run them are so useless. I prefer to go by train rather than car (or flying) when possible, what we have to do though is sort out the bloody leftists. No error, Mardell does think North Korea is a great place to live. ONLY state TV as well. Heaven for a beeboid. I wonder if Mark Mardell in wide-eyed wonder at the big O will talk soon about Solyndra. But sometimes it is what is NOT said that defines a reporter's position on issues. Briefly, Solyndra – a "solar-panel manufacturer" – was granted a pile of money from the White House and became a poster child for O's stimulus package. Surprise, surprise! Solyndra is bankrupt and subject of an FBI investigation. Where did $535m go? Now e-mails have turned up showing the White House pressured the Office of Management and Budget to agree to hand Solyndra stimulus money. The Washington Post (no conservative newspaper) concludes: "The e-mail exchanges could intensify questions about whether the administration was playing favorites and made costly errors while choosing the first recipient of a loan guarantee under its stimulus program. Solyndra's biggest investors were funds operated on behalf of the family foundation of Tulsa billionaire and Obama fundraiser George Kaiser. Although he has been a frequent White House visitor, Kaiser has said he did not use political influence to win approval of the loan". http://www.washingtonpost.com/politics/white-house-pushed-500-million-loan-to-solar-company-now-under-investigation/2011/09/13/gIQAr3WbQK_story.html I will continue to watch for a report on the subject from the BBC. So far there are 33 words on this, in an opinion piece from Matthew Sinclair of the Taxpayer's Alliance. http://www.bbc.co.uk/news/uk-politics-14886658 Note to BBC newsroom. This is an important story. Follow it.
The BBC censored the name Kaiser and the fact that he's an Obamessiah bundler out of the one story they ran about this. No reason to expect them to be more honest about it in future. cjhartnett says: One of our number posted James Delingpole yesterday-and HE in turn gave us a link to Sarah Palins speech In Iowa recently( Hillbuzz I think!). I am looking no further behind Obama, Mardell or any of them-I have not seen as clear an expose of the crony corporate capitlaist class in all my years. That speech needs addressing bt eh political class-and the more they try to besmirch or bug Sarah, the more I know she`ll be our girl very soon. If Chomsky had written such a critique he`d be the intellectual that the left says he is…but he coudn`t! Palin is brilliant…"hopey changey,, abd two great barbs at Obame when he was only a candidate( one referred to him writing his own autobiography before he`d even written a policy!) Keep up the posting David…but if Sarahs last speech in Iowa doesn`t yet make giant waves both sides of the pong , then we will only have ourselves to blame…she`s a wonder! God Bless the USA…and the mighty Sarah Palin! Both sides of the pond!… by the political class! Typos R Us! I think in the case of the BBC, it's both sides of the pong. George R says: "Sarah Palin totally gets it " (James Delingpole) http://blogs.telegraph.co.uk/news/jamesdelingpole/100104496/sarah-palin-totally-gets-it/ Margo Ryor says: The interesting thing about all this is as far as I can recall the President is NOT constitutionally empowered to introduce legislation. Granted he has every right to SUPPORT legislation but he isn't supposed to write it – is he? That's what they do in China and North Korea, I'm amazed Mardell hasn't got a "Obama, President for life" t-shirt on. Yes, a President can write legislation. They just can't actually introduce it, and have to get a Congressman to start that in motion.
Behold some of those people without good will who resist the President's Jobs Plan For Us: Hill Dems pick apart Obama jobs plan Wait….Dems? If Republicans are "ideologically opposed", what about these guys? Are they racists? This is also from Politico, so Mardell sure as hell knows about it. Yet the spin continues.
All US politicians should be pro-sex education in post-Roe world By Teresa Blackmon On Jul 2, 2022 I'm not here to take a stance on abortion. Debates are raging on both sides, but it doesn't change the fact that our Supreme Court struck down Roe v. Wade, and now millions of women in more than a dozen states who previously had access to abortion soon no longer will. Now that we've gotten here, the question is: how can we minimize the number of women who experience unwanted pregnancy in the first place — particularly in states where access to abortion is now restricted? It's a utilitarian question of harm reduction, and the glaring answer is sexual education. Although concrete statistics are difficult to come by, it is estimated that about half of women who get abortions are using no form of birth control whatsoever at the time of conception, and a further 41 percent are doing so inconsistently. This status quo — where birth control usage is lax and one in five pregnancies is terminated — needs to change. It's also an indictment of our education system. According to the Centers for Disease Control and Prevention, less than half of high schools and less than a fifth of middle schools cover the 20 recommended components of sexual education. And, according to surveys of students, the situation is only getting worse. The share of girls who say they received information about where to get birth control before they had sex for the first time has fallen from 87 percent to 64 percent since 1995. It shouldn't come as a surprise, then, that America experiences higher rates of teen pregnancy than most other developed nations. Comprehensive sexual education is associated with a lower risk of teen pregnancy — and lifelong empowerment to take control of one's reproductive destiny.
Much of this comes down to state and local legislation. Today, only 30 states and the District of Columbia mandate sexual education, just 22 require that instruction be medically and factually accurate, and a mere 19 require that methods of contraception be covered at all. Ironically, many of the states now restricting access to abortion either don't require sexual education or stress abstinence in their instruction. That must change. More than half of American teens will engage in intercourse by the time they graduate. Human beings have sex, and pretending otherwise is foolish. Teaching them to do so responsibly is the only realistic path forward. Every single US lawmaker — and particularly every pro-life lawmaker — should be proposing bills to shore up sexual education in schools. Empowering the post-Roe generation to engage in responsible sex is the least we can do, and legislators would find it's a vote winner, too. A whopping 89 percent of likely voters say sex education is important in middle school, and an overwhelming 98 percent think it's important in high school. Now that abortion is no longer legal in many states, we should encourage teenagers to avoid getting pregnant — by teaching them about contraception via sexual education in school. Meanwhile, American parents should also be demanding better reproductive education for their kids. In many areas, sexual education policy falls to local school boards. Just as hordes of concerned parents recently protested the politicization of their kids' curricula at school board meetings, they should call for comprehensive sexual education from their local officials now, too. The bottom line: sex education is more important than ever in a post-Roe world.
We must empower the next generation of women with the knowledge to help them take charge of their reproductive freedom. Rikki Schlott is a 22-year-old student, journalist and activist.
John Rollo M.D. (d. December 23, 1809) was a Scottish military surgeon, now known for his work on a diabetic diet. Rollo was the first to suggest a low-carbohydrate diet as a treatment for diabetes.

Life

He was born in Scotland, and received his medical education at Edinburgh. He became a surgeon in the Royal Artillery in 1776, and then served in the West Indies. In 1778 the University of St Andrews made him M.D. He was stationed in St. Lucia in 1778–9 and in Barbados in 1781. His associates included Colin Chisholm on Grenada. Rollo became surgeon-general of the Royal Artillery in 1794, and returned to the Royal Military Academy, Woolwich. There he oversaw the construction of the enlarged Royal Artillery Hospital: the Royal Ordnance Hospital dated from about 1780, and the enlargement was completed in 1806 (the building later became the Connaught Barracks). From 1804 he was inspector of hospitals for the Ordnance. Rollo was frequently consulted about cases of diabetes, and in treatment had some success with the use of a nitrogenous diet. He died at Woolwich on 23 December 1809, and was buried at Plumstead in Greenwich.

Diabetes

In 1797 Rollo printed at Deptford Notes of a Diabetic Case, which described the improvement of an officer with diabetes who was placed on a meat diet. He was the first to take Matthew Dobson's discovery of glycosuria in diabetes mellitus and apply it to managing metabolism. By means of Dobson's testing procedure (for glucose in the urine) Rollo worked out a diet that had success for what is now called type 2 diabetes. The addition of the term "mellitus", distinguishing the condition from diabetes insipidus, has been attributed to Rollo. Rollo's diet for diabetic patients consisted of "milk, lime water, bread and butter, blood pudding, meat, and rancid fat". He has been described as "the first one to recommend a diet low in carbohydrates as a treatment for diabetes."
Rollo collaborated with William Cruickshank, who was the chemistry assistant at Woolwich. In another edition of the work, An Account of Two Cases of the Diabetes Mellitus, published in 1798, other cases were added, and some of Cruickshank's research on urine and sugar in diabetics was included. A further edition appeared in 1806. John Latham supported Rollo's views on the treatment. In 1824 the Encyclopædia Britannica in its article "Dietetics" commented that the diet was successful in repressing the condition of the patients' urine, but that the patients often found the high fat content intolerable. This kind of dietary management continued to the 1920s, being more successful for adults, who might survive some years, than for young patients who typically had only some months of life on it. Other collaborations of Rollo and Cruickshank related to treatments for syphilis involving acids, which were published with the work on diabetes; proteinuria; and strontium.

Other work

Rollo published Observations on the Diseases in the Army on St. Lucia, in 1781; and in 1785 Remarks on the Disease lately described by Dr. Hendy, on a form of elephantiasis known as "Barbados leg". In 1786 he published Observations on the Acute Dysentery. Rollo published in 1801 a Short Account of the Royal Artillery Hospital at Woolwich. He had kept a record of his cases in Barbados, and the Account included a similar table for the Ordnance hospital. In 1804 a Medical Report on Cases of Inoculation supported the views of Edward Jenner.

Further reading

Alexander Marble (1989). John Rollo. In: von Engelhardt D. (ed.) Diabetes: Its Medical and Cultural History. Springer. pp. 229–234.
EPUB stands for electronic publication. It's an e-book file format, developed by the International Digital Publishing Forum (IDPF) in 2007. The EPUB format allows editors to create and distribute a digital publication as a single file. Users can download and read EPUB files on different devices such as tablets, computers, e-readers or smartphones.
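One detail worth adding about that "single file": an EPUB is an ordinary ZIP archive whose first entry must be an uncompressed file named mimetype containing the string application/epub+zip. The sketch below illustrates this in Python; the file name sample.epub and the stub container.xml are illustrative placeholders, not a complete, standards-conforming book.

```python
import zipfile

# An EPUB is a ZIP container. Its first entry must be a file called
# "mimetype", stored without compression, holding "application/epub+zip".
# "sample.epub" and the stub container.xml below are placeholders only.
def make_minimal_epub(path: str) -> None:
    with zipfile.ZipFile(path, "w") as z:
        # Write the mimetype entry first, uncompressed (ZIP_STORED).
        z.writestr(zipfile.ZipInfo("mimetype"), "application/epub+zip",
                   compress_type=zipfile.ZIP_STORED)
        # Real EPUBs also need META-INF/container.xml pointing at the
        # package document; a stub stands in for it here.
        z.writestr("META-INF/container.xml", "<container/>")

def read_epub_mimetype(path: str) -> str:
    """Return the media type declared inside an EPUB-style container."""
    with zipfile.ZipFile(path) as z:
        return z.read("mimetype").decode("ascii")

if __name__ == "__main__":
    make_minimal_epub("sample.epub")
    print(read_epub_mimetype("sample.epub"))  # application/epub+zip
```

Reader software often identifies an EPUB by inspecting exactly these first bytes of the archive, which is why the format requires the mimetype entry to come first and remain uncompressed.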
Cooke & Bieler Lp bought 37,840 shares as the company's stock rose 14.82% with the market. The hedge fund run by Dan Loeb held 5.25M shares of the public utilities company at the end of 2017Q2, valued at $318.26M, up from 5.00 million at the end of the previous reported quarter. Cooke & Bieler Lp who had been investing in Cohu for a number of months, seems to be bullish on the $622.90M market cap company. The stock increased 0.12% or $0.14 during the last trading session, reaching $116.02. Cohu, Inc. (NASDAQ:COHU) has risen 64.54% since December 11, 2016 and is uptrending. It has outperformed by 18.09% the S&P500. T-Mobile US had 69 analyst reports since August 3, 2015 according to SRatingsIntel. (NASDAQ:TMUS). Robeco Institutional Asset Mngmt Bv has invested 0.22% in T-Mobile US, Inc. The stock increased 1.00% or $0.62 during the last trading session, reaching $62.54. About 30,565 shares traded. It has outperformed by 45.93% the S&P500. Calix Inc (NYSE:CALX) was raised too. Friess Assoc Ltd Com holds 1.27% or 230,145 shares. Investors sentiment increased to 2.2 in 2017 Q2. Its up 0.14, from 1.22 in 2017Q1. Advent Cap Mngmt De invested in 12,000 shares. 32 funds opened positions while 56 raised stakes. Cwm Ltd Llc owns 59 shares for 0% of their portfolio. Marble Harbor Investment Counsel Llc owns 112,700 shares or 2.08% of their United States portfolio. Nationwide Fund Advsr reported 0.07% stake. State Board Of Administration Of Florida Retirement System holds 0% in Cohu, Inc. (NASDAQ:TMUS) or 40,494 shares. William Blair maintained T-Mobile US, Inc. Boston Private Wealth Lc holds 12,535 shares or 0.03% of its portfolio. In that case, its shares would mark a 0.9% decline from the most recent price. Numina Mgmt Ltd Liability Corporation holds 1.08% or 108,661 shares in its portfolio. Connor Clark And Lunn Investment Limited invested in 11,400 shares or 0% of the stock. Cincinnati Indemnity Co holds 19.9% of its portfolio in Automatic Data Processing, Inc. 
for 48,100 shares. Eleven equities research analysts have rated the stock with a hold recommendation and eighteen have given a buy recommendation to the company. The lowest target is $11.0 while the high is $115.0. Moving average is significant analytical tool used to discover current price trends and the possibility for a change in an established trend. The stock of Bojangles', Inc. (NASDAQ:TMUS) has "Buy" rating given on Monday, October 23 by Cowen & Co. FBR Capital maintained the shares of TMUS in report on Wednesday, January 11 with "Outperform" rating. DA Davidson has "Buy" rating and $50 target. Needham maintained Cohu, Inc. (NASDAQ:TMUS) earned "Buy" rating by Oppenheimer on Thursday, August 10. The company has market cap of $1.29 billion. It is negative, as 58 investors sold TMUS shares while 140 reduced holdings. 278.36 million shares or 0.37% less from 279.41 million shares in 2017Q1 were reported. Royal Bank of Canada reissued a "buy" rating and issued a $76.00 price target on shares of T-Mobile US in a report on Friday, June 2nd. T-Mobile US, Inc. (NASDAQ:TMUS) trades at $62.54 having a market capitalization of $52.68 billion. (NASDAQ:TMUS) showed a growth of 31.17 Percent per annum. Fdx Incorporated holds 0.03% or 16,865 shares in its portfolio. Cantab Ptnrs Limited Liability Partnership stated it has 0% of its portfolio in T-Mobile US, Inc. T-Mobile US (NASDAQ:TMUS) last issued its earnings results on Monday, October 23rd. (NASDAQ:TMUS). Dana Investment Advsrs reported 296,659 shares. The firm owned $28.55 million shares of the company's stock after selling 1.26 million shares during the period. T-Mobile US had a return on equity of 9.99% and a net margin of 5.55%. (NASDAQ:TMUS) or 1,869 shares. Pinnacle Associate Limited reported 155,911 shares stake. Analysts await T-Mobile US, Inc. Moreover, Tekla Capital Management Llc has 3.81% invested in the company for 798,053 shares. 
It also upped Regeneron Pharmaceuticals (NASDAQ:REGN) stake by 3,267 shares and now owns 6,467 shares. Among holders that decreased their positions, 66 sold out of the stock T-Mobile US, Inc. Therefore 77% are positive. As per Monday, October 2, the company rating was maintained by RBC Capital Markets. As per Tuesday, October 10, the company rating was maintained by Deutsche Bank. Cowen reaffirmed a "buy" rating and set a $70.00 price objective on shares of T-Mobile US in a research report on Friday, August 25th. Three months ago, analysts assigned TMUS a 4.47 rating, which implies that analysts have become more optimistic about the outlook for the stock over the next year. The firm has "Buy" rating by Goldman Sachs given on Tuesday, May 23. On Friday, June 24 the stock rating was downgraded by Nomura to "Neutral". The firm earned "Buy" rating on Tuesday, September 5 by Jefferies. It also reduced its holding in Constellation Brands Inc (NYSE:STZ) by 500,000 shares in the quarter, leaving it with 2.50 million shares, and cut its stake in Charter Communications Inc N. Jada Pinkett Smith Calls Out the Golden Globes for Overlooking "Girls Trip"
Art Therapy Coloring Page Music Treble Clef 6 images that were posted on this website were uploaded by Stocktwitsfx.com. Art Therapy Coloring Page Music Treble Clef 6 is equipped with a HD resolution of x . You can save Art Therapy Coloring Page Music Treble Clef 6 for free to your devices. If you want to save Art Therapy Coloring Page Music Treble Clef 6 with its original size, you can click the Download link.