| text (string, 38–5.1k chars) | instruction (1 distinct value) | input (1 distinct value) | output (2 distinct values) |
---|---|---|---|
Dust evolution starts at the earliest stages of star formation, from the formation of cores that slowly contract into pre-stellar cores (PSCs) to the collapse of these PSCs into protostars and protoplanetary disks (PPDs). The contraction time of PSCs is still debated and ranges from less than 1 Myr [1,2] to a 10 Myr lifetime [3]. PSCs are dense (>10⁵ H cm⁻³), compact (<10⁴ AU) and cold (5-12 K) objects, which makes their study difficult. They are very opaque, and most gaseous species are depleted onto grains, preventing study of the inner parts. Dust is the only tracer present from the cloud edge to the densest part, allowing the cloud density structure to be characterized. Nevertheless, dust itself is a poor tracer in the visible and near-infrared (NIR), since its absorption is too high to detect the reddening of stars above A_V ∼ 50 mag. Recent observations of PSCs with Spitzer [4][5][6] and of PPDs with SPHERE [7] in scattered light have brought hope of placing more constraints on dust properties. Indeed, all parameters involved in dust emission (dust temperature, density, grain size, emissivity and spectral index) vary across the cloud, leading to degenerate solutions when dust emission is used alone [8]. Studies combining dust scattering and emission have successfully reproduced PPD observations, constraining the geometry and density structure of PPDs [e.g. 9]. On the other hand, a consistent multi-wavelength modeling of PSCs remains an unachieved goal. Grain emissivity in the far-infrared (FIR) has been linked consistently to the absorption efficiency in the near-infrared (NIR) only at short wavelengths [10] or at the low resolution of Planck **data** [11], and thus has never reproduced observations of the densest parts. With our on-going NIKA2 open-time program, we aim at building a consistent multi-wavelength picture of two neighboring molecular clouds, L183 and L134 (at 116±6 pc [C. Zucker priv. comm.] and 107±5 pc [12], respectively), hosting 4 PSCs. | Is data availability statement | no |
|
For semi-quantitative polymerase chain reaction (PCR), half a microliter of the first-strand cDNA products was used for each gene amplification. PCR conditions were as follows: denaturation at 94°C for 30 s, annealing at 60°C for 30 s, and an extension step at 72°C for 1 min. Linearity between the amount of input RNA and the final PCR products was verified. The transcriptional level of β-actin was used as an internal control. Specific primers (Supplementary Table S1) were designed according to gene sequences of MALDI-TOF-MS-identified proteins and the local transcriptomic database of sweet potato. The real-time PCR was performed on the IQ5 real-time PCR system (BIO-RAD, California, USA) in a total volume of 20 µL containing 100 ng of cDNA template, 1× SYBR® Premix Ex Taq™ II (Perfect Real Time, TaKaRa), and a 400 nM concentration of each primer. Serial dilutions of each cDNA were used to generate a quantitative PCR standard curve to calculate the corresponding PCR efficiencies. After initial denaturation at 95°C for 30 s, the amplification was carried out through 40 cycles, each consisting of denaturation at 95°C for 5 s, primer annealing at 60°C for 30 s, and DNA extension at 72°C for 30 s. Melting curves were obtained, and quantitative analysis of the **data** was performed using the BIO-RAD IQ5 standard edition optical system software (version 2.1) in a normalized expression (ΔΔCt) model. Transcriptional levels of Ib-RCAs and Ib-RCAl were normalized to the β-actin gene. Specific primers of Ib-RCAs, Ib-RCAl and β-actin for real-time PCR are listed in Supplementary Table S1. Results were obtained from three biological replicates. | Is data availability statement | no |
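The normalized-expression (ΔΔCt) calculation referenced above can be sketched in a few lines of Python. The Ct values, sample names, and the assumption of ~100% amplification efficiency below are illustrative placeholders, not numbers from the study.

```python
# Minimal sketch of the normalized-expression (delta-delta-Ct) model.
# All Ct values and sample names are hypothetical placeholders, and a
# ~100% PCR efficiency (amplification factor of 2) is assumed.

def ddct_relative_expression(ct_target_sample, ct_ref_sample,
                             ct_target_control, ct_ref_control):
    """Fold change of a target gene (e.g. Ib-RCAs) relative to beta-actin,
    comparing a treated sample against a control/calibrator sample."""
    delta_ct_sample = ct_target_sample - ct_ref_sample      # normalize to beta-actin
    delta_ct_control = ct_target_control - ct_ref_control
    delta_delta_ct = delta_ct_sample - delta_ct_control     # compare to calibrator
    return 2 ** (-delta_delta_ct)

# Hypothetical Ct values from one biological replicate
fold_change = ddct_relative_expression(ct_target_sample=24.1, ct_ref_sample=18.3,
                                       ct_target_control=26.0, ct_ref_control=18.5)
print(f"relative expression: {fold_change:.2f}-fold")
```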
|
As we want to calculate the correlation between clinical and wastewater data over a period where the two are expected to be similar, and thus where the WWI is expected to capture mainly people who are also likely to be diagnosed, we decided to focus on the period corresponding to the second and third waves of the epidemic in France. To avoid being biased by the movements of individuals during the 2020 summer vacations, we take the start date of the second wave as September 1, 2020, by which the majority of holidaymakers had returned to their city of residence. We take the last point of the interval of interest to be the date from which the signal undergoes a new growth phase following the decay of the second peak of the epidemic. | Is data availability statement | no |
|
Our model included rainfall **data** recorded by the Department of Hydrology and Meteorology (DHM) for Bharatpur city at the Bharatpur station (from 2000 to 2016; Table 3), water-level **data** for the Narayani River at Devghat station, and for the Rapti River at Rajayia station. The bankfull levels of the Narayani and Rapti rivers were considered as the design water level for Bharatpur. The surveyed cross-sections of the Narayani and Rapti were interpolated to generate bankfull water level **data** at the outfalls of the Pungi and Kerunga canals. For Sylhet city, the research team had access to water and rainfall data. Therefore, a once in 50 year return period was considered as the design flood level for the Surma River, for the design of the regulator and pumps (see section 2.3.3). | Is data availability statement | no |
|
if |scope(child1) ∪ scope(child2)| ≥ nVars then
  createMixture(root, child1, child2)
else
  createMultivariateGaussian(root, child1, child2)
end if
end if
for each child of root do
  oSLRAU(child, data)
end for
else if isSum(root) then
  for each child of root do
    subset ← { x ∈ data | likelihood(child, x) ≥ likelihood(child′, x) ∀ child′ of root }
    oSLRAU(child, subset)
    w_root,child ← (n_child + 1) / (n_root + #children)
  end for
else if isLeaf(root) then
  update mean µ(root) based on Eq. 3
  update covariance matrix Σ(root) based on Eq. 4
end if
Figure 3 shows the data points along the first two dimensions and the Gaussian components learned. We can see that the algorithm generates new components to model the correlation between x1 and x2 as it processes more data. | Is data availability statement | no |
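The sum-node step of the pseudocode above (hard routing of each point to the most likely child, followed by the add-one-smoothed weight update w_root,child ← (n_child + 1)/(n_root + #children)) can be sketched as below. The child objects with a likelihood() method are hypothetical stand-ins, and the leaf updates of Eqs. 3-4 are not reproduced here.

```python
# Sketch of the oSLRAU sum-node update described above: route each point to the
# most likely child, then set w(child) = (n_child + 1) / (n_root + #children).
# `children` are hypothetical objects exposing a likelihood(x) method; the
# running mean/covariance leaf updates of Eqs. 3-4 are intentionally omitted.

def oslrau_sum_node_step(children, counts, data):
    n_root = sum(counts.values())
    for x in data:
        best = max(children, key=lambda c: c.likelihood(x))  # hard routing by likelihood
        counts[best] += 1
        n_root += 1
    # add-one smoothed weights, as in the pseudocode
    return {c: (counts[c] + 1) / (n_root + len(children)) for c in children}
```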
|
Author contributions: LAC, DAP conceived the study and wrote the manuscript; LAC analyzed the data; NP contributed to the methods used and **data** collection on algae, including cyanobacteria; RR assigned reproductive rates of species. | Is data availability statement | no |
|
Congestion control. TCP has a bounded sending capacity called the transfer window. If we want to send **data** from Point A to Point B, we load **data** into the transfer window and wait for an acknowledgement. Point B sends an acknowledgement signal telling Point A that all those packets have been received. If the transfer succeeds, TCP becomes optimistic in the sense that it widens the transfer window so that it can send more **data** at the same time. If the transfer fails for whatever reason, the transfer window shrinks, which produces slower traffic. TCP makes use of sequence numbering, a congestion window and a retransmission timer to achieve less congestion and a reliable service. The TCP sender assigns a sequence number to every packet sent and expects an acknowledgement before proceeding with further **data** transfer. The congestion window is used to perform congestion control: it keeps track of the number of packets that the sender can have outstanding without being acknowledged by the receiving side. Basically, the congestion window decides whether the TCP sender is allowed to send packets at any particular instant. TCP accomplishes reliable **data** delivery by deploying a retransmission timer mechanism, which detects packet loss and triggers retransmission. If an acknowledgement is not received before the retransmission timer expires, TCP retransmits the packet and triggers congestion control. | Is data availability statement | no |
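The widen-on-success, shrink-on-loss behaviour described above can be illustrated with a toy window-update loop. The slow-start threshold and the halving-on-timeout policy below are illustrative assumptions, not a description of any specific TCP variant.

```python
# Toy sketch of the congestion-window behaviour described above: the window grows
# while acknowledgements arrive in time and shrinks when the retransmission timer
# expires. The constants are illustrative only.

def update_cwnd(cwnd, ack_received, timeout, ssthresh=64):
    if timeout:                    # retransmission timer expired: assume loss
        return max(1, cwnd // 2)   # shrink the window -> slower traffic
    if ack_received:
        if cwnd < ssthresh:
            return cwnd * 2        # "optimistic" exponential growth (slow start)
        return cwnd + 1            # additive increase once past the threshold
    return cwnd

cwnd = 1
for acked in [True, True, True, False, True, True]:
    cwnd = update_cwnd(cwnd, ack_received=acked, timeout=not acked)
    print("cwnd =", cwnd)
```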
|
An alternative strategy is to use an unsupervised approach, which first defines relevant regions across the genome, independently of any phenotype information, and then tests methylation levels in these predefined regions against a phenotype (19,20,26). In this study, we propose a new unsupervised approach for testing differential methylation in regions against a continuous phenotype such as age, tumor size or marker protein concentrations. Note that in the proposed mixed effects model (Supplementary Text, Section 1), the phenotype variable is included as an independent variable and the methylation values as the outcome variable. Therefore, no distributional assumptions (e.g. normal distribution) are made for the continuous phenotypes. Table 1 lists several previously proposed unsupervised methods. A challenge with unsupervised approaches is their lack of specificity. Unlike gene expression data, the regional boundary of DNA methylation is often not well defined. Therefore, currently available approaches that summarize methylation levels in a region using the mean or median methylation level of the CpGs within the region may yield results that vary depending on the boundaries of the region. In addition, when testing associations between the phenotype and the summarized methylation levels in a genomic region, the spatial correlations between CpG sites within the region are ignored. | Is data availability statement | no |
|
We illustrated the performance of our proposed sampling design using **data** from the National Wilms' Tumor Study (NWTS) (D'Angio et al., 1989; Green et al., 1998). The **data** consist of N = 3915 observations. The variables available for all the individuals include histology evaluated by institution (favorable vs. unfavorable (instit)), histology evaluated by central lab (favorable vs. unfavorable (histol)), stage of disease (I-IV (stage)), age at diagnosis (age), diameter of tumor (tumdiam), study (3 vs. 4 (study)) and indicator of relapse (relapse). We assumed central lab histology was only available at phase 2 and was the variable of interest. All the other variables were assumed to be available for the whole cohort. Following (Kulich and Lin, 2004; Breslow et al., 2009b; Rivera and Lumley, 2016), a similar outcome model was fitted as Pr(relapse | histol, age1, age2, stage1, tumdiam) = expit(β0 + β1 histol + β2 age1 + β3 age2 + β4 stage1 + β5 tumdiam + β6 tumdiam × stage1), | Is data availability statement | no |
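A minimal sketch of fitting the stated outcome model with statsmodels is shown below. The synthetic data frame stands in for the NWTS phase-2 sample, and the design-based weighting used in the cited two-phase sampling literature is deliberately omitted.

```python
# Sketch of the outcome model above; the DataFrame `nwts` is a synthetic stand-in
# for the NWTS phase-2 sample, not the study data, and no sampling weights are used.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
nwts = pd.DataFrame({
    "histol": rng.integers(0, 2, n),     # central-lab histology (0 = favorable)
    "age1": rng.uniform(0, 1, n),
    "age2": rng.uniform(1, 10, n),
    "stage1": rng.integers(0, 2, n),
    "tumdiam": rng.uniform(2, 20, n),
})
logits = -2 + 1.5 * nwts["histol"] + 0.05 * nwts["tumdiam"]
nwts["relapse"] = rng.binomial(1, 1 / (1 + np.exp(-logits)))

model = smf.logit(
    "relapse ~ histol + age1 + age2 + stage1 + tumdiam + tumdiam:stage1",
    data=nwts,
).fit()
print(model.params)
```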
|
Our previous research found that while many organisations across Canada and in many other countries, including England, Sweden, Australia and New Zealand, measure PCC using patient experience measures, few organisations use quality indicators to monitor and evaluate PCC. 22 Furthermore, in a 2019 scoping review of the literature, we also found scarce evidence on the implementation of PC-QIs and evaluation of their use, highlighting a significant gap in the literature. 31 However, some studies have explored the factors that influence the implementation and use of quality indicators, although not specifically focused on measuring PCC. Our findings are consistent with previous research on quality indicator implementation in various care settings and studies on the implementation of PREMs and PROMs. 10-12 32-34 Challenges associated with knowledge, skills (need for training), time constraints and motivation around measurement have been widely reported. Important facilitators to support quality indicator implementation that have also been reported include the need for administrative support for clinicians, 10 the importance of electronic **data** systems, 12 32 and alignment with national and regional priorities, 11 the need to integrate measurement within established workflows to minimise patient care disruptions, as well as the uncertainty around the benefits of using patient-reported data. 32 Our findings also suggest inequitable uptake of personcentred QI, where organisations that are least resourced may also be in most need of improved quality of care that is person-centred. This includes organisations in Canada's Northern Territories, which are home to a larger proportion of Indigenous communities, relative to other areas of Canada, as well as rural primary care clinics, where people experience challenges with remote access to services. Rolnitsky et al conducted a 2018 mapping review of the literature to measure the representation of vulnerable populations in QI studies. 35 They found that while one-third of QI research is focused on vulnerable populations, some populations are under-represented (less than 2%), including rural residents, the poor, visible minorities, the terminally ill, adolescents and prisoners. 35 Moreover, in Canada, as well as Australia, New Zealand and the USA, inequities related to the quality of care for Indigenous people are well documented. 36 These gaps that have been identified in research, including this study, suggest a need for increased attention to more equitable implementation of QI, especially focused on promoting PCC. | Is data availability statement | no |
|
One model of the three-dimensional fold of the fibril, consisting of five β-sheets per monomer, has been proposed [13b] based on ss-NMR and cryo-electron microscopic data. However, long-range restraints at the molecular level in support of this model are still missing. Recently, we have reported pulsed EPR distance measurements to determine the intra-molecular distance between the extremal β-strands in αS fibrils. [15] A pair-labeling strategy was introduced to infer the orientation of the standard methanethiosulfonate (MTSL) label with respect to the plane of the β-strand, as the detected spin-bearing nitroxide of MTSL is located about 0.5 nm from the Cα of the amino acid the spin label is attached to. The resulting distance of (4.5 ± 0.5) nm between the extremal strands was consistent with the size of the subfilament measured in AFM and cryo-electron microscopic studies. [12, 13b] Nevertheless, the dipolar oscillations observed in the pulsed EPR data, used to extract distances, were very weak, and experiments at two different EPR frequencies were required to exclude artifacts. Similar observations were reported in other recent pulsed EPR distance measurements on amyloid fibrils of human islet polypeptide and tau protein. [16a,b] The challenge of these experiments is caused by the required labeling procedure combined with the structural instability of these proteins with respect to point mutations. [17] To date, despite the wealth of NMR and other spectroscopic data, no EPR distance measurements on αS fibrils have been reported by other groups. In the present study we illustrate how an optimized sample preparation has provided large signals in EPR distance measurements for several selected mutation sites. The result made it possible to exploit a labeling strategy to measure distances between largely conserved β-strand regions and to obtain vectorial information on the spatial arrangements of the labels within the strands. The distances and two-dimensional coordinates reported here provide new long-range restraints for the structure of αS fibrils. | Is data availability statement | no |
|
The following **data** were taken from April 1995 to December 2015 and used in the analyses: Pollination/crossing number, female parent, ploidy of the female parent, male parent, ploidy of male parent, date of pollination, harvest date, date of seed extraction, total number of seeds, number of good seeds (with black hard integuments), number of bad seeds (with brown soft integuments), date of embryo extraction, number of extracted embryos, number of germinated embryos after two months, contaminations, number of weaned plants, and number of hardened plants ready for planting in the field. The total number of seeds per cultivar, total number of bunches of a cultivar pollinated but without seed, highest seed per pollinated bunch, mean of seeds per bunch and standard errors, and pollination success for triploid and tetraploid cultivars were calculated. Likewise, the total number of seeds per diploid used as male to pollinate cultivars, total number of bunches of cultivars pollinated by a particular diploid male but without seed, highest seed per bunch when pollinated by a particular diploid male, mean of seeds per bunch pollinated by particular diploid male and standard errors, and pollination success for 29 diploid males were calculated. The pollination success throughout the 21 years under study was computed as: | Is data availability statement | no |
|
In contrast, our paper does not need long-term outcome observations in the experimental **data** but only needs them in observational data. Moreover, our paper does not view short-term outcomes as proxies for the long-term outcome, so we avoid these previous surrogate criteria. Instead, we view them as proxies for unmeasured confounders to correct for confounding bias. See also discussions in Section 2.3. | Is data availability statement | no |
|
The main limitation of logistic regression is that it can only be used to predict discrete functions. Thus, the dependent variable of Logistic Regression is restricted to the discrete set. This restriction is prohibitive to predicting continuous **data** (Al Shamali, 2015). | Is data availability statement | no |
|
Lastly, interrupted time series analysis was employed to examine COVID-19's impact on police reactive and proactive activities. In time series data, observations have a natural temporal ordering. Observations close in time may have stronger correlations than observations further apart. Because of this non-independence or autocorrelation, t-tests may not be able to detect the intervention effect adequately. Interrupted time series analysis using ARIMA modeling, which not only features a quasiexperimental design, but also takes into consideration of the autoregressive moving-average process of the data, would produce statistically more rigorous findings. Based on the results of the augmented Dickey-Fuller tests, as well as examinations of both an auto-correlation function (ACF) and partial auto-correlation function (PACF), ARIMA models for different types of policing activities were identified and analyzed. A binary variable (before-after-March 12, 2020) was set up to examine the interrupted effects of the COVID-19. In the case of examining the patrol and DRT self-initiated activities, to control for the step change between the emergence of COVID-19 and the death of George Floyd, the indicator of the George Floyd incident was also included in the models. Fig. 1 presents the time sequence trend graphs for each reactivity and proactivity measure. With respect to reactivity, violent crime calls appeared to increase slightly after COVID-19. For calls for property crime, there appeared to be a reduction coinciding with COVID-19, but then a return to pre-COVID levels. Disorder calls varied greatly pre-COVID, with observable increases and decreases, but appeared to increase after COVID-19. Suspicious incident CFS exhibited a decreased trend that continued after COVID-19, but with periodic fluctuations. Traffic-related CFS notably declined after COVID-19. The visual examination of the plotted time series **data** for serviced-related activity calls suggested a decrease with the emergence of COVID-19. For non-crime events, there appeared to be a decreased trend before and after COVID-19. | Is data availability statement | no |
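A hedged sketch of the interrupted-time-series setup described above, with an ARIMA error structure and a 0/1 step dummy at March 12, 2020, is given below. The weekly series, the ARIMA(1,0,0) order and the numbers are synthetic illustrations, not the study's data or its final models.

```python
# Sketch of an interrupted time series with ARIMA errors: a weekly count series,
# a before/after-March-12-2020 step dummy, and an assumed ARIMA(1,0,0) order.
# The generated series and all constants are illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
weeks = pd.date_range("2018-01-04", periods=156, freq="W")
covid = (weeks >= "2020-03-12").astype(int)          # interruption dummy
calls = 200 + 15 * covid + rng.normal(0, 10, len(weeks)).cumsum() * 0.2

fit = SARIMAX(calls, exog=covid, order=(1, 0, 0), trend="c").fit(disp=False)
print(fit.params)    # the coefficient on the dummy estimates the step change
```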
|
In general, the relationship between yield strength σ_y and grain size d of polycrystalline materials can be expressed by the Hall-Petch formula as [1,2]

σ_y = σ_0 + k·d^(−1/2)  (2)

The effect of the rare earth elements Y and Sm on the grain-refinement strengthening of magnesium alloy AZ81 can be calculated with the Hall-Petch formula. Assuming that the grain sizes of the AZ81 and AZ81+1.8(Y+Sm) alloys are d_1 and d_2, and the yield strengths are σ_y1 and σ_y2, respectively, it follows from formula (2) that

σ_y1 = σ_0 + k·d_1^(−1/2)  (3)
σ_y2 = σ_0 + k·d_2^(−1/2)  (4)

Subtracting these two formulas gives

σ_y2 − σ_y1 = k·(d_2^(−1/2) − d_1^(−1/2))  (5)

Combined with the above results of the microstructure analysis (see Figure 1), the grain sizes d_1 = 120 µm and d_2 = 40 µm were obtained by the linear intercept method. The k value of commonly used magnesium alloys is 280-320 MPa·µm^(1/2) [1,2], and here k = 300 MPa·µm^(1/2) is used. Substituting these **data** into formula (5), the contribution of grain-refinement strengthening by Y and Sm to the yield strength of magnesium alloy AZ81 is

σ_y2 − σ_y1 = 300 × (40^(−1/2) − 120^(−1/2)) ≈ 20 MPa.

Combined with the above tensile test results (see Table 1), after the addition of 1.8% (Y+Sm) the yield strength of the AZ81 alloy increases by 186 − 144 = 42 MPa, of which the contribution of grain-refinement strengthening is 20 MPa, accounting for 47.6% of the total increase in yield strength. Therefore, grain-refinement strengthening is a very important strengthening mechanism for magnesium alloys, and its contribution to the strength of magnesium alloys deserves due attention. | Is data availability statement | no |
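The grain-refinement arithmetic above can be checked directly; the text's 20 MPa and 47.6% follow from the rounded value.

```python
# Reproducing the Hall-Petch grain-refinement arithmetic from the text (Eq. 5).
k = 300.0               # MPa*um^0.5, mid-range literature value used in the text
d1, d2 = 120.0, 40.0    # grain sizes in um for AZ81 and AZ81+1.8(Y+Sm)

delta_sigma_hp = k * (d2 ** -0.5 - d1 ** -0.5)   # ~20 MPa
total_increase = 186 - 144                       # measured yield-strength gain, MPa
print(f"Hall-Petch contribution: {delta_sigma_hp:.1f} MPa "
      f"({100 * delta_sigma_hp / total_increase:.1f}% of {total_increase} MPa)")
```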
|
Step 2. Gluing identity. Consider the vector bundle U_d := π_*V_d → M_d(X), restricted to the fixed point components F_r. A point (C, f) in F_r is a pair (C_1, f_1, x_1) × (C_2, f_2, x_2) of 1-pointed stable maps glued together at the marked points, i.e. f_1(x_1) = f_2(x_2). From this, we get an exact sequence of bundles on F_r:

0 → i_r^* U_d → U′_r ⊕ U′_{d−r} → e^* V → 0.

Here i_r^* U_d is the restriction to F_r of the bundle U_d → M_d(X), and U′_r is the pullback of the bundle U_r → M_{0,1}(d, X) induced by V, and similarly for U′_{d−r}. Taking the multiplicative characteristic class b, we get the identity on F_r:

e^* b(V) · b(i_r^* U_d) = b(U′_r) · b(U′_{d−r}).

This is what we call the gluing identity. It may be translated, via Step 1, into a similar quadratic identity for Q_d in the equivariant cohomology groups H*_{S¹}(W_d). The new identity is called the Euler **data** identity. | Is data availability statement | no |
|
Examples of typical cyclic voltammetry scans and long-term stability tests of the 0.25 cm² photoelectrochemical tandems are found in Figure 5c,d, respectively. The data for all tandems can be found in Figures S22-S25 of the Supporting Information. | Is data availability statement | yes |
|
One example of USU effects, formulated by considering measurements of neutron-induced fission cross-section ratios of 238 U/ 235 U, highlighted the point that different USU estimation approaches result in slightly different but yet consistent evaluated mean values and uncertainties. Under/overestimation of reported uncertainties have been found. This example illustrated the subjective nature of estimating the effect of USU. However, this subjectivity does not apply only to estimating USU effects but also holds true for evaluating mean values and covariances in general. After all, these derived values depend on the evaluation techniques utilized, input **data** selected and subjective choices made on the parameters of models, corrections of experimental **data** and on methods for estimating covariances of both, model and experiment. Hence, our evaluated results are always subjective-with and without consideration of USU. However, if one neglects adding obviously necessary contributions from USU to uncertainties of input data, the evaluated mean values might be more biased and the evaluated uncertainties will be underestimated, in turn, adversely impacting application calculations. | Is data availability statement | no |
|
Method (Theory, Basis, Largest system (electrons), err ρ):
KZG [29]: iFCI, finite element*, C6H4 (40), 3.4 × 10⁻⁵ to 8.2 × 10⁻⁵
modified-RKS [48]: CASSCF, gaussian/cc-pCV5Z**, HCN (14), 5 × 10⁻⁴
SGB [50]: MRCI, gaussian/cc-pCVTZ**, F2 (18), 3.8 × 10⁻⁴ to 6.7 × 10⁻³
DCEP-MRA [49]: HF, MRA*, C5H5N5 (70), 2.8 × 10⁻⁶ to 4.2 × 10⁻⁶ †
* complete basis; ** incomplete basis; † the calculation for the target density (ρ **data**) and the inverse DFT problem are done in the same basis, which allows for better accuracies in the density (see Supplementary Table 1 in [29]) | Is data availability statement | no |
|
Altogether, our **data** demonstrate that the substrate promiscuity of both 4CL5 and HCBT can be exploited for biological synthesis of structurally diverse cinnamoyl, dihydrocinnamoyl, and benzoyl anthranilates of potential pharmaceutical value. | Is data availability statement | no |
|
In the description logic community, the emphasis has not been on views, but on querying incomplete information with constraints. Our positive query implication problems relate to work in the description logic community on hybrid closed and open world query answering or DBoxes, in which the schema is divided into closed-world and open-world relations. Given a Boolean CQ, we want to find out if it holds in all instances that can add facts to the open-world relations but do not change the closed-world relations. In the non-Boolean case, the generalization is to consider which tuples from the initial instance are in the query answer on all such instances. Thus closed-world and open-world relations match our notion of visible and invisible, and the hybrid closed and open world query answering problem matches our notion of positive query implication, except that we restrict to the case where the open-world/visible relations of the instance are empty. It is easy to see that this restriction is actually without loss of generality: one can reduce the general case to the case we study with a simple linear time reduction, making a closed-world copy R ′ of each open-world relation R, and adding an inclusion dependency from R ′ to R. As with the database community, the main distinction between our study of the Positive Query Implication problem and the prior work in the DL community concerns the classes of constraints considered. Lutz et al. [LSW12] study the complexity of this problem for the constraint languages EL and DL-LITE, giving a dichotomy between co-NP-hard and first-order rewritable sets of constraints. They also show that in all the tractable cases, the problem coincides with the classical open-world query answering problem. Franconi et al. [FIS11] show co-NP-completeness for a disjunction-free description logic. Our results on the **data** complexity of PQI consider the same problem, but for decidable constraint languages that are more expressive, and in particular, can handle relations of arbitrary arity, rather than arity at most 2 as in [LSW12,FIS11]. | Is data availability statement | no |
|
We begin by discussing what we refer to as the 'informed' (or 'critical') view in contrast to Anderson's popular view (Sect. 2) and explain what we mean by big data and machine learning methods (Sect. 3). We then elaborate on two paradigmatic uses of big data and machine learning methods in science: skin cancer detection and protein folding (Sect. 4). On this basis, we develop a taxonomy of expert knowledge that can be applied to big data and machine learning-driven research (Sect. 5). Finally, we argue that this taxonomy provides a fresh perspective on several debates surrounding the role of big data and machine learning in science: whether they concern inductive methods, the removal of the need for theory, or the constitution of a new scientific paradigm (Sect. 6). | Is data availability statement | no |
|
These observations can be made with standard analysis techniques, although they require processing large amounts of **data** with demonstrable reliability. Each minute of recording uses ~3 megabytes of storage. These files must be checked for baseline stability and extraneous channel activity, then processed to determine Na⁺ channel amplitude, duration, and frequency. This task was made easier using mean-variance analysis, which condensed minutes of **data** into a small set of two-dimensional mean-variance histograms. Such histograms display the quality of the data, and permit subsequent analysis of both open times and Po for components with defined amplitude. Use of this technique greatly facilitated obtaining **data** of the highest quantity and quality. | Is data availability statement | no |
|
Limitations of the study include: firstly, some patients did not have sputum smear results because the process of grading, capturing and monitoring of sputum smear results as part of routine **data** commenced only during the period 2004-2005; secondly, there is potential misclassification of the outcome variable as culture investigations were not considered; thirdly, it was not possible to establish the reasons for delay in commencing TB treatment after diagnosis; fourthly, findings relating to HIV status should be interpreted with caution as the HIV results were not recorded for most of the patients due to the aforementioned reasons; and fifthly, the study was based only on the routinely collected **data** and therefore did not consider other important variables that could influence the outcome variable, including chest radiograph results, patient symptoms at diagnosis, as well as nutritional status. Nonetheless, as far as could be established, this is the first study in South Africa which has assessed the determinants of two-month sputum smear non-conversion using routinely recorded patient information. | Is data availability statement | no |
|
Interview data were collected from 31 respondents selected from various job categories including pilot, air combat officer (ACO), engineer, security, intelligence, logistics, administration (PCO) and medical. This cross-section of the Royal Australian Air Force (RAAF) provided a sample size enabling deeper analysis. The sample included executive officers (N = 9), senior officers (N = 5) and junior officers (N = 17), both male (N = 24) and female (N = 7). Table 1 illustrates the demographic data of participants. | Is data availability statement | no |
|
In order to enhance the performance of PDLF-Net, we augment all the weakly visible EMs and their GT images. Then joint pairwise features are extracted from each image and concatenated to different blocks. Each block is trained and tested independently for each dataset class. Comparing the observation performance from figure 11, the PDLF-Net shows better segmentation results. For instance, in **data** class DC3 and DC4 (2 nd and 3 rd rows from the top) SegNet in (c) has not been able to show the foreground while there is a good segmented output of the same image by PDLF-Net in (e) and (f). In DC8 (last row), SegNet over-segments the image while good visual results are observed by PDLF-Net when concatenation of pairwise feature is at block 2, 3 and 5. Generally, the visual results show great improvement of segmentation results when using PDLF-Net. | Is data availability statement | no |
|
Figure 1. M. bovis infection reduced the viability of EBL cells. Cell viability after M. bovis infection at multiplicities of infection (MOI) of 10, 20, 30, 50, 60 and 80, and at 24 h, 48 h and 72 h, respectively, was detected using the MTT assay. It was determined by calculation that the half-maximal inhibitory concentration (IC50) at 24 h, 48 h and 72 h corresponded to an MOI of 60, 60 and 50, respectively. The **data** are expressed as mean ± SEM. Experiments were repeated at least three times. *** p < 0.001 indicated statistical significance. | Is data availability statement | no |
|
In this regard, the independent variable is a dummy variable, coded as 0 for the pre-interruption period and 1 for the post-interruption period. As a result, there are 114 weeks of pre-COVID-19 (i.e., pre-interruption) **data** and 42 weeks of post-COVID-19 (i.e., post-interruption) data. (Footnote 2: Examples include addressing code and ordinance violations, environmental and business regulations, as well as working with the department's crime analysis unit and local businesses to reduce crime.) A visual examination of the trend line **data** (see Fig. 1) also indicated the presence of a step change (an abrupt effect) between the emergence of COVID-19 and the death of George Floyd for two of the ten categories: the patrol and DRT self-initiated activities. For both categories an increase in activity was observed after the COVID-19 starting point, followed by an abrupt decline after the date of George Floyd's death. To control for these, two indicators were created to indicate the period from COVID-19 to the incident (March 12, 2020 to May 24, 2020, covering 11 weeks), and the period from the death of George Floyd (May 25, 2020) to the end of 2020 (covering 13 weeks) for these two types of self-initiated activities. | Is data availability statement | no |
|
Shortly following reopenings across the country in late May 2020, the socioeconomic component of the SVI became an independent predictor of worse COVID-19 outcomes and follows a similar trend to the overall SVI throughout the duration of our analysis. There is emerging evidence using cell phone **data** demonstrating that lowincome communities have been less able to socially distance during the COVID-19 pandemic, likely due to a multitude of factors including less capacity to work from home, or to take paid or unpaid time off from work, and limited savings. 24 25 During the COVID-19 pandemic, **data** have suggested that Hispanic communities in the USA are particularly vulnerable to financial insecurities compared with other racial/ethnic groups due to their disproportionate representation in industries that have been most affected by the pandemic and having jobs that cannot be performed from home. 26 We notice that temporal trends in incidence and death per capita for communities with greater proportion of Hispanic residents closely mirror that of the socioeconomic component of the SVI and thus low SES may partially explain why Hispanic communities have had the worst overall COVID-19 outcomes for the duration of our analysis. | Is data availability statement | no |
|
Districts 1, 2, and 9 were the only close races, and those districts flipping from one party to the other changes the Declination substantially, from δ ≈ −0.228 to δ ≈ 0.515. One might ask whether this could possibly happen in practice, to which we answer a definitive yes. Indeed, the **data** for Election 1 in Table 4 is nearly precisely the outcome of the 2012 US Congressional election in Arizona, where the percentages are the Democratic party's vote share. (We say nearly precisely because some districts had more than two parties receiving a substantial number of votes). If Districts 1, 2, and 9 had flipped, Election 2 could have easily been an outcome. Clearly the Declination is still subject to volatility, and in at least one state the kind of volatility that the Declination is susceptible to actually does occur. | Is data availability statement | no |
|
Our previous investigation showed that in P. gingivalis several genes associated with oxidative stress and sodium translocation were part of the RprY regulon, suggesting a possible role in Na⁺ metabolism and the response to redox changes [9]. The goals of the present study were to determine the activation signal(s) for rprY and clarify its role in the oxidative stress response. We found that the regulator was essential for growth of P. gingivalis under sodium-limited growth conditions. While the parent strain adapted and was able to grow, albeit slowly, under these conditions, an rprY mutant strain could not, indicating that RprY was essential for the response to this stress and hence viability. By transcription profiling and metabolite analyses we determined that sodium limitation induced an oxidative stress response in both the parent and rprY mutant strains. However, the response was highly amplified in the mutant and was accompanied by dysregulation of genes encoding protein chaperones. We found that RprY interacted directly with the promoters of several chaperone genes, and collectively our **data** indicate that the regulator acts as a repressor of their expression. We hypothesize that in the absence of RprY, P. gingivalis is unable to control oxidative stress and protein chaperone responses, compromising growth of the mutant under stress conditions. | Is data availability statement | no |
|
Nanoscale liquid chromatography coupled to electrospray ionization Fourier transform ion cyclotron resonance mass spectrometry (FTICR MS) was performed on an Agilent 1100 nanoflow system (Agilent Technologies, Santa Clara, CA, USA) hyphenated to an LCQ-FT 7.0 (Thermo Scientific). A volume of 5 mL from the reconstituted SCX fraction was injected automatically and loaded onto a C18 PicoFrit column (75 µm ID/15 µm tip ID, New Objective, Woburn, MA) packed directly inside the electrospray needle tip using specially designed nanospray emitter tips. A water/formic acid (FA)/acetonitrile (ACN) solvent system was used, where solvent A was 0.1% FA and solvent B was 100% ACN, 0.1% FA. Gradient elution was performed: 0% B for 10 min, 0% B to 50% B for 100 min, 50% B to 90% B for 5 min, 90% B for 5 min. Peptide elution was followed by ESI FTICR MS and tandem mass spectrometry (MS/MS) for peptide sequencing. A full-scan spectrum was acquired at high resolution (FWHM = 100,000) using the FT analyzer. Data-dependent acquisition was applied for MS/MS precursor selection, where the 5 most intense mass peaks were subjected to subsequent isolation and collision-induced fragmentation in the ion trap. Annotated fragment spectra were searched against a database containing sequences of known rat neuropeptides (SwePep peptides, 245 peptide entries, www.swepep.org) and neuropeptide precursor proteins (SwePep precursors, 123 protein entries) using the Mascot search engine (v 2.2, Matrix Science, London, UK). For peptide identification the search was made with the following specifications: no enzyme specificity; 10 ppm precursor tolerance and 1.2 Da fragment tolerance; no fixed modifications; variable modifications: C-term. amidation, deamidation, oxidation (M); precursor charges: 2+ and 3+; instrument: ESI trap. Peptide matches with a score above the confidence threshold (p < 0.05) were considered to be significant hits. The false positive identification rate (FPR) was estimated by searching the **data** against a decoy database, where the FPR threshold was set to <1%. | Is data availability statement | no |
|
We see in Figure 3g that the LG peak dominates the emission at low excitation density in the fully treated sample. We note that the results for low excitation density are consistent with our macro-PL measurements performed at similar fluence (Figure 1b). At higher excitation density (Figure 3h), the initial t = 4 ns PL snapshot is also dominated by the LG emission, though the relative fraction of the WG peak is higher than in the lower excitation density case. The PL signal of the LG peak plotted over the charge-carrier density for the fully treated sample also shows deviation from a bimolecular dependence (Figure 3i). Since the LG domains grow in size as the light treatment continues toward completion (cf. Figure 2h), we expect less Auger recombination in the fully treated film as carriers are in fact less concentrated compared to the partially treated sample. Nevertheless, Auger recombination does still reduce the fraction of LG (compared to WG) at higher fluence, which is indicative of significant carrier accumulation in the LG-rich surface. Again, we suggest that the slight deviation of the WG from the bimolecular dependence is due to fast energy transfer from the WG to LG regions at high excitation fluence. In addition, we again observe a decrease in the LG peak fraction after the initial pulse as time proceeds in the fully treated film (Figure 3g,h), which is due to enhanced effective charge densities within the LG regions. [Figure 3 caption fragment, panels (c, f, i): the initial peak intensity corresponding to WG and LG energies just after pulse excitation (extracted from fits to the spectra with Gaussian functions) as a function of charge excitation density; dashed lines in (c) denote a second-order recombination rate, and in (f) and (i) these bimolecular lines are shifted as a guide to the eye to the appropriate data set.] | Is data availability statement | no |
|
Our initial analytical focus was on the differences in ecosystem carbon budgets when comparing symmetric versus asymmetric climate change. For this, we used four different scenarios 20 : ambient scenario corresponding to the historical recorded temperature **data** during the period of 1961-1990 (T amb ), symmetric warming (T sym ), double asymmetric warming (T asy2 ) and triple asymmetric warming (T asy3 ). The three scenarios for temperature increases were based on a combination of recorded recent temperature increases (SI: Figure S2) and the predicted future magnitude of temperature increases simulated by a regional climate model (RegCM3) under the A2 IPCC CO 2 emission scenarios (SRES A2) 40 (SI: Figure S3). In the second step, the interactive effects of changes in temperature, precipitation, and atmospheric CO 2 concentrations were investigated. The precipitation treatment had two levels: an ambient level corresponding to the historical mean precipitation amounts recorded during the period of 1961-1990 (P amb ), and precipitation change based on the 2071-2100 predictions from the RegCM3 (P cha ) 40 . The model MT-CLIM (Version 4.3) was used to compute meteorological variables not included in the standard weather station records and required by the Biome-BGC model 41 . The CO 2 treatment also had two levels: an ambient level corresponding to the historical concentrations recorded during the period of 1961-1990 (C amb ) based on the Mauna Loa measurements (http://co2now.org/), and a scenario taking into account the gradual predicted increase in atmospheric CO 2 concentrations from 626 ppm v in 2071 to 836 ppm v in 2100 (C inc ) as predicted by the SRES A2 emission scenario **data** 42 . | Is data availability statement | no |
|
In a jack-knife analysis, given a sample of observations and a parameter to evaluate, a subsample is made by eliminating a proportion of the original **data** and the parameter is calculated for the subsample. This procedure is repeated n times and the results are summarized. Since the introduction of the jack-knife (Quenouille 1949), researchers have used it to define limits of confidence in many sorts of analyses, from statistics (Efron 1979; Smith and van Belle 1984) and ecology (Crowley 1992) to phylogeny. It has been used not only as a measure of support (Lanyon 1987), but also as a way to obtain the best solution for large **data** sets (Farris et al. 1996), to test competing hypotheses (Miller 2003), to generalize the performance of predictive models, or for cross-validation to estimate the bias of an estimator. Like the bootstrap, it can be seen as "a measure of robustness of the estimator with regard to small changes in the data" (Holmes 2003). | Is data availability statement | no |
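A minimal delete-one jackknife, the most common special case of the resampling described above, is sketched below with the usual bias and standard-error summaries. The toy sample and the choice of the mean as the estimator are arbitrary choices for illustration.

```python
# Delete-one jackknife sketch: recompute the statistic on each leave-one-out
# subsample, then summarize the replicates into a bias and standard-error estimate.
import numpy as np

def jackknife(data, estimator):
    data = np.asarray(data)
    n = len(data)
    replicates = np.array([estimator(np.delete(data, i)) for i in range(n)])
    theta_hat = estimator(data)
    bias = (n - 1) * (replicates.mean() - theta_hat)
    se = np.sqrt((n - 1) / n * np.sum((replicates - replicates.mean()) ** 2))
    return theta_hat - bias, se     # bias-corrected estimate and jackknife SE

est, se = jackknife([2.1, 3.4, 2.9, 4.0, 3.3, 2.7], np.mean)
print(f"bias-corrected mean = {est:.3f}, jackknife SE = {se:.3f}")
```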
|
Finally, we investigated the phosphoinositide kinase family, PIP5K (43). Within the PIP5K family, subfamilies can be distinguished, which include PIP5K (isoforms a, b, c) and PIP4K (isoforms a, b, c), which synthesize PI(4,5)P2 (44), and PIKfyve, responsible for PI(3,5)P2 formation (45). PI(4,5)P2 modulates the initial steps of endocytosis through different mechanisms, including the modulation of actin polymerization by sequestering of actin binders (46); recruitment of proteins that mediate membrane curvature (e.g., BIN1) (47); and interaction with specific heparan sulfate proteoglycans (HSPG), glycan-coated proteins that are reported to interact with K18 PFFs (17). On the other hand, PI(3,5)P2 regulates endosomal vesicle maturation, in particular the maturation steps between the multivesicular body (MVB), late endosomes and lysosomes, including associated recycling events (48). We decided to target the different subfamilies of PIP5K using LV-shRNAs and test how these affect the process of K18 PFF internalization using live-cell imaging. After imaging, samples were lysed and effective knockdown was confirmed through RT-qPCR. We tested the efficacy of three distinct LV-shRNAs in knocking down Pikfyve. LV-shRNA1 and 3 efficiently knocked down Pikfyve, resulting in a concomitant significant decrease in K18-pHrodo signal in neurons. LV-shRNA2 was less effective in the knockdown of Pikfyve, likewise giving rise to less K18-pHrodo signal (Fig. 3, B and C). Furthermore, none of the control LV-shRNAs used could affect Pikfyve mRNA expression or K18-pHrodo signal. These **data** suggest that PIKfyve function may be an important regulator of K18 PFF internalization. Data on the effect of knockdown of other PIP5K family members can be consulted in Figure S5. Because the PIKfyve loss of function elicited the highest reduction in the K18-pHrodo signal, we decided to continue our study focusing on PIKfyve. | Is data availability statement | no |
|
But the problem here is not publication bias per se . Any experimental activity ( including but not limited to publication ) that involves preferentially following up on one 's most statistically promising initial results presents the same challenge . For example , a standard study design in genetics is to scan the genome , one position at a time , for statistical evidence of a genetic variant with an effect on some phenotype in one or more families , in order to find the best - supported genomic position ; and then to follow up with additional data , e.g. , using a new set of families , in order to corroborate the result at that position . This procedure seems scientifically unassailable , but any attempt at evidence amalgamation across the two stages of the study violates the same statistical assumption as does publication bias . When following up on our most promising findings , we can expect the p value to regress to the mean regardless of whether the evidence is going up or down , by virtue of having selected a location for follow - up on the basis of a notably small p value . | Is data availability statement | no |
|
When we compared the non-normalized data to the data normalized to PMMoV, a difference was noticed, especially in the facilities that serve a larger population. On the other hand, while comparing data normalized to PMMoV and data normalized to the flow, the results were very consistent. | Is data availability statement | no |
|
In order to increase the internal reliability of the study, the process of **data** coding was conducted by two researchers. After the coding process was complete, the coefficient of consistency was calculated using Kappa analyses. The arithmetical average of the Kappa coefficient was found to be 77.8%, which proved that consistency was at a significant level. In order to ensure the external reliability of the study, all of the **data** collection tools, raw data, coding, interview and observation notes were submitted for supervision and confirmation by an independent expert. | Is data availability statement | no |
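The inter-coder agreement check described above can be reproduced in a few lines. The two coders' labels below are hypothetical placeholders, and scikit-learn's cohen_kappa_score stands in for whatever software the authors actually used.

```python
# Sketch of the inter-coder agreement (Cohen's kappa) check described above.
# The code labels assigned by the two researchers are hypothetical placeholders.
from sklearn.metrics import cohen_kappa_score

coder1 = ["A", "B", "A", "C", "B", "A", "A", "C", "B", "A"]
coder2 = ["A", "B", "A", "C", "A", "A", "B", "C", "B", "A"]
print(f"kappa = {cohen_kappa_score(coder1, coder2):.3f}")
```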
|
Policy and legal implications. Any AI computational system aiming to detect dark patterns should align to detectable issues that are already deemed illegal by authoritative sources. But there are only few (mandatory) legal rules in Europe constraining the use of dark patterns, and enforcement is slow in holding websites accountable. The only mandatory decision ascertaining any UI based aspect is dated of 2019 forbidding the use of pre-ticked boxes (of Justice of the European Union, 2019). From Article 7(3) of the GDPR ("it shall be as easy to withdraw as to give consent", it can be interpreted that privacy choices should be equal (e.g. parity in accept, reject and revoke choices) Nouwens et al., 2020b). Parity feature entails i) equal widgets, ii) equal number of times to either accept/reject/revoke consent, iii) across modalities (web, mobile and app setting levels) (Johanna Gunawan, 2021). This reasoning on feature parity needs still to be held definitive by court decisions as well for consistency in all EU. The ePrivacy Regulation draft 14 , being discussed in the European Council, as of today, is absent on the definition of dark patterns or UI features, even accepting the use of cookie walls (Council's version), considered as an onstructive dark pattern (Kretschmer et al., 2021;Gray et al., 2021). Such weak enforcement and the high rate of consent optimization enhanced by using faulty designs in cookie banners (Hils et al., 2020; at scale, facilitated by the use of consent management platforms, explain the recurrent use of dark patterns in cookie banners. In the future, we need to see a more serious approach to enforcement, either by courts, or by decisions issued by **data** protection authorities. That is the only way to ensure that automated systems can rely on the necessary legal certainty in identifying dark patterns by identifying concrete characteristics of design. | Is data availability statement | no |
|
We can apply the same distributed algorithms also for DP variational inference Jälkö et al. (2016); Park et al. (2016). These methods rely on possibly clipped gradients or expected sufficient statistics calculated from the data. Typically, each training iteration would use only a mini-batch instead of the full data. | Is data availability statement | no |
|
The novel biomarker ki:e SB-C was systematically evaluated to determine fit for purpose by following the V3 framework. One important aspect and strength of digital biomarkers in general, and ki:e SB-C in particular, is automation. Speech **data** can be easily collected by telephone or by mobile front-ends face-to-face, and ki:e SB-C can be reliably calculated using the proprietary automatic speech analysis pipeline. Verification step of this study revealed that the automatic speech processing pipeline, including speech recognition, performs at an acceptable level across different languages and tasks, and therefore, automatic processing works reliably for the purposes of the biomarker. The aim of the analytical validation was threefold: to evaluate (1) the ki:e SB-C against a cognitive gold standard measure (MMSE), (2) the biomarker's retest reliability, and (3) how well ki:e SB-C reflects agerelated -but clinically not relevant -changes. The results of the analytical validation analyses revealed that ki:e SB-C was a valid biomarker to measure cognitive abilities that are relevant for the target population, as seen by its high correlation with MMSE scores even if corrected for the effect of age. Furthermore, the automatically calculated biomarker was stable in retesting as assessed by a test-retest analysis. It is well established that aging is a risk factor for cognitive decline and diseases [26]; however, not all changes are pathological. It is important for a biomarker that is a surrogate for cognition to reflect subtle changes that are at a subclinical level. Our results showed that ki:e SB-C can detect age-related changes in cognitive function. These changes are also reflected in MMSE scores as a trend. Regarding the direct comparison with the MMSE on the healthy DeepSpA population, the SB-C seems to vary more marked by a larger standard deviation (regarding the mean) as compared to the MMSE. This might be also due to the fact that the SB-C is a more fine-grained measure as compared to the MMSE, which might especially in the healthy population lack the resolution to measure cognitive changes at this higher functioning level and would elicit less variance. | Is data availability statement | no |
|
The speckle pattern from the subset window can be tracked with high accuracy by using the proposed INCC procedure. In contrast to a 1D line profile provided by the LTP and the NOM, SAM provides a 2D map of the slope in a single scan by dividing each speckle image into multiple subset windows and performing the pixel-wise analysis perpendicular to the scanning direction. Moreover, unlike the LTP and the NOM, SAM is able through the proposed data analysis procedure (Mode 2) to test strongly curved mirrors by measuring the first derivative of the slope. | Is data availability statement | no |
|
Figures 6, 7 and 10 show the best-fit regions in the (M, σv) plane for DM annihilations and in the (M, τ) plane for DM decays. Each panel assumes a specific mode and a specific DM profile. In each panel the red regions are favored by the global fit of FERMI, HESS and PAMELA **data** at 3 and 5σ (2 dof).
| Is data availability statement | no |
|
A two-tailed probability value of <0.05 was considered statistically significant. The **data** were processed using MedCalc Software version 12.2.1 and SPSS version 20.0. | Is data availability statement | no |
|
Initial diffraction **data** were collected at the Diamond Light Source, Didcot, UK. Diffraction **data** for the deposited structures were collected at the European Synchrotron Radiation Facility, Grenoble, France. Data were processed with XDS (25) and scaled with AIMLESS (26). The structures were solved by molecular replacement with Phaser (27), the AadA-ATP structure with Protein Data Bank (PDB) code 5g4a (12) as search model and the antibiotic complexes with AadA-ATP as search model. All structures had two molecules in the asymmetric unit. The structures were refined by reciprocal-space refinement in Phenix (28) and model building and real-space refinement in Coot (29). Statistics for **data** processing and refinement are shown in Table 2. The structure factors and refined coordinates have been deposited in the PDB. | Is data availability statement | no |
|
Radiocarbon dating was carried out at the Oxford Radiocarbon Accelerator Unit (ORAU, RLAHA, University of Oxford, Oxford, UK). Chemical pre-treatment, target preparation, and accelerator mass spectrometry measurement were performed according to Ramsey et al (11)(12)(13). Calibration was performed using the IntCal04 **data** set (14). | Is data availability statement | no |
|
Eight children in our study could not continue their treatment after T2. This attrition seriously affected the power of the study. This is especially relevant for the TAU group, in which one-third of the children did not participate or had incomplete **data** sets at T3, compared to one-fifth of the children in the Novel group. | Is data availability statement | no |
|
Similarly , our in vitro anti - inflammatory bioassay data was corroborated by in vivo improvement in inflammatory milieu observed in FPF - treated ischemic wounds ( Figures 6 and 7 ) . The incessant and elevated presence in proinflammatory cytokines CINC-1 , CINC-2 , CINC-3 , LIX , and IL-6 , being crucial neutrophil traffickers , cause exacerbated levels of neutrophil incidence , resulting in further tissue damage [ 27,[41][42][43 ] . Unlike FPF - treated wounds , control animals exhibited a significant increase in neutrophil chemoattractants that correlated with the upsurge in neutrophils to the wound bed ( Figure 6B ) . Downregulation of genes , including Ccl12 , Cxcl1 , Cxcl3 , IL-1β , IL-6 , Ptgs2 , and TNF , that contribute to inflammatory responses were observed in FPF - treated animals [ 44][45][46][47 ] . Furthermore , adhesion molecules L - selectin and JAM - A that are critical in neutrophil migration and activation were reduced upon FPF treatment . Both L - selectin and JAM - A heavily influence neutrophil extravasation and infiltration into the ischemic tissue environment [ 48][49][50 ] . Levels of eotaxin , which induces recruitment of not only eosinophils , but also basophils , neutrophils , and macrophages , were significantly reduced in FPF - treated animals [ 51 ] . We also saw significantly lower levels of MIP-1α , RANTES , TREM-1 , and activin A , which are critical macrophage chemoattractants in wound repair [ 52][53][54][55 ] . TWEAK signaling further modulates inflammatory responses and enhances the production of proinflammatory cytokines , including RANTES [ 56 ] . While the expression of RANTES and its functional response to MIP-1α are enhanced during differentiation of monocytes to macrophages , activin A alters macrophage polarization by promoting a proinflammatory M1 phenotype and inhibiting the acquisition of anti - inflammatory M2 macrophage markers [ 55,57 ] . Although we did not find significant differences in the total number of macrophages , we detected a substantially higher number of M2 macrophages in FPF - treated animals ( Figure 6C ) . Concurrently , we also found that FPF treatment led to an increase in fractalkine / CX3CL1 , which induces VEGF - mediated angiogenesis potential in CX3CR1 - expressing M2 macrophages [ 58 ] . | Is data availability statement | no |
|
Sweet potato cv. Xushu 18 was used in this study. Plants were grown on test field under natural conditions in May of 2009 at Sichuan University. For 2-DE analysis, we harvested the unexpanded young leaves and the fully expanded mature leaves ( Supplementary Fig. S1) at midday and frozen immediately in liquid nitrogen. For cloning of the two RCA isoform genes, the fully expanded leaves of sweet potato were collected. To investigate whether mRNA's expressed pattern of Ib-RCA in leaf tissue is regulated by light, we harvested mature leaves every 2 h throughout a 24 h period. In order to investigate the temperature responses of Ib-RCA genes, sweet potato growing in field was transplanted to climate box (25°C under a 14/10 h light/dark photoperiod and 200 lEm -2 s -1 ). After continuous light (25°C, 200 lEm -2 s -1 ) treatment for 2 days, sweet potato was then treated at different temperatures (20, 25, 30, 35°C and 200 lEm -2 s -1 ) for 24 h. We sampled and frozen immediately in liquid nitrogen, then stored at -80°C until use. Sequences **data** from this article have been deposited at GenBank (http://www.ncbi. nlm.nih.gov) under accession numbers JQ923423-JQ9234231. | Is data availability statement | yes |
|
Jet and dijet cross-section measurements allow for a good test of perturbative QCD (pQCD) in p-p collisions. The ATLAS collaboration performed such doubly differential measurements using a total integrated luminosity of up to 3.2 fb^-1 at a centre-of-mass energy of √s = 13 TeV [4]. Reconstructed jets were formed using the anti-k_T algorithm with a distance parameter R = 0.4 [5], and using EM-scale calorimeter clusters as the inputs. The inclusive-jet results cover a wide kinematic range in jet transverse momentum (p_T) from 100 GeV to 3.5 TeV and for several separate ranges of jet rapidity up to |y| = 3.0. In the case of the dijet cross-section measurements, the results are presented as a function of dijet invariant mass (m_jj) in the range 300 GeV < m_jj < 9 TeV, and for values of the quantity y* = (1/2)|y_1 − y_2| - half the absolute rapidity difference between the two leading jets - up to y* = 3.0. The choice of bin sizes (for p_T, y, m_jj and y*) was motivated by the respective detector resolution. The measured results from **data** were compared to next-to-leading and in some cases next-to-next-to-leading order (NLO and NNLO, respectively) pQCD predictions, and in all cases were found to be in good agreement with the SM predictions. The inclusive-jet and dijet results are summarized in Figure 1, where the usual convention is adopted of applying multiplicative offset factors to the results in various |y| (y*) bins in order to allow for a better visual comparison of the shapes. Doubly and triply differential jet cross-section measurements were also performed by the CMS collaboration [6,7,8]. Figure 1: Measured (a) jet and (b) dijet differential cross-sections at √s = 13 TeV by the ATLAS collaboration [4]. | Is data availability statement | no |
|
We found that the reporting is mostly not case-specific, and data (on economic, environmental and social dimensions) are aggregated - that is, numbers that are reported, such as GDP growth or land restoration figures, tend to be Table 2: References to a selection of CSR guidelines and standards that Salini Impregilo - WeBuild pledges to comply with. UN Business & Human Rights Guiding Principles: "The responsibility to respect human rights requires that business enterprises: (a) Avoid causing or contributing to adverse human rights impacts through their own activities, and address such impacts when they occur; (b) Seek to prevent or mitigate adverse human rights impacts that are directly linked to their operations, products or services by their business relationships, even if they have not contributed to those impacts." (UNOHCHR, 2011, p 14). UN Sustainable Development Goals (https://sdgs.un.org/goals): "End poverty in all of its form everywhere" (#1); "End hunger, achieve food security and improved nutrition and promote sustainable agriculture" (#2); "Ensure healthy lives and promote well-being for all" (#3); "Ensure availability and sustainable management of water and sanitation for all" (#6); "Ensure access to affordable, reliable, sustainable energy for all" (#7); "Protect labour rights and promote safe and secure working environments" (#8.8); "Protect, restore and promote sustainable use of terrestrial ecosystems, sustainably manage forests, combat desertification, and halt and reverse land degradation and halt biodiversity loss" (#15); "Promote peaceful and inclusive societies [...], provide access to justice for all and build effective, accountable and inclusive institutions at all levels" (#15). UNI EN ISO 9001 (quality management system standards): "When planning for the quality management system, the organization shall [...] determine the risks and opportunities that need to be addressed to [...] prevent, or reduce, undesired effects" (ISO 9001:2015(E), 6.1.1, p. 4). UNI EN ISO 14001 (environmental management system standards) | Is data availability statement | no |
|
Based on the **data** in this study, we conclude that the Indonesian version of the NoMoPhobia Questionnaire (NMPQ) meets the psychometric requirements of measurement. The NMPQ shows stable reliability and validity and can be used to measure the degree of connection between the mobile phone and human interactions. | Is data availability statement | no |
|
All **data** were presented as mean±SE. Different groups were compared by Student-Newman-Keuls test with SigmaStat 2.03 software (SPSS). P<0.05 was considered statistically significant. | Is data availability statement | no |
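For readers who want to reproduce this kind of group comparison outside SigmaStat, the sketch below shows one way to run a one-way ANOVA followed by a pairwise post-hoc test in Python. It is only an illustrative stand-in: the group labels and values are invented, and Tukey's HSD is used in place of the Student-Newman-Keuls procedure named in the excerpt, since SNK is not part of the standard SciPy/statsmodels stack.

```python
# Hypothetical sketch: one-way ANOVA plus a pairwise post-hoc comparison.
# Tukey's HSD stands in for the Student-Newman-Keuls test mentioned above.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Invented example data: three treatment groups.
groups = {
    "control":   np.array([4.1, 3.9, 4.3, 4.0, 4.2]),
    "treated_a": np.array([5.0, 5.2, 4.8, 5.1, 4.9]),
    "treated_b": np.array([4.4, 4.6, 4.5, 4.3, 4.7]),
}

# Report mean +/- SE for each group, as in the excerpt.
for name, values in groups.items():
    se = values.std(ddof=1) / np.sqrt(len(values))
    print(f"{name}: {values.mean():.2f} +/- {se:.2f}")

# Global test across groups, then pairwise comparisons at alpha = 0.05.
f_stat, p_value = stats.f_oneway(*groups.values())
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

labels = np.concatenate([[name] * len(v) for name, v in groups.items()])
values = np.concatenate(list(groups.values()))
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```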
|
In edge networks, local **data** processing and storage help to make them independent of complex network infrastructure [1,2]. Therefore, edge network devices should be protected against major attacks [3,4] such as DDoS, ransomware, man-in-the-middle (MITM) attacks, etc. As the paradigm of distributed computing, edge network devices centralize **data** centers and act as smart things to overcome cloud computing limitations. High-speed **data** networks such as 5G wireless communication have boosted the application of edge devices and increased their vulnerability at the same time. Hence, this work uses an authentication protocol to secure multiple edge-device interconnectivity for Internet of Things (IoT) communication. | Is data availability statement | no |
|
When measuring high-quality diffraction **data** for HPMX experiments, care should be taken in selecting the optimum wavelength. The chosen energy will have a direct influence on the resolution, the signal-to-noise ratio and the X-ray absorption by cell components, and will be dependent on the detector characteristics, often optimized for a specific set of energies. Ultra-short wavelengths of the order of 0.25-0.05 Å have been reported as being optimal for HPMX studies (Fourme et al., 2001). At AR-NW12A, the shortest wavelength available is 0.70 Å, limited by the cut-off (~3.5 mrad) of the focusing mirror reflectivity downstream of the monochromator, rather than by the X-ray undulator source (see Fig. 1 of Chavas et al., 2013). A typical diffraction pattern recorded during HPMX experiments is shown in Fig. 3. Although sub-optimal, **data** collection at 0.70 Å should increase the crystal lifetime approximately by a factor of two, when compared with **data** collection at 1.00 Å, in accordance with calculations of the dose-dependent decay of crystals exposed to wavelengths of 0.70 Å and 1.00 Å, respectively, with a conserved incoming flux of 1.0 × 10^11 photons s^-1 mm^-2 in an area of 200 µm × 200 µm (calculations performed using Raddose version 2.0; Paithankar et al., 2009). Moreover, the ADSC Quantum 210r CCD X-ray detector implemented at AR-NW12A is calibrated to have its detective quantum efficiency (DQE) maximized at the wavelength of 1 Å. Although no major troubles could be observed in the **data** recorded so far with an energy beam of 17.7 keV (0.70 Å), HPMX at AR-NW12A could improve in **data** quality if a detector calibrated for higher energies were available. The recommended beam size at AR-NW12A varies from 100 µm × 100 µm to 200 µm × 200 µm, in positive cases where access to sufficiently large crystals is possible. A clear advantage of this large beam size lies in the fact that a larger volume of the crystal is exposed to the X-ray beam, resulting in increased signal at higher energies. A comparison of **data** collected at PF AR-NW12A with **data** recorded at SPring-8 BL41XU (Kawamoto et al., 2001), where a sharper beam was available, clearly suggested that a larger beam size coupled with a lower flux was beneficial for collecting fully usable **data** at room temperature (unpublished results). | Is data availability statement | no |
|
SKA will provide a deep and wide large-scale structure dataset that will enable separating the effects of the early and late universe on the observed CMB anisotropy. For example, the SKA **data** could be used to reconstruct the late-time contribution to the CMB anisotropy via the integrated Sachs-Wolfe effect, and thus provide information about the temporal evolution of the CMB anomalies. | Is data availability statement | no |
|
Fig 1. Galangin **data** crystal showing the color variation. https://doi.org/10.1371/journal.pone.0267624.g001 | Is data availability statement | no |
|
Likert scale results (Figure 1) show a few different trends when comparing pre vs. post-test results. Trends show an increase in feelings that ethical consideration is important and should be integrated into engineering practice. Results show increased perceived understanding of what ethical consideration is, and increased feelings of responsibility for ethical consideration. In contrast, students reported stronger feelings that they were taking the course to satisfy a degree requirement. Only the pre/post difference for question 5 in Figure 1, "I don't really know what 'ethical consideration' is," was statistically significant (t(28) = 2.58, p = 0.015). All other differences between pre and post **data** in Figure 1 were not statistically significant. | Is data availability statement | no |
|
Since the order α_s 4-flavor results are comparable to the order α_s^2 ones at energy scales close to the threshold in both cases (and for other values of x), it is reasonable to choose the transition scale at a relatively low value, as mentioned earlier in the paper. The band representing the 3-flavor calculation does become wider at large Q for x = 10^-2 (where the absolute values also are lower than the 4-flavor calculation and data); but for x = 10^-4, it remains quite narrow. Thus, the theoretically infra-red unsafe logarithms, ln_{1,2}(μ/m_c), do not seem to cause serious problems, at least for very low x. | Is data availability statement | no |
|
In this section we employ the Bayesian framework and methods introduced in the previous sections for the development of a thermodynamic property model for the alpha, beta and liquid phases of Hafnium. In Section 3.1 we give an overview of Hafnium. In Section 3.2 we discuss how the **data** was collected and any corrections which were applied. In Section 3.3 we present our analysis methods and in Section 3.4 the model selection criteria are presented. Our main results are given in Section 3.5. | Is data availability statement | no |
|
Other variables were tracked via survey instruments, to determine their influence on mental states and cognitive performance. Mindfulness was assessed using Tanay and Berstein's [39] State Mindfulness Scale (SMS). This 21-item scale measures the level of experienced mindfulness over a bounded time period, with subscales for bodily (i.e., "I noticed various sensations caused by my surroundings") and mental mindfulness (i.e., "I felt closely connected to the present moment"). The SMS was included to determine the level of experienced mindfulness in the environment and conditions provided. The Connectedness to Nature Scale [40] was utilized to assess the personal relationship of each student to the natural environment with 14-items (i.e., "I think of the natural world as a community to which I belong"). This measure was included to account for emotional and communal ties to natural spaces that may influence neurological reactions and was completed prior to other **data** collection. | Is data availability statement | no |
|
A dry 50 mL Schlenk flask with magnetic stirring bar was charged with 854 mg (3.36 mmol, 1.0 eq) 4-(benzyloxy)isobenzofuran-1,3-dione (11). The spectra are in accordance with previously reported data. [8] (S)-9-(Benzyloxy)-1,2,3,11a- The solvent was removed in vacuum and the resulting solid was dried in oil pump vacuum. [8] Yield: The spectra are in accordance with previously reported data. [8] Benzyl (S)-9-(benzyloxy)-5,11-dioxo-2,3,11,11a-tetrahydro-1H-benzo saturated NH4Cl solution were added and the mixture was stirred for 5 min. The mixture was diluted with CH2Cl2 (50 mL) and the solid was removed via filtration through a sintered glass frit and was rinsed with CH2Cl2. The combined organic phases were dried over Na2SO4, filtered and the solvent was removed in vacuum. The product was purified via column chromatography (60 g silica gel, size: 20 x 3 cm, cyclohexane/CH2Cl2/acetone = 10:10:1 | Is data availability statement | no |
|
Using **data** from 372 microsatellite loci typed in 493 unrelated persons from four major ethnic groups in Nigeria and Ghana, we sought evidence of population structure using several methods. Our results did not show any significant population substructure, and no ethnic group corresponded to inferred clusters. This finding has been reported by others [5]. Although Rosenberg et al observed significant population structure among six African groups (Bantu-Kenya, Mandenka, Yoruba, San, Mbuti Pygmy and Biaka Pygmy), they reported that inferred clusters for some of the African populations did not correspond to predefined groups, unlike groups from America, Oceania and Eurasia [5]. | Is data availability statement | no |
|
are in better agreement with the data. Note that while the Tevatron bounds are somewhat sensitive to the assumption that all the SM fermions are localized close to the Planck brane due to possible variations in the width of the W_2 and Z_2, this is not true for those from LEP. | Is data availability statement | no |
|
Based on DEM data, morphometric maps such as shaded relief, aspect and slope degree, and minimum and maximum curvature or plan convexity maps were created using ENVI and ArcGIS software. Morphometric maps such as slope, hill shade, height level and curvature maps were generated from the SRTM and ASTER GDEM Digital Elevation Model (DEM, 30-m spatial resolution) **data** using ArcGIS/ESRI and ENVI/EXELIS digital image processing software. The Shuttle Radar Topography Mission (SRTM) obtained digital elevation **data** to generate a high-resolution digital topographic database. SRTM was an international project spearheaded by the National Geospatial-Intelligence Agency (NGA), NASA, the Italian Space Agency (ASI) and the German Aerospace Center. Similarly, the Ministry of Economy, Trade and Industry (METI) of Japan and the United States National Aeronautics and Space Administration (NASA) jointly released ASTER GDEM. | Is data availability statement | no |
|
Gaussian linear dynamical systems (LDS) provide very efficient learning and inference algorithms, but they can typically only be applied when the observations are themselves linear with Gaussian noise. While it is possible to apply a Gaussian LDS to count vectors (Belanger and Kakade, 2015), the resulting model is misspecified in the sense that, as a continuous density, the model assigns zero probability to training and test data. However, Belanger and Kakade (2015) show that this model can still be used for several machine learning tasks with compelling performance, and that the efficient algorithms afforded by the misspecified Gaussian assumptions confer a significant computational advantage. Indeed, the authors have observed that such a Gaussian model is "worth exploring, since multinomial models with softmax link functions prevent closed-form M step updates and require expensive" computations (Belanger and Kakade, 2014); this paper aims to help bridge precisely this gap and enable efficient Gaussian LDS computational methods to be applied while maintaining multinomial emissions and an asymptotically unbiased representation of the posterior. While there are other approximation schemes that effectively extend some of the benefits of LDSs to nonlinear, non-Gaussian settings, such as the extended Kalman filter (EKF) and unscented Kalman filter (UKF) (Wan and Van Der Merwe, 2000; Thrun et al., 2005), these methods do not allow for asymptotically unbiased Bayesian inference and can have complex behavior. Alternatively, particle MCMC (pMCMC) (Andrieu et al., 2010) with ancestor resampling (Lindsten et al., 2012) is a very powerful algorithm that provides unbiased Bayesian inference for very general state space models, but it does not enjoy the efficient block updates or conjugacy of LDSs or HMMs. | Is data availability statement | no |
|
We use the dates of the declaration of COVID-19 as a global pandemic (March 11, 2020) and of the announcements of the effectiveness of the Pfizer-BioNTech (November 9, 2020), Moderna (November 16, 2020) and AstraZeneca (November 23, 2020) 1 vaccines in the US as event dates to compute abnormal returns (ARs). The **data** used in the event study was collected from different sources. Airline stock returns were obtained from Datastream and computed using the total return index. | Is data availability statement | no |
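As a worked illustration of the event-study calculation described above, the sketch below computes daily abnormal returns around a single event date. The excerpt does not state which expected-return model was used, so a market-model benchmark is assumed here, and the column names, ticker, and estimation/event window lengths are invented for the example.

```python
# Minimal sketch of an event-study abnormal-return (AR) calculation.
# Assumptions not stated in the excerpt: a market-model benchmark estimated
# over a pre-event window; series names and window lengths are invented.
import numpy as np
import pandas as pd

def abnormal_returns(returns: pd.Series, market: pd.Series,
                     event_date: str, est_window: int = 250,
                     event_window: int = 5) -> pd.Series:
    """AR_t = R_t - (alpha + beta * R_m,t) around one event date."""
    event_loc = returns.index.get_loc(pd.Timestamp(event_date))
    est = slice(event_loc - est_window, event_loc)          # estimation period
    beta, alpha = np.polyfit(market.iloc[est], returns.iloc[est], deg=1)
    evt = slice(event_loc - event_window, event_loc + event_window + 1)
    expected = alpha + beta * market.iloc[evt]
    return returns.iloc[evt] - expected                      # daily ARs

# Example usage with hypothetical data, e.g. around the pandemic declaration:
# ars = abnormal_returns(airline_ret["AAL"], index_ret, "2020-03-11")
# car = ars.sum()   # cumulative abnormal return over the event window
```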
|
Other genes identified by the network analysis have been previously linked to OA, but have not been studied for their potential as biomarkers/therapeutic targets. For example, Apolipoprotein D (APOD) was found to be downregulated in every zone of the OA cartilage. It was also 1 of the 207 mRNAs identified as significantly dysregulated from the meta-analysis and was the top-ranked gene of the MI network. Research has previously implicated APOD as being an important gene in OA pathogenesis. For example, APOD is strongly upregulated by retinoic acid [27], which is in turn regulated by ALDH1A2 - an OA risk locus [28]. In vitro studies have shown APOD to be upregulated upon SOX9 overexpression, a master transcription factor essential for cartilage ECM formation [29]. Furthermore, a recent study into the identification of knee OA genes shared by both cartilage and synovial tissue proposed that APOD may manage OA through chondrogenesis in articular cartilage and immune regulation in the synovium [30]. The high ranking of APOD in the MI network makes it an interesting candidate for future studies into OA. In particular, research should investigate its potential as a biomarker/drug target. Fibrillin-1 (FBN1) was another gene that was discovered from analysis of the MI network whose encoded protein was also found to be upregulated in the middle and deep zones of OA cartilage according to the MS data. This is particularly interesting as FBN1 is the causative gene of the inherited connective tissue disorder Marfan syndrome [31]. Moreover, it was 1 of 300 proteins identified via lectin-affinity chromatography in a previous study investigating the proteome of human OA synovial fluid [32]. The fact that this gene encodes microfibrils that play a structural role in all connective tissues, and mutations in which are known to cause a disease of the musculoskeletal system, warrants further investigation of its role in OA. | Is data availability statement | no |
|
YY contributed to the conception, **data** curation, formal analysis, and manuscript writing and editing. ZH contributed to the conception and revision of the manuscript. JL contributed to the review and revision of the manuscript. YW contributed to the editing of the manuscript. MX contributed to the editing and revision of the manuscript. All authors contributed to the article and approved the submitted version. | Is data availability statement | no |
|
Histological sections were performed to further investigate the anatomical development of stamens and carpels. Lepidium naufragorum [see Additional file 2] and L. tenuicaule revealed comparable reproductive organ development with no evidence of loss of organ function. Therefore, only **data** from L. tenuicaule is compared here against L. sisymbrioides ( Figure 6). | Is data availability statement | no |
|
In Fig. 7, we report the average BDMC gap on 2500 simulated **data** examples, and observe that q-paths with q = 0.994 or q = 0.996 consistently outperform the geometric path as we vary the number of intermediate distributions K. | Is data availability statement | no |
|
All **data** were taken from the Protein Data Bank [29,30]. The analysis was limited to protein X-ray crystal structures. Multi-model refinements and structures containing only Cα atoms were discarded. Only structures refined at a resolution better than 2.5 Å were retained, and this resulted in a list of about 121,000 entries of the Protein Data Bank. These were randomly divided into 14 subsets, each containing 7000 entries. Each entry was contained in one subset only (no overlap). All the analyses were then performed on each of the 14 subsets. | Is data availability statement | no |
|
The message from this study is clear. p-XRF's role in characterising coarse wares such as Impasto and Cooking wares begins after their fabrics have been defined, either macroscopically or petrographically; p-XRF's considerable attributes then come into play, providing rapid in-field analyses of the cut (sawn) surfaces of large numbers of sherds. This can be viewed as a 'screening' procedure generating chemical **data** which is scrutinised in the light of the fabric classification to make useful, if broadly based, statements about identity, for instance precisely those elucidated in this study - volcanic vs. non-volcanic or local vs. non-local. The chemical classification should not be expected to be amenable to a more detailed level of interpretation, paralleling the outcome of the other p-XRF investigation at San Vincenzo on soils from different excavated contexts (Di Renzoni, et al., 2016). For sure, there is room for the analytical protocol to be refined beyond that described in the present study in order to improve accuracy, to allow additional elements to be included in the **data** set and to calibrate with respect to the corresponding **data** obtained by destructive analysis such as benchtop WDXRF (Jones & Campbell, in press). We suggest that p-XRF should not be habitually compared to ICP/NAA and repeatedly be found wanting in its inability to cover all elements and/or display equivalent levels of accuracy for all elements. At the same time p-XRF should not develop a 'scatter gun' reputation, generating one-off **data** which cannot be reused or can only be used for internal - in-house - purposes. It is time to look boldly at its greatest attribute, that is, that of non-destructive, in situ analysis aimed at analysing large quantities of sherds; equally important is our ability to formulate appropriate questions that it can answer, satisfactorily and conclusively. In the case of impasto ware there is already a good understanding of the origin and production of these wares; so the need now is to process large quantities of newly excavated material, and to assess whether they are imported or not. This study has shown that this type of assessment (local vs non-local) is indeed feasible and effective. Coarse wares are amongst the most common fabrics found in many archaeological sites, so the implication of the results of this study goes beyond the remit of our work on Stromboli. | Is data availability statement | no |
|
In this study, we find that prostate cancer patients with menin overexpression show poor overall survival. Although menin has been extensively characterized as a tumor suppressor in multiple endocrine neoplasia type 1 [17], our data, and previous literature on estrogen receptor, strongly argue that menin can facilitate oncogenic gene activation through hormone receptor signaling in a contextual manner. | Is data availability statement | no |
|
The following measures of functional skills at our 5-year follow-up were chosen from an extensive battery because they reflected the outcomes of interest (academics, social and peer relations, global impairment) as well as a variety of **data** collection methods, including objective testing as well as parent and teacher reports. | Is data availability statement | no |
|
Two models were estimated, each applying nominal and real exchange rates. Both models adopted the panel-corrected standard errors (PCSE) linear regression method proposed by Beck and Katz [41]. The method takes into account the first-order panel autocorrelation, panel-level heteroskedastic errors, and errors contemporaneously correlated across panels. In this regard, the null hypothesis that there is no first-order autocorrelation was rejected in the Wooldridge test for panel **data** [42]. A Fisher-type panel unit root test that can be applied to unbalanced panels was performed, and the results rejected the null hypothesis that all panels contain a unit root for all variables (Table 3). Since the estimation results of the two models are not significantly different, we focused on the results of the nominal exchange rate model. Time effects are not presented, but almost all are significant, reflecting the shocks common to all countries at some point in time such as Chile's production cuts in around 2010. The three null hypotheses that all countries have the same effect, all the exchange rate effects are zero, and all the tariff rate effects are zero were all rejected at the 10% significance level (Table 4). Therefore, it can be said that country-specific markups were apparent in some importing countries. Table 4. Results of joint significance test. | Is data availability statement | no |
|
Olusanya, Peter and Oyebo (2012) studied taxation as a fiscal policy tool for income reallocation among Lagos state civil servants. They used Spearman's rank correlation coefficient to analyse the **data** they collected. The study revealed that there is a positive relationship between tax as a fiscal policy tool and income reallocation. Engen and Skinner (1996), who studied the relationship between taxation and economic growth in the United States, establish that there is a modest effect in the direction of 0.2 to 0.3 percentage point changes in growth rates in answer to major tax reforms. Their findings propose that such slight influence can cumulatively have a large impact on the standards of living of citizens. | Is data availability statement | no |
|
To investigate the influence of ligand clustering on receptor mediated uptake, a series of eight micelle formulations were used: 0%F-100%mix, the untargeted control containing unfunctionalized LDP (0%F) used as 100% of the micelle, and 10%F-100%mix, 20%F-60%mix, 30%F-40%mix, 40%F-30%mix, 60%F-20%, 70%F-20%mix and 100%F-10%mix are formulations presenting a similar amount of folate in total. UV-Vis data show statistically similar numbers of folate per micelle by one-way ANOVA analysis between the different groups at the 95% confidence interval (Figure 2c). We used FR overexpressing KB cells to evaluate targeting and binding of the micelles to receptors on the cell surface. After a 24 h period of incubation, the highest cell associated fluorescence was observed for cells incubated with the 20%F-60%mix formulation (Figure 3a). The measured EC50 (concentration producing 50% binding) was observed to be the lowest for the 20%F-60%mix micelle. To facilitate discussion, we approximated the apparent dissociation constants of the tested micelles (K_D) by fitting the experimental data to a 1:1 binding model for site-specific binding. The apparent micelle K_D is given in Table 1. EC50 and K_D values are similar, indicating that there is a direct relationship between binding and the measured fluorescence. The measured dissociation rate constant (k_off) of the different micelles also shows that the optimal 20%F-60%mix formulation had the longest dissociation time (2×10^-5 s^-1) (Figure 2d, Table 1). To examine the apparent rate of association (k_on) relationship between the micelles, we used the following equation: k_on = k_off/K_D, and found that the calculated values were not significantly different (less than 1 order of magnitude apart, Table 1). Confocal analysis and competitive binding experiments confirmed that our observations for binding and targeting are FR mediated and that the mechanism for internalization of targeted LDP is dependent on both energy driven endocytosis and the presence of folate receptors (Supplemental Figure 5). | Is data availability statement | no |
|
Thyroid cancer, the most common endocrine malignancy usually presents as a solitary nodule. FNAB is the best way for the diagnosis of cancer in patients presenting with thyroid nodules [5,6]. FNA cytology reports are classified as: Benign, Follicular lesion of undetermined significance and Follicular neoplasm, Suspicious for malignancy, Malignant and Nondiagnostic [26]. Unlike for papillary thyroid cancer, FNA biopsy cannot differentiate follicular thyroid cancer from follicular adenomas. These specimens are usually reported as indeterminate lesions, categorized as follicular neoplasm or follicular lesions of undetermined significance [7,8]. Pathologic evaluation of the thyroid after surgery confirms the actual diagnosis of follicular thyroid cancer. Treatment in benign and malignant group is straightforward, but in patients with cytology suggesting follicular neoplasm is controversial. Physicians usually perform thyroid scintigraphy and most patients with non autonomous adenomas should undergo surgery because 15 to 25 percent of them prove to be cancers [26]. Many procedures are under investigation to improve the diagnostic value of cytology alone for the assessment of follicular neoplasm. These include RT-PCR measurement of thyroglobulin mRNA, PET scans, cellular and molecular markers [9][10][11][12][13][14][15][16][17][18][19]. More recently, many studies have suggested that Serum TSH is an independent risk factor for predicting malignancy in a thyroid nodule [20][21][22][23][24][25]. Higher serum TSH levels also have been associated with advanced stages of thyroid cancer [23]. Many **data** support the role of TSH in thyroid cancer. Many studies have shown higher incidence of thyroid cancer in patients with Hashimoto's thyroiditis and Graves' disease, compared with control population [27,28]. Elevated TSH due to continuous progression to hypothyroidism in Hashimoto's disease and TSH receptor stimulation by TSH receptor antibody in Graves' disease may explain the higher rate of malignancy in these patients. The trophic effect of TSH on thyroid cancer growth also is well established [29,30] and TSH suppression by administering exogenous thyroid hormone is an independent predictor of recurrence of differentiated thyroid cancer [31]. An alternative explanation is that patients with lower TSH concentrations were developing autonomous function, which by itself is associated with lower rates of malignancy [32][33][34]. | Is data availability statement | no |
|
The interior metric is CAdS in coordinates sec²ρ = 1 + r², and we have set the AdS scale R to 1. The induced metrics match on the surface r = r_b. This metric is not expected to be a solution to string theory, and initial data corresponding to such a geometry would certainly evolve with time. Nevertheless, any string solution containing a metastable AdS bubble will share certain features of scalar field propagation that we now study. | Is data availability statement | no |
|
As regards the total gravitino relic abundance Ω_G̃h², we apply the 3σ range derived from WMAP 5-year data [66], 0.091 < Ω_G̃h² < 0.128, (3.1) which in the figures below will be marked as green bands and labeled "Ω_G̃h²". As previously in [16], we also include the bound on the possible distortion in the nearly perfect black-body shape of the CMB spectrum [67] by the injection of energetic photons into the plasma. However we note that this constraint (delineated with a magenta line with a label "CMB" over it) seems generally less important than that due to the BBN [68,69]. | Is data availability statement | no |
|
After performing the Gaussian fits on short baselines, we derived amplitude and phase self-calibration solutions and applied them to all baselines. For the amplitude self-calibration, we used a total flux density of 1 Jy, which is comparable to the average of historical values at 1.3 cm (e.g., Bower et al. 2015b). This choice of normalization does not affect our remaining analysis, which only relies on the fractional visibility amplitudes relative to the total flux density. In addition, we averaged the data across all IFs, to maximize our sensitivity. Figure 2 shows the resulting visibility measurements, including upper limits on baselines to Spektr-R (computed by PIMA) and highlighting the amplitudes for our most sensitive ground baselines. These baselines are expected to have signals that are dominated by refractive noise; they have S/N up to 8.4 after the final incoherent averaging in time, indicating a reliable detection of image substructure. For each figure, we show the expected envelope of the ensemble-average image and the predicted "renormalized refractive noise" σ_ref(u) = ⟨|ΔV̂(u)|²⟩^(1/2), where V̂ is the complex visibility function of the source after centering the image and normalizing the total flux density (for details, see Appendix A of J18). | Is data availability statement | no |
|
where θ_0 is the initial angle of the platelet (Figure S6, Supporting Information). By comparing the predictions of this geometrical model with the experimental data, we find that the theoretical analysis slightly overestimates the absolute θ values but captures reasonably well the dependence of the platelet angle on the applied global strain. A more detailed analysis of the platelet angle as a function of the imposed strain can be obtained by performing finite element simulations of the platelet-laden composite (Figure 3B and Movie S4, Supporting Information). To enable a direct comparison with the experimental observations, a composite containing platelets aligned out-of-plane was first strained along the direction that was mechanically loaded in the experiment (x axis). Top and side views of the platelets at an applied strain of 100% show a good qualitative agreement between the simulations and the experimental microscopy images (Figure 3A,B). With the help of these simulations, we also studied the response of the platelets when the global strain is applied parallel to the alignment plane (y axis). Our simulations indicate that this loading configuration does not lead to any significant rotation of the platelets, which explains the anisotropic nature of the strain-induced color change (Figures 2E and 4A). A quantitative comparison between simulations and experiments for composites loaded along the color-changing direction reveals that the finite element analysis correctly captures the experimentally observed trend between the platelet angle and the applied strains. | Is data availability statement | no |
|
The aim of this article was to assess and identify the social vulnerability of communal farmers to drought in the O.R. Tambo district in the Eastern Cape province of South Africa using survey **data** and a social vulnerability index (SoVI). Eleven social vulnerability indicators were identified using the Bogardi, Birkman and Cardona conceptual framework. The results showed that the SoVI estimated for the O.R. Tambo district was very high, with a Likert scale score of 5 for cultural values and practices, security or safety, social networks, social dependence, preparedness strategies and psychological stress accounting for the high value of social vulnerability to drought. Indigenous knowledge and education had an SoVI value of 2, indicating low vulnerability and contributing positively to resilience to drought. The study also found that government involvement in drought risk reduction is limited; as a result, the study recommends that a national, provincial and district municipalities policy on drought risk reduction and mitigation should be developed. White and Howe (2002) argued that there is a realisation that effective natural hazard prevention and mitigation will need to address not only the hydrological-meteorological factors but also the economic and social factors which influence the greater society and reinforce the impact of hazardous events. Wilhite (2005) stated that social vulnerability to drought is increasing at an alarming rate in many parts of the world, and South Africa and the O.R. Tambo district are not an exception. | Is data availability statement | no |
|
In fact, our data suggest that emotion representations have a prolonged developmental trajectory. Because most studies on the development of emotion perception, emotion experience, and emotion understanding are constrained to childhood, little is known about emotion conceptualization in adolescence. These results reveal that there are continued changes in emotion conceptualization throughout late adolescence and into early adulthood, a finding that prompts new questions about the role of emotion concept development in the social and affective changes that occur during adolescence. 11 Additionally, our methods ensured all participants recognized emotion terms used in our tasks. This is an important methodological advance beyond prior studies, which mostly assumed that child participants understood the terms used in tasks without systematically testing their comprehension. We believed that participants' responses on these tasks are only interpretable if we had evidence that they had a working understanding of each term, a decision that potentially augments the validity of these results. However, this methodological choice implies that merely having separate definitions for emotion words is not enough to produce a fully multidimensional emotion representation: The sophistication of emotion concepts continues to deepen even after people have learned to associate different emotion definitions with different emotion words. | Is data availability statement | no |
|
Application of the Luttinger theorem introduced in section 3.1 implies that, regardless of the route to a FL, if their magnetic moments are quenched then the f-electrons must be counted in the Fermi surface volume [73]. FS studies [74] using magneto-oscillatory techniques such as the de Haas van Alphen (dHvA) effect have verified that this is so. The most straightforward theory [75] for the FS utilizes the LDA band structure, including the f-electrons. A more sophisticated approach includes the f-electrons but renormalizes their scattering phase shifts at the FS to be Kondo-like. This procedure gives heavy masses and often makes only a minor perturbation of the original LDA FS [76]. A crucially important theoretical conjecture by Fulde and Zwicknagl [77,78] is that above TK, where the f-electron moment is no longer quenched, the f-electrons should now be excluded from the FS volume. This idea is plausible but there is no proof having the rigor of the Luttinger theorem, and in any case it would not hold experimentally if in fact Kondo physics were irrelevant in the heavy Fermion materials, as some workers propose. This section presents ARPES **data** verifying the Fulde/Zwicknagl conjecture. | Is data availability statement | no |
|
We found that (Figure 3), compared to the yes rates under no context, the mean yes rates averaged over six observers are higher under low contextual contrast C_c ≤ 0.05 and lower under higher contextual contrast C_c = 0.4, for any target contrast C_t. We define a contextual facilitation index (CFI) as the average increase in the yes rate in a particular context (relative to no context), specifically CFI ≡ Mean_{C_t}[P(yes|C_t, a given context) − P(yes|C_t, without context)], where Mean_{C_t}(x) ≡ [Σ_{C_t} x]/[Σ_{C_t} 1] stands for the average of x over C_t. Figure 2A: The **data** points are the mean over six observers, and the error bars indicate the standard errors of the means (SEMs). On average and relative to the no-context condition, the weaker colinear contexts C_c = 0.01 and C_c = 0.05 raised the yes rates by CFI = 38% ± 8% and 15% ± 8%, respectively, whereas the stronger context C_c = 0.4 lowered it by −CFI = 17% ± 8%. The colored curves are Bayesian fits to **data** of the corresponding color; no fit is done for **data** without context. The root mean square normalized fitting error RMSNFE = 0.66 in the unit of SEM. The fitted parameters (and their 95% confidence intervals) are k = 1.9 (0.6, 3.2), σ_n = 0.0025 (0.0020, 0.0029), and P(yes) = 0. The weakest context C_c = 0.01 raises the yes rate by CFI = 0.38 ± 0.08, and the intermediate context C_c = 0.05 by CFI = 0.15 ± 0.08. In contrast, the strongest context C_c = 0.4 lowers the yes rate by |CFI| = 0.17 ± 0.08. Averaged over C_t, the observers were more than twice as likely to perceive a target in the weakest than in the strongest context. | Is data availability statement | no |
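To make the CFI definition above concrete, the toy sketch below computes it from yes-rate arrays. The numbers are invented and simply mimic the pattern reported in the excerpt (facilitation for weak contexts, suppression for the strong one).

```python
# Toy illustration of the contextual facilitation index (CFI):
# CFI = mean over target contrasts of [P(yes | context) - P(yes | no context)].
import numpy as np

target_contrasts = np.array([0.02, 0.04, 0.08, 0.16])   # invented C_t values
p_yes_no_context = np.array([0.10, 0.30, 0.60, 0.85])   # invented baseline yes rates

p_yes_in_context = {
    0.01: np.array([0.45, 0.70, 0.95, 1.00]),            # weak context: facilitation
    0.40: np.array([0.05, 0.15, 0.40, 0.70]),            # strong context: suppression
}

for c_c, p_yes in p_yes_in_context.items():
    cfi = np.mean(p_yes - p_yes_no_context)
    print(f"C_c = {c_c:.2f}: CFI = {cfi:+.2f}")
```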
|
Thus, the experimental constraints on x_{s+} are provided essentially by the ν and ν̄ dimuon data sets. Following Refs. [3,9], we determine the uncertainty range of x_{s+} by the 90% confidence criteria on the dimuon production data sets. This range is 0.018 < x_{s+} < 0.040. The two sets of PDFs that represent the best fits corresponding to the lower (upper) bound value of x_{s+} will be referred to as CTEQ6.5S1 (CTEQ6.5S2). The variation of x_{s+} [footnote 8: The parton number integral is strongly correlated with the normalization of the ν and ν̄ dimuon production data sets compared to the theoretically calculated inclusive charm production cross section. There are various sources of uncertainty on this overall factor: experimental (global and energy-dependent) normalization of the total cross sections (∼2-5%), fragmentation function of charm quark to charmed hadrons, branching ratio of charmed hadron decay to muon (∼10%), ..., etc. These are taken into account according to our standard uncertainty analysis. The limits of x_{s+} obtained above correspond to ±20% overall variation of the normalization factor, as determined by this analysis procedure.] As the magnitude of x_{s+} varies, the shape of s+(x) also adjusts to best fit the global data. A plot of s+(x) for these PDFs will be shown in the next section. | Is data availability statement | no |
|
The development of mesoscale circulation depends on the length scale of the land heterogeneity. Mesoscale circulation is sensitive to spatial scale and is typically generated at scales of 10-100 km 16,41 . To investigate the sensitivity of the cloud inhibition effect induced by mesoscale circulation to spatial scale, we reestimated ΔCloud using MODIS cloud **data** resampled to different spatial resolutions. We find that with reduced resolutions of cloud data, the spatial coverage of cloud inhibition shrinks from ~37% at 0.05° to ~24-28% at 1°, while cloud enhancement becomes more dominant (from ~63% to ~72-76%) (Supplementary Fig. 12 and Supplementary Table 2). This implies that at coarser scales (e.g., typical GCM spatial resolutions), at which mesoscale processes become less important (i.e., less cloud inhibition), observation- and model-based results tend to converge on cloud enhancement of forests. | Is data availability statement | no |
|
After 10 days of Ti treatment, 950 and 1277 genes were detected as up- and downregulated, respectively, in the root. We also determined that 57 and 79 genes were up- and downregulated, respectively, in shoots. Venn analysis helped us to determine sets of genes that are commonly or uniquely upregulated and downregulated at 10 days across the two tested tissues. In the case of upregulated genes, only six genes are shared, whereas 944 and 51 are specific for root and shoot, respectively (Figure 4A). In the case of downregulated genes, 20 genes were shared, whereas 1257 and 59 were specific for root and shoot, respectively (Supplementary Figure S4, Supplementary Tables S3-S6). These **data** suggest that Ti treatment causes changes in the transcript level of a large number of genes in the roots, while it causes changes in transcript level in only a few genes in the shoot, and that the responses to Ti seem to be mainly organ specific. | Is data availability statement | no |
|
To further characterize the biological processes that are activated in response to Ti treatment, we manually analyzed DEGs induced at three days of Ti treatment and performed a Gene Ontology (GO) enrichment analysis of DEGs induced 10 days after the treatment with Ti. In the case of three days after the treatment, we found that two of the four upregulated genes in roots of Ti-treated plants (CA1 and RBCS2B) are targets of OXIDATIVE STRESS 2 (OXS2). OXS2 is a tandem zinc finger transcription factor previously identified to play a crucial role in salt tolerance in Arabidopsis. The third gene, AT3G48200, encodes a transmembrane protein whose knockout mutant is tolerant to heat stress and insensitive to abscisic acid (ABA), a phytohormone playing an important role in osmotic stress responses (Luhua et al., 2013). The fourth Ti-responsive gene at three days of Ti treatment, AT5G51585, is still uncharacterized. These **data** suggest that Ti treatment might initially induce only genes involved in tolerance to osmotic stress. | Is data availability statement | no |
|
We need to prepare an experimental dataset D_E and an observational dataset D_O. For the experimental dataset, we directly use **data** from San Diego, which includes n_E^(1) = 6978 people in the treatment group and n_E^(0) = 1154 people in the control group. For the observational dataset, we consider injecting synthetic confounding into the Riverside **data** (which originally include N_1 = 4405 people in the treatment group and N_0 = 1040 people in the control group). | Is data availability statement | no |
|
ICD codes were used to estimate age-cause specific mortality rates by sex, quintile of deprivation and Census year from five broad underlying causes of death: circulatory diseases, respiratory diseases, neoplasms, external causes and other causes. All deaths in our data included an ICD code entry; however, deaths that were ill defined or did not refer to an identifiable ICD code were classified as 'other'. Cause of death categories were mutually exclusive, and harmonisation ensured comparability of causes over time. The ICD codes included in each category are available in appendix 5. The absolute number of deaths and proportions of deaths in each cause-specific category are given below in Table 2. | Is data availability statement | no |
|
CLM5 is the land component of a state-of-the-art earth system model Community Earth System Model 2 (Ref. 34 ). The CLM5 simulation was conducted at the spatial resolution of 0.5°from 1997 to 2010, driven by a revised climatology GSWP3 as the atmospheric forcing (http://hydro.iis.utokyo.ac.jp/GSWP3/), with the plant phenology prescribed from satellite products, the land cover of 2000, and the separated soil columns configuration 76,77 . The years 1997 to 2001 were the spinup period and excluded from the analysis (please see detailed description in ref. 75 ). In CLM, different types of vegetation within a grid cell are represented as separated tiles of different plant functional types (PFTs). We used subgrid PFTlevel model outputs to calculate sensible heat differences between different land cover types within the same model grid. The subgrid tiles within a model grid cell share the same atmospheric forcing, therefore replicating the assumption of similar meteorological conditions of the space-for-time approach 12 . To match the CLM5 model resolution, the dominant land cover types for forests and non-forest of each moving window were upscaled to 0.5°using the ESA land cover **data** ( Supplementary Fig. 9). Because CLM adopted a different land classification scheme, we created a look-up table to convert CLM land cover to the IGBP classification scheme (Supplementary Table 3). The differences in the sensible heat flux (ΔH) between a specific forest and a non-forest type can be extracted from the sensible heat values of the corresponding PFTs. | Is data availability statement | no |
|
Averaging methods determine baselines by averaging the consumption on past days that are similar (e.g., in temperature or workday) to the event day. There are many variants, such as weighted averaging and using an adjustment factor to account for variations between the event day and prior similar days. A detailed comparison of different averaging methods is offered in [8][9][10]. While averaging methods are attractive because of their simplicity, they suffer from estimation biases that can be substantial [10][11]. Also, these methods require significant **data** access, especially for residential DR programs [12]. | Is data availability statement | no |
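The sketch below illustrates one plausible form of the averaging method with an additive adjustment factor described above. It is only a rough example under stated assumptions: the column name, the list of similar days, and the number of pre-event adjustment hours are all invented, and real DR programs use more elaborate variants.

```python
# Rough sketch of an averaging-method baseline with an additive adjustment
# factor, in the spirit of the variants described above. All column names,
# the set of similar days, and the adjustment window are assumptions.
import pandas as pd

def averaged_baseline(load: pd.DataFrame, event_day: str,
                      similar_days: list[str], adj_hours: int = 3) -> pd.Series:
    """load: hourly kWh indexed by timestamp; returns a 24-value baseline."""
    day = lambda d: load.loc[d, "kwh"].to_numpy()
    # Step 1: simple average of the same hours on prior similar (non-event) days.
    base = sum(day(d) for d in similar_days) / len(similar_days)
    # Step 2: additive adjustment from pre-event morning hours of the event day,
    # to account for weather or occupancy differences on the event day itself.
    adjustment = day(event_day)[:adj_hours].mean() - base[:adj_hours].mean()
    return pd.Series(base + adjustment, name=f"baseline_{event_day}")

# Example usage with hypothetical data:
# baseline = averaged_baseline(meter_df, "2021-07-15",
#                              ["2021-07-08", "2021-07-09", "2021-07-12"])
```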
|
To evaluate the phenotypic relationships among accessions, the minimum distances between all pairs of individual accessions were visualized by plotting them on a dendrogram (Figure 2). To gain a better understanding of the overall diversity of nightshade accessions, the **data** were analyzed using cluster analysis, which revealed the similarities between the genotypes. SRetrflx, SABGA and Nshad9 had the lowest dissimilarity index (0.00) (Table 5) | Is data availability statement | no |
|
In this section, the **data** set of joints and COP in AP from a typical subject is evaluated by the determinism test and stationarity test. Figures 2(a), 2(c), 2(e), and 2(g). This clearly confirms the deterministic nature of human balance system. The average cross-prediction error for all possible combinations of and is given in Figures 2(b), 2(d), 2(f), and 2(h). The average values of all are 0.1839, 0.2215, 0.1398, and 0.8951 (for the time series of hip, knee, ankle, and COP in AP, resp.). Since each maximal cross-prediction error is not significantly larger than the average, the studied time series are clearly stationary. | Is data availability statement | no |
|
For device-independent color reproduction, it is necessary to accurately characterize each individual color input and output device and transform image **data** into appropriate devicedependent versions based on the characterization. In order to isolate the end user from the nitty-gritty of handling color characterization information and transformations, color management systems have been proposed to automate this task. For a discussion of the systems issues in color management and existing color management schemes, the reader is referred to recent presentations on the topic [334]- [336]. A notable advance in this direction is the emergence of a widely accepted standard [86], [337] to facilitate the communication of device characterizations. | Is data availability statement | no |