input_data
stringlengths
151
6.14k
label
int32
0
32
label_level_1
int32
0
6
label_level_2
int32
0
4
Aiming the problem of parking information release is large flow, and release information is not timely and not accurately, the internet of vehicle (IOV) of the urban traffic parking system is proposed based on the parking lot sensor network and sensor combination and its electrical control circuits. This system will be use the core technology of internet of things (IOT) and combinate the timer and elastic pressure switch with a pressure sensor and its electrical circuits to realize the parking information acquisition, release, query, reservation and parking navigation with reliable and accurately. Experimental results shows the electrical control circuits will be better management the parking information with accurately and the car user uses the vehicle terminal to search the target parking lot and to receive the parking information. This system will be to alleviate the urban traffic and improve the utilization efficiency of urban parking lot, for the future of the city and social development to provide better technical support and basis.
6
1
2
Background: Rheumatoid arthritis (RA) and other rheumatic conditions not only fundamentally affect patients' quality of life and physiological needs but are also negatively associated with work ability. The costs of poor work ability, which, in sum, are more than treatment costs, pose an economic burden to society and patients. Work ability in RA appears to be multifactorial; symptoms such as pain, swelling, and stiffness play a major role, as these directly affect functional disability. Also, RA patients typically suffer from reduced muscle strength. Lower extremity function and grip strengths especially impair their quality of life. However, the role of muscle strength and disease activity as determinants of work ability have not yet been studied. Objective: The primary objective of this study is to compare work ability in working-age participants with seropositive RA and with high and low disease activity; the secondary objective is to evaluate the association of muscle strength, functional ability, and frailty with work ability. Methods: This monocentric cross-sectional study will be conducted at a rheumatologic outpatient clinic and day hospital with approximately 100 seropositive RA patients aged <65 years. A clinical disease activity index as a measure for rheumatoid disease activity will be assessed during the patients' routine visits at the clinic. Work ability, frailty, and functional disability will be evaluated with (self-reported) questionnaires as well as with physical tests (Work Ability Index/Score; Health Assessment Questionnaire Disability Index; Survey of Health, Ageing, and Retirement in Europe Frailty Instrument; Short Physical Performance Battery). Muscle strength will be determined with dynamometer measurements of isometric hand grip strength and quadriceps femoris muscle contraction strength. Sleep quality (Medical Outcomes Study Sleep Scale) and sexual functioning as physiological needs will additionally be determined with self-reported questionnaires. Results: For this study funding has already been awarded and enrollment has been completed. Data are currently being evaluated. Conclusions: This study will evaluate the association of work ability with modifiable parameters such as muscle strength and functional ability. It will provide further insights into work ability in RA and its associated risk factors. Any evidence of association will motivate further research, and the findings might encourage interventions focused specifically on improving muscle strength and lower extremity function to positively affect work ability.
26
5
3
The traditional prehospital management of trauma victims with potential spinal injury has become increasingly questioned as authors and clinicians have raised concerns about over-triage and harm. In order to address these concerns, the Norwegian National Competence Service for Traumatology commissioned a faculty to provide a national guideline for pre-hospital spinal stabilisation. This work is based on a systematic review of available literature and a standardised consensus process. The faculty recommends a selective approach to spinal stabilisation as well as the implementation of triaging tools based on clinical findings. A strategy of minimal handling should be observed.
26
5
3
The reductionist approach of dissecting biological systems into their constituents has been successful in the first stage of the molecular biology to elucidate the chemical basis of several biological processes. This knowledge helped biologists to understand the complexity of the biological systems evidencing that most biological functions do not arise from individual molecules; thus, realizing that the emergent properties of the biological systems cannot be explained or be predicted by investigating individual molecules without taking into consideration their relations. Thanks to the improvement of the current -omics technologies and the increasing understanding of the molecular relationships, even more studies are evaluating the biological systems through approaches based on graph theory. Genomic and proteomic data are often combined with protein-protein interaction (PPI) networks whose structure is routinely analyzed by algorithms and tools to characterize hubs/bottlenecks and topological, functional, and disease modules. On the other hand, co-expression networks represent a complementary procedure that give the opportunity to evaluate at system level including organisms that lack information on PPIs. Based on these premises, we introduce the reader to the PPI and to the co-expression networks, including aspects of reconstruction and analysis. In particular, the new idea to evaluate large-scale proteomic data by means of co-expression networks will be discussed presenting some examples of application. Their use to infer biological knowledge will be shown, and a special attention will be devoted to the topological and module analysis.
28
6
0
There is substantial evidence that non-B27 major histocompatibility complex (MHC) genes are associated with spondyloarthritis (SpA). Studies in Mexican and Tunisian populations demonstrated the association of SpA and human leukocyte antigen (HLA) B15. The purpose of this study was to evaluate the association of HLA-A, B, and DR antigens in a group of Colombian patients with a diagnosis of SpA. A total of 189 patients and 100 healthy subjects were included in the present study. All subjects underwent a complete characterization of HLA alleles A, B, and DR. Of the 189 studied patients, 35 were reactive arthritis (ReA), 87 were ankylosing spondylitis (AS), and 67 undifferentiated SpA (uSpA). According to the Assessment of Spondyloarthritis International Society (ASAS) criteria, 167 were axial SpA (axSpA) and 171 were peripheral SpA (pSpA). 63.8% were men, with a mean age of 35.9 +/- 12.7 years. 40.7% (77/189) of patients were HLA-B27 positive of which 52.9% had AS and 42.5% axSpA. 23.2% (44/189) of patients were HLA-B15 positive: 23.8% were uSpA, 12.57% were axSpA, and 11.7% were pSpA. In addition, HLA-DRB1*01 was associated with AS (58.6%) and axSpA (42.5%). Also, HLA-DRB1*04 was present in 62 patients with AS (71.2%) and in 26 with axSpA (15.5%). In this population, we found a strong association between the presence of HLA-B27 and the diagnosis of axSpA and AS, but the HLA-B15 is also significantly associated with all subtypes of the disease, predominantly with pSpA. Additionally, HLA-DR1 and DR4 were associated in a cohort of patients with SpA from Colombia.
26
5
3
Machine learning models for site of metabolism (SoM) prediction offer the ability to identify metabolic soft spots in low-molecular-weight drug molecules at low computational cost and enable data-based reactivity prediction. SoM prediction is an atom classification problem. Successful construction of machine learning models requires atom representations that capture the reactivity-determining features of a potential reaction site. We have developed a descriptor scheme that characterizes an atom's steric and electronic environment and its relative location in the molecular structure. The partial charge distributions were obtained from fast quantum mechanical calculations. We successfully trained machine learning classifiers on curated cytochrome P450 metabolism data. The models based on the new atom descriptors showed sustained accuracy for retrospective analyses of metabolism optimization campaigns and lead optimization projects from Bayer Pharmaceuticals. The results obtained demonstrate the practicality of quantum-chemistry-supported machine learning models for hit-to-lead optimization.
1
0
1
Waste materials generated from building demolition have become a great challenge to sustainable urban development due to its consumption of the limited landfill spaces, water pollution, energy consumption and harmful gas emissions. Proper management of demolition waste (DW) is a complex process and requires systematic thinking and analysis. Many methods have been proposed to study the environmental impact assessment of demolition waste management (DWM). However, it is found that the currently available studies pay little attention from the perspective of complex adaptive system (CAS) to consider the attitude and interaction of the heterogeneous stakeholders as well as the importance of green DWM which has a great influence on the effectiveness of DW management. The aim of this research is to simulate and explore how the change of attitude and the dynamic interaction among heterogeneous stakeholders can influence the environmental performance of DWM. To achieve this aim, a model for evaluating the environmental impact of DWM was developed by using an agent-based modeling (ABM) approach. The main factors considered in the model are the ratio of green deconstruction (i.e., building deconstruction) managers vs. conventional demolition (i.e. building destruction) managers, the ratio of green design managers (i.e. design for deconstruction) vs. conventional design managers, and the interaction behavior of heterogeneous stakeholders following the herd theory. In the model, the environmental impact assessment was quantified into four categories i.e. land resources, water resources, air resource and energy resources. The proposed model is demonstrated by using the data drawn from the Chinese construction industry. The results reveal that if the deconstruction method and the deconstruction oriented design are widely adopted by architects and engineers, the negative environmental impacts generated by DW can be reduced by at least 50%. Furthermore, the results provide valuable information for government departments to make decisions on how to improve environmental performance of DWM. (C) 2016 Elsevier Ltd. All rights reserved.
22
4
4
Beaver Slide is near 177.8km (110.5mi) on the Dalton Highway and the road gradient is approximately 11%, built on a hill side. Each year, soft spots, also commonly named as frost boils, were observed starting from late April and lasting for the entire summer. The frost boils have resulted in an extremely unsafe driving condition and frequent accident occurrences. Conventional repair methods cannot effectively solve this issue. A newly developed geotextile, which has high specific surface area, was installed in the selected test section to mitigate the frost-boil issue in August 2010. This type of geotextile can provide high wettability and relatively high suction (capillary force), consequently be able to laterally transport water (a high-directional transmissivity) under unsaturated conditions. Test results over the initial 2years had proved the effectiveness of the geotextile to alleviate frost heave and the subsequent thaw-weakening issue. However, there were still some concerns regarding its long-term performance, such as clogging of the microscopic drainage channels and mechanical failures. The data collected during the past 5 years were used to analyze and evaluate the effectiveness of the wicking fabric. A scanning electron microscope was used to explore the interaction between the wicking fabric and in situ soils, and to determine the condition of the fabric 5 years after the installation. (C) 2016 American Society of Civil Engineers.
19
4
1
Nitrogen (N) fertilizers are critical in today's agriculture especially in the United States. Leaching, methane and nitrous oxide emissions from N fertilizer use and the implications for global climate and water pollution raised serious concerns among environmentalists and agronomists. Prices of N fertilizers increased geometrically over the past few years. The projected increase in maize (Zea mays L.) ethanol production is expected to increase N fertilizer demand and prices. Hay prices are, however, staggering and producers are looking for alternatives to N fertilizers. This review paper assessed trends in fertilizer use and prices as well as factors affecting fixation and transfer of dinitrogen (N-2) in forage production systems. Additionally, economic implications of using N fertilizers and grass-legume mixtures are discussed. From the review, it was apparent that legumes have the potential to replace N fertilizers or at least complement the use of N fertilizers in forage production systems. However, N transfer in forage production systems is low. A lot more research is needed to answer the question Why legumes fix so much N-2 and transfer just a small proportion to other crops. Similarly, identifying compatible grass-legume species that enhance N-2 fixation will be a giant step towards reduced N fertilizer use and ensuring environmental sustainability. With the sharp increase in N fertilizer prices, the use of legumes in forage production systems seems promising.
22
4
4
Accurate estimates of biomass are required for relating ecosystem functioning to atmospheric carbon regulation. Biomass may be directly measured through field sampling, which can then be used to calibrate biomass predictions from remote sensing and/or modelling. Field sampling generally entails measuring the fresh mass of individual trees or shrubs and then estimating the moisture contents of a representative sub-samples, which are then used to calculate dry mass. Because any errors in the estimation of the moisture content (MC) correction are translated proportionally to the biomass prediction of an individual tree or shrub, care is required to ensure MC estimates are unbiased and as precise as possible. There are numerous different protocols currently applied to attain MC, with these differing in accuracy (bias and precision) and cost of implementation. A dataset of MC of above-ground biomass (AGB) of 1396 individuals (trees or shrubs) was used to assess which protocols for within- and among-individual sampling are likely to provide the most cost-effective estimates of MC within acceptable bounds of accuracy. Monte Carlo analysis was used to explore key sources of error in within-individual MC estimation. Results suggest these MC estimates may be based on at least the bole and crown components of AGB, with bias resulting if MC is based on stem wood only, particularly in young (or small) individuals. Little gain in accuracy was attained with more intensive sub-sampling (e.g. into foliage, twig, branches, bark, and stem wood components). Moreover, further efficiencies may be gained by applying existing empirical models to estimate the proportion of AGB that is crown based on easily measured variables such as stem diameter, thereby avoiding the resource-intensive process of partitioning to obtain fresh weights measurements of components. However to minimise bias, it is important to undertake MC sampling at each study site, and to stratify sampling among-individuals by both appropriate taxonomic grouping (e.g. plant functional type) and age-class. For a given plant functional type-by-size (or age) strata at a given site a precision of about 4% coefficient of variation of the average MC estimate can be achieved with intensive within- and among-individual sampling. However a precision of 8-10% is achievable using our recommended less intensive but more efficient protocol; derive an average MC for at least six individuals, and for each individual, intensively sub-sample bole and crown components for MC, which is then applied to the fresh weights of these components. This latter estimate may be obtained from partitioning of the AGB, or for the highest efficiency, from predictions obtained from the application of existing representative empirical relationships of partitioning based on the size of the individual. (C) 2017 Elsevier B.V. All rights reserved.
20
4
2
Maintenance and repair of the highway network system are major expenses in the state budget. For this reason various concerned organizations are pointing out the need for developing an intelligent and efficient pavement performance model that can prioritize pavement maintenance and rehabilitation works. Such models can forecast the remaining pavement service life and pavement rehabilitation needs, and can help in the formulation of pavement maintenance and strengthening programmes which will reduce the road agency and road user costs. The flexible pavement performance or deterioration models involve the complex interaction between vehicles and the environment, and the structure and surface of the pavement. Performance models relating to the pavement distress conditions like, cracking, raveling, potholing, and roughness are analyzed and developed by various researchers. But most of these models are found applicable to a particular set of traffic or environment conditions, thus highlighting the need of model(s) that can work in varied conditions satisfactorily. The paper presents a detailed review of various pavement performance models to examine the role of factors related to pavement materials, environmental conditions, type of traffic and volume of traffic, and to identify the limitations and gaps in the present knowledge on such models.
19
4
1
Background: Protein-protein interactions (PPIs) can offer compelling evidence for protein function, especially when viewed in the context of proteome-wide interactomes. Bacteria have been popular subjects of interactome studies: more than six different bacterial species have been the subjects of comprehensive interactome studies while several more have had substantial segments of their proteomes screened for interactions. The protein interactomes of several bacterial species have been completed, including several from prominent human pathogens. The availability of interactome data has brought challenges, as these large data sets are difficult to compare across species, limiting their usefulness for broad studies of microbial genetics and evolution. Results: In this study, we use more than 52,000 unique protein-protein interactions ( PPIs) across 349 different bacterial species and strains to determine their conservation across data sets and taxonomic groups. When proteins are collapsed into orthologous groups (OGs) the resulting meta-interactome still includes more than 43,000 interactions, about 14,000 of which involve proteins of unknown function. While conserved interactions provide support for protein function in their respective species data, we found only 429 PPIs (similar to 1% of the available data) conserved in two or more species, rendering any cross-species interactome comparison immediately useful. The meta-interactome serves as a model for predicting interactions, protein functions, and even full interactome sizes for species with limited to no experimentally observed PPI, including Bacillus subtilis and Salmonella enterica which are predicted to have up to 18,000 and 31,000 PPIs, respectively. Conclusions: In the course of this work, we have assembled cross-species interactome comparisons that will allow interactomics researchers to anticipate the structures of yet-unexplored microbial interactomes and to focus on well-conserved yet uncharacterized interactors for further study. Such conserved interactions should provide evidence for important but yet-uncharacterized aspects of bacterial physiology and may provide targets for anti-microbial therapies.
32
6
4
The paper presents a study leading to a new acute toxicity test on embryonic and juvenile organisms of the great pond snail (Lymnaea stagnalis Linnaeus). Sulfuric acid, nitric acid, and ammonium hydroxide were used as waterborne toxicants in laboratory experiments. The exposure time was 24 h. Tests were conducted in 5-10 replications for each toxicant. The toxicity of the substances was classified according to different scales and the test's sensitivity was compared to that of the commonly used bioindicator Daphnia magna Straus. The assessment of toxicity impact was supported by microscopic observations. The probit method was used as a parametric statistical procedure to estimate LC50 and the associated 95% confidence interval. Our study showed that the early developmental stages of Lymnaea stagnalis are very sensitive bioindicators, making it possible to detect even very low levels of the above-mentioned water toxicants. The highest toxicity is shown by ammonium hydroxide with LC50/24h values, respectively, 24.27 for embryos and 24.72 for juvenile forms, and the lowest is shown by nitric acid ions with LC50/24h values, respectively, 105.19 for embryos and 170.47 for juvenile forms. It is highly cost-effective due to simple and efficient breeding and the small size of the organisms in the bioassay population. Compared with Daphnia magna, relatively low concentrations of toxicants caused a lethal effect on embryonic and juvenile organisms of the great pond snail. Owing to their common occurrence and sensitivity, early developmental forms of Lymnaea stagnalis can be a valuable new tool in biomonitoring of the freshwater environment.
22
4
4
Persistent organic pollutants like organochlorine pesticides continue to contaminate large areas worldwide raising questions concerning their management. We designed and tested a method to link soil and water pollution in the watershed of the Galion River in Martinique. We first estimated the risk of soil contamination by chlordecone by referring to past use of land for banana cultivation and took 27 soil samples. We then sampled surface waters at 39 points and groundwater at 16 points. We tested three hypotheses linked to the source of chlordecone pollution at the watershed scale: (i) soils close to the river, (ii) soils close to the sampling point, (iii) throughout the sub-watershed generated at the sampling point. Graphical and statistical analysis showed that contamination of the river increased when it passed through an area with contaminated plots and decreased when it passed through area not contaminated by chlordecone. Modeling showed that the entire surface area of the watershed contributed to river pollution, suggesting that the river was mainly being contaminated by the aquifers and groundwater flows. Our method proved to be a reliable way to identify areas polluted by chlordecone at the watershed scale and should help stakeholders focus their management actions on both hot spots and the whole watershed. (C) 2016 Elsevier B.V. All rights reserved.
22
4
4
Machine Design is the most perspective field of study in the Faculty of Mechanical Engineering (FME) at the University of West Bohemia with regards to the demand for graduates in this field. These days, the main issue of higher education in the Czech Republic related to Machine Design is the lack of opportunities for students of Machine Design to acquire practical knowledge and experience in the field. The problem is that engineering companies offer work experience only to individual students as they try to hire them after graduation. But engineering companies are not motivated enough to contribute to the Machine Design teaching process by providing knowledge and experience to the majority of students. As this is not going to change soon, universities have to provide real applications to its students instead. Currently, there are not enough opportunities for students to acquire practical knowledge in the field during their studies. The required amount of work experience for engineering students in the Czech Republic is only one week, which is insufficient in comparison with the German system, where internship semesters are a mandatory part of studying at a Fachhochschule (University of Applied Sciences). It is necessary to provide practical information and real applications in class to compensate for the lack of practical experience. FME is concerned in this matter and it has launched the project for enhancement of the teaching process. A new concept of electronic learning materials for Machine Design was developed as a part of the project. The learning materials were developed by academics in cooperation with local engineering companies to provide students and academics at FME with real design projects reworked for learning purposes. The Machine Design teaching process is complicated due to the complexity of the mechanical design process, as it has its own particularities which need to be considered while creating learning materials. When describing the mechanical design process, not only the design process itself has to be described. Also the production processes, project schedules, economic aspects of the design project, legislative, technical standards and regulations have to be described as well. As these parts of the mechanical design process affect each other, it is important to describe them in the context of the mechanical design process to show the interconnection between them. The electronic form of learning materials was selected as it offers many advantages over the standard printed form. The main reason for selecting the electronic form of learning materials is the ability to work with Computer-aided engineering (CAE) models. Three-dimensional CAE models replaced drawing boards in every engineering company several years ago and thus it is important to include CAE models in the Machine Design teaching process. Thanks to the technology of exporting CAE models to the standard portable document format (PDF) it is possible to modify CAE models for learning purposes and use them as a learning tool. CAE models can be enhanced with a large amount of information regarding a design project such as manufacturing processes, materials or component functions and can be supplemented by technical drawings, illustrated project descriptions and further information. All the learning material is converted into a single PDF file so that every student can access it anywhere, anytime. Versatility and interactivity are the biggest benefits of the new electronic learning materials. The new electronic learning materials can be used when explaining new subject matter to show its application, or as a basis for students' assignments, individual work, teamwork or dissertations. The sample electronic learning materials are being produced and will be made available to students through the courseware of the University of West Bohemia. This paper intends to present the learning materials and describe their form, structure and content and their benefits for academics and students of Machine Design.
16
3
3
This study was intended to analyze the intersection of experience of sexual stigma low-socioeconomic status, and suicide attempt amongst young Brazilians (11-24 years old). In each of the data collection periods (2004-2006: n = 7185; 2010-2012: n = 2734), participants completed a questionnaire-based instrument. Network analysis provided support for a Minority Stress Model, oriented around whether participants had experienced sexual stigma. Although suicide attempts decreased by 20% for participants who had not experienced sexual stigma, there was a 60% increase for those who had experienced sexual stigma. Of particular note were the increases in rates of reported community and familial physical assault, molestation, and rape for those who had experienced sexual stigma. An analysis of centrality statistics demonstrated that both experiences of this Minority Stress Model were fundamentally different, and that those disparities increased over the time frame observed in this study. At the center of this model, shortest paths statistics exhibited a direct conditioned connection between experiencing sexual stigma and suicide attempts. We discuss the social and historical contexts that contributed to these dynamics, and emphasize the need for policy change.
8
2
0
Herein we report the synthesis and activity of an enzyme-directed immunostimulant with immune cell activation mediated by -galactosidase, either exogenously added, or on B16 melanoma cells. Covalent attachment of a -galactopyranoside to an imidazoquinoline immunostimulant at a position critical for activity resulted in a pro-immunostimulant that could be selectively converted by -galactosidase into an active immunostimulant. The pro-immunostimulant exhibited -galactosidase-directed immune cell activation as measured by NF-B transcription in RAW-Blue macrophages or cytokine production (TNF, IL-6, IL-12) in JAWSII monocytes. Conversion of the pro-immunostimulant into an active immunostimulant was also found to occur using -galactosidase-enriched B16 melanoma cells. In co-culture experiments with either immune cell line, -galactosidase-enriched B16 cells effected activation of bystander immune cells.
31
6
3
Recent years have witnessed a processor development trend that integrates central processing unit (CPU) and graphic processing unit (GPU) into a single chip. The integration helps to save some host-device data copying that a discrete GPU usually requires, but also introduces deep resource sharing and possible interference between CPU and GPU. This work investigates the performance implications of independently co-running CPU and GPU programs on these platforms. First, we perform a comprehensive measurement that covers a wide variety of factors, including processor architectures, operating systems, benchmarks, timing mechanisms, inputs, and power management schemes. These measurements reveal a number of surprising observations.We analyze these observations and produce a list of novel insights, including the important roles of operating system (OS) context switching and power management in determining the program performance, and the subtle effect of CPU-GPU data copying. Finally, we confirm those insights through case studies, and point out some promising directions to mitigate anomalous performance degradation on integrated heterogeneous processors.
4
0
4
Representational State Transfer (REST) web services has gained popular acceptance over the world-wide-web as a straightforward choice to the traditional or SOAP-based services. However, at present the REST-based service implementation does not have pre-defined security protection methods. In this paper, we present a defense mechanism against REST-based web service attacks called the REST-IDS, for a defense-in-depth network security in web service layer. REST-IDS is an intelligent mechanism that employs statistical approach to the stateof-the-art Text Mining-Based Anomaly Detection (TMAD) model to detect unknown novel vulnerabilities, which is sensitive to payload attacks.
2
0
2
The main aim of this study was to objectify the treatment assignment criteria used in a clinical centre for addiction treatment in Spain. A sample of 162 patients (87 inpatients and 75 outpatients) who sought treatment between 2010 and 2012 was assessed. Clinical characteristics (addiction severity, psychopathological symptoms, impulsiveness and maladjustment) of the two treatment groups (inpatient and outpatient) into which patients were assigned according to the clinical criteria of therapists were analysed to identify which variables were more relevant for patient placement. Moreover, the therapeutic progression of patients who met and did not meet the assignment criteria received was studied. According to the results, a score above 4 in the family/social support area of the European Addiction Severity Index (EuropASI), or, in cases of a score between 2 and 4 in the family/social area of EuropASI, a score above 2 in the partner subscale of the Maladjustment Scale correctly classified 73.5% of cases (96.6% of inpatients and 46.7% of outpatients). Comparisons of therapeutic results depending on matching or mismatching these assignment criteria showed a larger effect size in mismatching patient assignment criteria for outpatient treatment. The results obtained in this study provide an objective criterion for addicted patient placement. Moreover, from a cost-effective perspective, they question the necessity of inpatient treatment in most cases, demonstrating that outpatient treatment is a sufficient level of care. This study addresses the approach to assigning patients to the treatment modality that best fits them, implementing the least expensive level of care needed to achieve treatment success. (C) 2017 Elsevier Inc. All rights reserved.
23
5
0
The software support for simulation of electrical circuits has been developed for more than sixty years. Currently, the standard tools for simulation of analogous circuits are the simulators based on the open source package Simulation Program with Integrated Circuit Emphasis generally known as SPICE (Biolek 2003). There are many different applications that provide graphical interface and extended functionalities on the basis of SPICE or, at least, using SPICE models of electronic devices. The author of this paper performed a simulation of a circuit that acts as an electronic diode in Multisim and provides a comparison of the simulation results with the results obtained from measurements on the real circuit.
6
1
2
This paper presents a method of the determination of a minimal stable realisation of the fractional continuous-time linear system with different fractional orders. For the proposed method, a digraph-based algorithm was constructed. In this paper, we have shown how we can perform the transfer matrix using electrical circuits consisting of resistances, inductances, capacitances and source voltages. The proposed method was discussed and illustrated with some numerical examples.
6
1
2
Ranking items is an essential problem in recommendation systems. Since comparing two items is the simplest type of queries in order to measure the relevance of items, the problem of aggregating pairwise comparisons to obtain a global ranking has been widely studied. Furthermore, ranking with pairwise comparisons has recently received a lot of attention in crowdsourcing systems where binary comparative queries can be used effectively to make assessments faster for precise rankings. In order to learn a ranking based on a training set of queries and their labels obtained from annotators, machine learning algorithms are generally used to find the appropriate ranking model which describes the data set the best. In this paper, we propose a probabilistic model for learning multiple latent rankings by using pairwise comparisons. Our novel model can capture multiple hidden rankings underlying the pairwise comparisons. Based on the model, we develop an efficient inference algorithm to learn multiple latent rankings as well as an effective inference algorithm for active learning to update the model parameters in crowdsourcing systems whenever new pairwise comparisons are supplied. The performance study with synthetic and real-life data sets confirms the effectiveness of our model and inference algorithms.
1
0
1
In the design of nuclear power plants, various natural circulation passive cooling systems are considered to remove residual heat from the reactor core in the event of a power loss and maintain the plant's safety. These passive systems rely on gravity differences of fluids, resulting from density differentials, rather than using an external power-driven system. Unfortunately, a major drawback of such systems is their weak driving force, which can negatively impact safety. In such systems, there is a temperature difference between the heat source and the heat sink, which potentially offers a natural platform for thermoelectric generator (TEG) applications. While a previous study designed and analyzed a TEG-based passive core cooling system, this paper considers TEG applications in other passive cooling systems of nuclear power plants, after which the concept of a TEG-based passive cooling system is proposed. In such a system, electricity is produced using the system's temperature differences through the TEG, and this electricity is used to further enhance the cooling process.
5
1
0
Sediment transport is an important aspect of soil erosion, and sediment transport capacity (T-c) is a key to establishing process -based erosion models. A lot of studies exist that have determined T-c for overland flow, however, few studies have been conducted to determine T-c for loess sediments on steep slopes. Experimental data for this region are thus needed. The objectives of this study are to formulate new equations to describe T-c and evaluate the suitability of these equations for loess sediments on steep slopes. The slope gradients in this study ranged from 10.51% to 38.39%, and flow discharges per unit width varied from 1.11 x 10(-3) m(2) s(-1) to 3.78 x 10(-3) m(2) s(-1). Results showed that 11 increased as a power function with flow discharge and slope gradient, with R-2 = 0.99 and Nash -SuT(c)liffe model efficiency (NSE) = 0.99. T, was more sensitive to flow discharge than slope gradient. T-c increased as a power function with mean flow velocity, which was satisfied to predict T-c with R-2 = 0.99 and NSE = 0.99. Shear stress (R2 = 0.89, NSE = 0.88) was also a good predictor of T-c, and stream power (R-2 = 0.96, NSE = 0.96) was a better predictor of T-c than shear stress. However, unit stream power was not a good predictor to estimate T-c in our study, with R-2 = 0.63 and NSE = 0.62. These findings offer a new approach for predicting T-c for loess sediments on steep slopes. 2016 Elsevier B.V. All rights reserved.
14
3
1
This paper proposes an image encryption scheme based on Cellular Automata (CA). CA is a self-organizing structure with a set of cells in which each cell is updated by certain rules that are dependent on a limited number of neighboring cells. The major disadvantages of cellular automata in cryptography include limited number of reversal rules and inability to produce long sequences of states by these rules. In this paper, a non-uniform cellular automata framework is proposed to solve this problem. This proposed scheme consists of confusion and diffusion steps. In confusion step, the positions of the original image pixels are replaced by chaos mapping. Key image is created using non-uniform cellular automata and then the hyper-chaotic mapping is used to select random numbers from the image key for encryption. The main contribution of the paper is the application of hyper chaotic functions and non-uniform CA for robust key image generation. Security analysis and experimental results show that the proposed method has a very large key space and is resistive against noise and attacks. The correlation between adjacent pixels in the encrypted image is reduced and the amount of entropy is equal to 7.9991 which is very close to 8 which is ideal.
3
0
3
The present study presents a new experimental technique to measure friction angle between soil and Geosynthetic Clay Liner (GCL). The method in question avoids some deficiencies observed on the inclined plane and pullout tests. Moreover, the technique allows observing the GCL tensile behaviour. The experimental frame is easy to build in usual geotechnical laboratory. The one employed is made-up in civil engineering Department of Ouargla University (Algeria). It is usable for testing both GCLs and other geosynthetic materials. Also, it permits to apply various experimental conditions (like slide velocity, confining pressure and water content) to the tested materials. The present method highlights that the soil-GCL interaction is, actually, a combination of two loading forces: soil-GCL interface friction and pure traction of the GCL material. The obtained results allow evaluating both soil-GCL angle of friction and intrinsic stiffness of the GCL in relation with the confining pressure.
19
4
1
Like most other things, fracking has its good and bad points. In the former regard, it is a technological breakthrough that can increase the supplies of energy of the entire economy. In the latter, it has been linked with an increased incidence of earthquakes and water pollution, surely negatives. As well, there is some evidence fracking would not exist, at least not to the present extent, were it not for government subsidies, which, we argue, misallocate resources.
22
4
4
Background and aimsCognitive impairment has been associated with excessive alcohol use, but its neural basis is poorly understood. Chronic excessive alcohol use in adolescence may lead to neuronal loss and volumetric changes in the brain. Our objective was to compare the grey matter volumes of heavy- and light-drinking adolescents. DesignThis was a longitudinal study: heavy-drinking adolescents without an alcohol use disorder and their light-drinking controls were followed-up for 10 years using questionnaires at three time-points. Magnetic resonance imaging was conducted at the last time-point. SettingThe area near Kuopio University Hospital, Finland. ParticipantsThe 62 participants were aged 22-28years and included 35 alcohol users and 27 controls who had been followed-up for approximately 10 years. MeasurementsAlcohol use was measured by the Alcohol Use Disorders Identification Test (AUDIT)-C at three time-points during 10years. Participants were selected based on their AUDIT-C score. Magnetic resonance imaging was conducted at the last time-point. Grey matter volume was determined and compared between heavy- and light-drinking groups using voxel-based morphometry on three-dimensional T1-weighted magnetic resonance images using predefined regions of interest and a threshold of P<0.05, with small volume correction applied on cluster level. FindingsGrey matter volumes were significantly smaller among heavy-drinking participants in the bilateral anterior cingulate cortex, right orbitofrontal and frontopolar cortex, right superior temporal gyrus and right insular cortex compared to the control group (P<0.05, family-wise error-corrected cluster level). ConclusionsExcessive alcohol use during adolescence appears to be associated with an abnormal development of the brain grey matter. Moreover, the structural changes detected in the insula of alcohol users may reflect a reduced sensitivity to alcohol's negative subjective effects.
23
5
0
Smart environments possess devices that collaborate to help the user non-intrusively. One possible aid smart environment offer is to anticipate user's tasks and perform them on his/her behalf or facilitate the action completion. In this paper, we propose a framework that predicts user's actions by learning his/her behavior when interacting with the smart environment. We prepare the datasets and train a predictor that is responsible to decide whether a target transducer value should be changed or not. Our solution achieves a significant improvement for all target transducers studied and most combinations of parameters yields better results than the base case.
18
4
0
It is hard to estimate optical flow given a realworld video sequence with camera shake and other motion blur. In this paper, we first investigate the blur parameterisation for video footage using near linear motion elements. We then combine a commercial 3D pose sensor with an RGB camera, in order to film video footage of interest together with the camera motion. We illustrate that this additional camera motion/trajectory channel can be embedded into a hybrid framework by interleaving an iterative blind deconvolution and warping based optical flow scheme. Our method yields improved accuracy within three other state-of-the-art baselines given our proposed ground truth blurry sequences; and several other realworld sequences filmed by our imaging system.
0
0
0
The discrete-time robust disturbance attenuation problem for the n-degrees of freedom (dof) mechanical systems with uncertain energy function is considered in this paper. First, it is shown in the continuous time-setting that the robust control problem of n-dof mechanical systems can be reduced to a disturbance attenuation problem when a specific type of control rule is used. Afterwards, the robust disturbance attenuation problem is formulised as a special disturbance attenuation problem. Then, the discrete-time counterpart of this problem characterised by means of L-2 gain is given. Finally, a solution of the problem via direct-discrete-time design is presented as a sufficient condition. The proposed discrete-time design utilizes discrete gradient of the energy function of considered system. Therefore, a new method is also proposed using the quadratic approximation lemma to construct discrete gradients for general energy functions. The proposed direct-discrete-time design method is used to solve the robust disturbance attenuation problem for the double pendulum system. Simulation results are given for the discrete gradient obtained with the method presented in this paper. Note that the solution presented here for the robust disturbance attenuation problem give an explicit algebraic condition on the design parameter, whereas solution of the same problem requires solving a Hamilton-Jacobi-Isaacs partial differential inequality in general nonlinear systems.
7
1
4
This research addresses three important issues regarding interpersonal expectancy effects and communication across various modalities. The phenomena of behavioral confirmation and disconfirmation were tested in an original experiment involving 148 participants using computer-mediated communication (CMC). First, this study tested a boundary condition asserted by previous theorists about whether or not confirmation and disconfirmation could occur in communication channels without nonverbal communication. Secondly, it shed light on an important causal variable of perceived malleability of interpersonal expectancies in a novel, simultaneous test of confirmation and disconfirmation. Lastly, it verified the hyperpersonal model of CMC by demonstrating behavioral confirmation, and extended the model by specifying when disconfirmation occurs online.
11
2
3
Conserving water resources and protecting them from the pollution are of high importance in natural cycle of our life. Nitrate, as one of the important sources of water pollution, is a serious threat to aquatic ecosystems, and due to its high solubility, its extraction from the water is a costly process. Providing a reliable, low cost and fast method is necessary for eliminating pollution. This study tried to determine the refining potential and capacity of Eichhornia crassipes for removing nitrate from the water. Factors such as initial concentration of nitrate, contact time, absorbent mass, pH and the presence of other competing ions such as sulfate on nitrate absorption have been studied. The results showed that the best efficiency of nitrate removal, more than 99%, in the optimum condition (the retention time of 30 hours, absorbent dose three plants (15 stems) and pH = 6.4) occurred. In addition, the efficiency of nitrate removal in the presence of sulfate ions did not reduce. By increasing the initial nitrate concentration, from 30 to 150 mg/L, there was no significant change in removal efficiency. Actually, by increasing absorbent mass, removal time increased from 67.96% to 100%. The process of nitrate absorption followed Langmuir isotherm (R-2 = 1). However, the results showed that Eichhornia crassipes, a promising plant with great functionality, can be used as a refiner for removing nitrate and it is a simple, efficient and low cost method.
22
4
4
In this article, the load-settlement characteristics of unreinforced and reinforced two-layered soil during the loading process are investigated. A series of bearing ratio tests was performed on a granular soil as the base layer overlaying a cohesive soil as the subgrade layer. Three reinforcing conditions (unreinforced, reinforced with nonwoven geotextile, and reinforced with geogrid) at the interface of layers, with four compaction moisture contents (CMCs) of the subgrade layer and three thicknesses of the base layer for both soaked and non-soaked conditions are considered. The results show that the CMC of the subgrade layer has a significant effect on the behavior of two-layered soil, such as swelling amount and the efficiency of the reinforcements. Reinforcing with geogrid resulted in a considerable increase in strength of the soaked samples due to adhesion between geogrids and clayey subgrade layer. For nonwoven geotextiles, strength of the two-layered soil decreased at shallow penetration depths due to reinforcements; and as the penetration increased in depth, the strength also increased. Also, it was found that with decreases in base layer thickness, the test variable's value (i.e., CMC), and the type of geosynthetic reinforcement have significant effects on the behavior of two-layered soil.
19
4
1
The prevalence of antisocial behavior in school settings is still discouraging. Students that often engage in aggressive acts may lack in the ability to appreciate the emotional consequences of their behaviors and share others' emotions. The Children's Empathic Attitudes Questionnaire (CEAQ) is one of the questionnaires used to assess empathy in children and early adolescents. This study is aimed to validate the Spanish version of CEAQ. The sample comprised 297 children (50% males), aged from 7 to 12 years (M = 9.53, SD = 1.2), from Madrid. Confirmatory factor analysis indicated an excellent fit for a unidimensional model, chi(2)(89) = 110.702, p = .059; CFI = .972; RMSEA = .029, 95% CI [.000,.045]. Multigroup invariance analysis showed no significant gender-related differences in all levels. Results also referred an acceptable reliability (omega = .824, r = .610). These results provide psychometric support for the use of the Spanish version of CEAQ as a valid and reliable instrument to assess empathy in children and youth population, especially for school-based interventions.
12
2
4
Background: Sex chromosome aneuploidies occur in approximately one in 420 live births. The most frequent abnormalities are 45, X (Turner syndrome), 47, XXX (triple X), 47, XXY (Klinefelter syndrome), and 47, XYY. The prevalence of males with more than one extra sex chromosome (e.g. 48, XXYY or 48, XXXY) is less common. However, the literature provides little information about the cognitive and behavioural phenotype and the natural history of the disease. We report the clinical, neurocognitive, social cognitive and psychiatric characterization of a patient with 49, XYYYY syndrome. Case presentation: The patient presented with a complex phenotype including a particular cognitive profile with intellectual deficiency and autism spectrum disorder (ASD) with limited interests. Moreover, social anxiety disorder with selective mutism and separation anxiety disorder were observed (DSM-5 criteria, MINI Assessment). Conclusion: It is now admitted that 49, XYYYY has unique medical, neurodevelopmental and behavioural characteristics. Interestingly, ASD is more common in groups with Y chromosome aneuploidy. This clinical report suggests that understanding the cognitive and social functioning of these patients may provide new insights into possible therapeutic strategies, as cognitive remediation or social cognitive training.
9
2
1
The purpose of this study was to analyze the swing characteristics of the kicking leg in order to elucidate the technical mechanisms of the drive curve shot (topspin curve shot). The orientation of the ankle joint at the point of ball impact closely resembles the orientation during a general curve shot or inside shot. Moreover, for the ankle joint movement at the point of ball impact, there is less mediolateral movement (y direction) compared to a curve shot, and more movement in the vertical direction (z direction). Based on the fact that the angle of the vertical rotational axis at that point is larger than that in other shots, this is likely to be one of the factors in generating topspin on the ball. Therefore, we believe that some of the most important and fundamental characteristics of drive curve shots are a smaller angle of attack and a larger vertical movement of the ankle joint at the point of impact, as compared to normal curve shots. (C) 2013 The Authors. Published by Elsevier Ltd. Selection and peer-review under responsibility of the School of Aerospace, Mechanical and Manufacturing Engineering, RMIT University
15
3
2
Social cognition is fundamentally interpersonal: individuals' behaviour and dispositions critically affect their interaction partners' information processing. However, cognitive neuroscience studies, partially because of methodological constraints, have remained largely perceiver-centric: focusing on the abilities, motivations, and goals of social perceivers while largely ignoring interpersonal effects. Here, we address this knowledge gap by examining the neural bases of perceiving emotionally expressive and inexpressive social targets. Sixteen perceivers were scanned using fMRI while they watched targets discussing emotional autobiographical events. Perceivers continuously rated each target's emotional state or eye-gaze direction. The effects of targets' emotional expressivity on perceiver's brain activity depended on task set: when perceivers explicity attended to targets' emotions, expressivity predicted activity in neural structures-including medial prefrontal and posterior cingulate cortex-associated with drawing inferences about mental states. When perceivers instead attended to targets' eye-gaze, target expressivity predicted activity in regions-including somatosensory cortex, fusiform gyrus, and motor cortex-associated with monitoring sensorimotor states and biological motion. These findings suggest that expressive targets affect information processing in manner that depends on perceivers' goals. More broadly, these data provide an early step toward understanding the neural bases of interpersonal social cognition.
10
2
2
Mobile devices including popular smartphone contributes to efficiency improvement of on-site data processing. Mobile environment for real-time data processing needs some additional aspects besides desktop ones, as the style of mobile app, mobile web and mobile web app. Generally, mobile app provides internal storage service so that offline data processing is possible. But mobile app needs separate development according to different types of devices or operating systems, even to different sizes of display panel of mobile devices. Geo-based data are composed of vector and raster formats from complex structure, compared to other data sets including image data and multimedia data. Hence, even though mobile applications development for geo-based data is more complicated than other ones, users' demands with respect to geo-based data processing functionalities on mobile environment are increasing in these days. Mobile web supported by technical basis of HyperText Markup Language5 (HTML5) is regarded as useful service type combining geo-based data processing modules and mobile environments, because it does not require user' downloading or installation of programs and just needs web browsers. Indexed DB Application Programming Interface (API) within this international web standard provides offline data storage functions on mobile environment. Using this API, data sets can be permanently stored into mobile devices, not cache memory. Among numerous geo-data functionalities, visualization topic is basically and commonly used in most mobile services. This study presents an implementation case of mobile web app with geobased data visualization processing on online and offline mode. Types of geo-based data sets are base map of Open Street Map (OSM), OSM vector layers with Extensible Markup Language (XML) contents and high resolution satellite images of optical sensor. It is thought that the result of this implementation can play a role to create intelligent mobile application fields using both geo-based data sets and earth observation satellite image sets.
4
0
4
The temperature recovering curve derived by cold stimulation experiment reveals human inner metabolism status. The changes of the curve involve many heat-influence factors in which metabolic heat-production is the most important. This paper adopts Tikhonove regularization method to eliminate the measured infrared image data errors and constructs a numerical model based on Independent Component Analysis (ICA) to compute two main heat components heat transfer quantity of blood perfusion and heat-production quantity of metabolism. Different component results are obtained in two typical recovering curves. And the metabolic heat-production quantity comparisons between the diabetic and the healthy show that the metabolic function of the healthy are much better than that of the diabetic numerically. This provides a novel method to estimate the human metabolism quantitatively.
30
6
2
Classroom observation is an important part of language teacher education (Kelly & Grenfell 2004) but its effects could be enhanced through observation and guided analysis of video-recorded lessons. As a Matter of fact, focus on teacher talk and on its specific conversational patterns (Sinclair 1982, Sinclair & Brazil 1982) could be of great benefit on teacher education. Moreover, digital data (audio, video and text) presenting natural speech in context would be a relevant tool for teacher trainers, in order to help their trainees to develop teaching awareness and interaction ability, especially if such data are easily and freely accessible and properly treated through new methods of computer-based multimodal analysis. In this study we have analysed, through a multimodal approach, teacher talk questioning in several L2 and LS Italian classrooms. Questioning is one of the most common techniques used by teachers (Richards, 1996) and serves as the principal way in which teachers control the classroom interaction. In some classrooms over half of class time is taken up by question-and-answer exchanges (Richards, 1996). We have focused on two main types of questions: display questions and referential questions. Through the usage of some video-recorded lessons, transcribed and subtitled, we have investigated the presence and relevance of non-linguistic patterns, that match with regularity with these linguistic phenomena. Particularly, we have examined specific non-verbal and para-verbal activities made by the speakers, tightly linked to the various types of questions, and we have noted that there are recurring behaviours used together with the over mentioned linguistic structures, to express specific communicative and didactic scopes and functions, that is to say to build and spread knowledge in a L2/LS Italian classroom.
11
2
3
Volvariella volvacea is difficult to store fresh because of the lack of low-temperature resistance. Many traditional mutagenic strategies have been applied in order to select out strains resistant to low temperature, but few commercially efficient strains have been produced. In order to break through the bottleneck of traditional breeding and significantly improve low-temperature resistance of the edible fungus V. volvacea, strains resistant to low temperature were constructed by genome shuffling. The optimum conditions of V. volvacea strain mutation, protoplast regeneration, and fusion were determined. After protoplasts were treated with 1% (v/v) ethylmethylsulfonate (EMS), 40Sec of ultraviolet (UV) irradiation, 600 Gy electron beam implantation, and 750 Gy(60)Co- irradiation, separately, the lethality was within 70%-80%, which favored generating protoplasts being used in following forward mutation. Under these conditions, 16 strains of V. volvacea mutated by EMS, electron beam, UV irradiation, and Co-60- irradiation were obtained. The 16 mutated protoplasts were selected to serve as the shuffling pool based on their excellent low-temperature resistance. After four rounds of genome shuffling and low-temperature resistance testing, three strains (VF1, VF2, and VF3) with high genetic stability were screened. VF1, VF2, and VF3 significantly enhanced fruit body shelf life to 20, 28, and 28H at 10 degrees C, respectively, which exceeded 25%, 75%, and 75%, respectively, compared with the storage time of V23, the most low-temperature-resistant strain. Genome shuffling greatly improved the low-temperature resistance of V. volvacea, and shortened the course of screening required to generate desirable strains. To our knowledge, this is the first paper to apply genome shuffling to breeding new varieties of mushroom, and offers a new approach for breeding edible fungi with optimized phenotype. (C) 2015 International Union of Biochemistry and Molecular Biology, Inc.
28
6
0
EPIC (Earth Polychromatic Imaging Camera) is a 10-channel spectroradiometer onboard DSCOVR (Deep Space Climate Observatory) spacecraft. In addition to the near-infrared (NIR, 780 nm) and the 'red' (680 nm),channels, EPIC also has the 02 A-band (764 +/- 0.2 nm) and B-band (687.75 +/- 0.2 nm). The EPIC Normalized Difference Vegetation Index (NDVI) is defined as the difference between NIR and 'red' channels normalized to their sum. However, the use of the O2 B-band instead of the 'red' channel mitigates the effect of atmosphere on remote sensing of surface reflectance because O2 reduces contribution from the radiation scattered by the atmosphere. Applying the radiative transfer theory and the spectral invariant approximation to EPIC observations, the paper provides supportive arguments for using the O2 band instead of the red channel for monitoring vegetation dynamics. Our results suggest that the use of the O2 B-band enhances the sensitivity of the top-of-atmosphere NDVI to the presence of vegetation. Published by Elsevier Ltd.
20
4
2
The traditional power control schemes for induction heating device mainly focus on the pulse frequency modulation (PFM) and the pulse density modulation. But they cannot solve the problems of power control, efficiency, and load-adaption well. This paper presents and analyzes the asymmetrical frequency modulation (AFM) control scheme used in the full-bridge series resonant inverter. With the proposed AFM control technique, the output power is controlled by two variables: the operation frequency and the division factor. Better efficiency performance can be achieved in the medium and low output power range when compared with PFM. The principles as well as the zero-voltage switching condition of the AFM are explained and the power losses of switches are analyzed. A control algorithm that schedules the three control modes of AFM is experimentally verified with a digital signal processor based induction heating prototype. The load-adaption, noise and thermal distribution problem of switches are also analyzed.
7
1
4
Major basketball competitions are characterized by complex and rapid attack and defense phases, mainly involving higher execution speed in performing technical procedures as well as individual and collective tactical actions. This research aims at presenting to basketball coaches the results we obtained by applying the PROFILE OF NONVERBAL SENSITIVITY (PONS) nonverbal communication test (N. AMBADY 1980). We consider that the optimization of this type of communication can improve game relations among players, as well as between athletes and coaches, in implementing game tactics and in achieving outstanding performance. (C) 2013 The Authors. Published by Elsevier Ltd.
10
2
3
Background: Food allergen labeling is an important tool to reduce risk of exposure and prevent anaphylaxis for individuals with food allergies. Health Canada released a Canadian food allergen labeling regulation (2008) and subsequent update (2012) suggesting that research is needed to guide further iterations of the regulation to improve food allergen labeling and reduce risk of exposure. Objective: The primary objective of this study was to examine consumer preferences in food labeling for allergy avoidance and anaphylaxis prevention. A secondary objective was to identify whether different subgroups within the consumer population emerged. Methods: A discrete choice experiment using a fractional factorial design divided into ten different versions with 18 choice-sets per version was developed to examine consumer preferences for different attributes of food labeling. Results: Three distinct subgroups of Canadian consumers with different allergen considerations and food allergen labeling needs were identified. Overall, preferences for standardized precautionary and safety symbols at little or no increased cost emerged. Conclusion: While three distinct groups with different preferences were identified, in general the results revealed that the current Canadian food allergen labeling regulation can be improved by enforcing the use of standardized precautionary and safety symbols and educating the public on the use of these symbols.
24
5
1
Quality control of laser additive manufactured medical implants is of interest, especially if nondestructive quality control can be performed on parts before implantation. X-ray micro-computed tomography (microCT or CT) can be used for defect/porosity analysis as well as for comparing the part surface with its computer-aided design (CAD) file. In both cases, the limited use of CT is partly due to the variation in scan types and the quality of scans that can occur. We present a simple method demonstrating the use of a light metal casting as a reference porosity sample, to confirm good CT image quality and to quantify minimum detectable pore size for the selected CT scan settings. This makes a good comparison for additive manufactured parts, since castings generally contain more porosity. A full part-to-CAD comparison shows how the part is compared with its CAD file, as a second-quality control. The accuracy of the CAD variance is given by the minimum detectable pore size. Finally, the part is sectioned and scanned at two higher resolution settings showing small porosity (10-50 mu m diameter) present but well distributed, as expected.
13
3
0
Objective: A previous study conducted by our group found theory of mind (ToM) differences in preschool children who sustained mild traumatic brain injury (mTBI) compared with typically developing peers, 6 months postinjury. The goals of the current longitudinal study were to determine whether these findings are the result of a brain-injury-specific effect or rather a general-injury effect, to examine the long-term evolution of ToM skills following preschool mTBI, as well as to investigate the links between ToM abilities and general social functioning. Method: Seventy-two children who sustained mTBI between the ages of 18 and 60 months were evaluated 6 and 18 months postinjury on ToM tasks including desires and emotions reasoning and false belief understanding. They were compared with 58 participants who sustained an orthopedic injury (OI) and 83 typically developing children (TDC). Results: The 3 groups did not differ on demographic and baseline characteristics. The mTBI group obtained poorer scores relative to both comparison groups on the desires and emotions reasoning task, both at 6 and 18 months injury. No correlations were found between injury characteristics and ToM performance. For the mTBI group, associations were found between ToM performance and global social competence. Conclusion: These findings suggest a brain-injury-specific effect that persists in the long-term following mTBI in preschool children.
9
2
1
Chemistry was already pioneered by ancient Egyptians up to 4000 years ago. Despite its age, chemistry is by no means a dying scientific discipline. The different branches of study within chemistry have infiltrated all fields of life science research, and nobody can work in this area without using chemistry in the broadest sense. The present article is a personal view on how chemistry supports life science research, in particular in the field of nutrition and metabolism research. It provides insight into how chemistry, in close collaboration with life science research, helps to fill the gaps between our current fragmentary understanding and the comprehensive knowledge required for better understanding the molecular details of metabolism, health and disease, and aging. The most important contributions of the chemical disciplines to these studies with respect to a systems biological description of human nutrition and metabolism will be outlined.
30
6
2
The goal of the study was to determine if people's endorsement of different moral foundations influences their degree of prosocial behavior in a set of economic exchange games. Moral Foundations Theory has proven to be a useful means of categorizing ideas about morality and predicting opinions on aspects of social justice, political orientation, and other constructs related to prosocial behavior. This study sought to determine if Progressivism, the degree to which individuals endorse the individualizing moral foundations (i.e., Harm/Care and Fairness/Reciprocity) over the binding moral foundations (i.e., In-group/Loyalty, Authority/Respect, and Purity/Sanctity), would lead to more frequent cooperation in the Prisoner's Dilemma, a higher level of investment in the Trust Game, a higher level of return of one's partner's investment in the Trust Game, and fewer points stolen in the Thieves' Game. The results indicated no relationship between Progressivism and performance in the Thieves' Game. In three separate linear regressions controlling for age, gender, race, and Big-5 personality traits Progressivism was associated with more frequent cooperation in the Prisoner's Dilemma, a higher level of investment in the Trust Game, and a higher level of return of one's partner's investment in the Trust Game. Therefore it does appear that moral foundations do predict performance in economic exchange games and that,a greater endorsement of Progressivism is associated with more prosocial behavior. (C) 2017 Elsevier Ltd. All rights reserved.
12
2
4
To evade the well-known impossibility of unconditionally secure quantum two-party computations, previous quantum private comparison protocols have to adopt a third party. Here, we study how far we can go with two parties only. We propose a very feasible and efficient protocol. Intriguingly, although the average amount of information leaked cannot be made arbitrarily small, we find that this average will not exceed 14 bits for any length of the bit-string being compared.
3
0
3
Two principal types of human in vivo studies with non-pharmaceuticals can be distinguished: (1) human metabolism studies are used for identification of target metabolites which can subsequently be used in biological monitoring studies. Furthermore, they allow extrapolation from excretion of metabolite(s) to exposure to the parent compound on the basis of an understanding of human pharmacokinetics. (?) Pharmacodynamic or effect studies are restricted to the study of acute and inherently reversible changes and are most likely to improve risk assessment in the following areas: neurobehavioural effects (e.g, alcohol, organic solvents), alterations in biochemical markers (e.g. cholinesterase inhibition following organophosphate exposure) and topical effects (e.g, skin irritancy). Ethical considerations al e of prime importance and, as a minimum, any human study must comply with the principles of the Declaration of Helsinki. The protocol should include scientifically sound objectives, a justification of subject numbers. a formal risk-benefit analysis and provisions for appropriate ethical review. The welfare of the individual participating in the study must be paramount. Informed consent has to be obtained and subjects must be free to withdraw from the study at any time. Compensation should be given for the inconvenience of participating in the study. but never for undergoing risk. Provided these conditions are met, human volunteer studies can be a powerful tool in risk assessment and risk management of exposure to non-pharmaceutical products. (C) 2001 Elsevier Science Ireland Ltd. All rights reserved.
30
6
2
The emergence of proteomics has led to major technological advances in mass spectrometry (MS). These advancements not only benefitted MS-based high-throughput proteomics but also increased the impact of mass spectrometry on the field of structural and molecular biology. Here, we review how state-of-the-art MS methods, including native MS, top-down protein sequencing, cross-linking-MS, and hydrogen-deuterium exchange-MS, nowadays enable the characterization of biomolecular structures, functions, and interactions. In particular, we focus on the role of mass spectrometry in integrated structural and molecular biology investigations of biological macromolecular complexes and cellular machineries, highlighting work on CRISPR-Cas systems and eukaryotic transcription complexes.
28
6
0
Background: Previous genomewide association studies (GWASs) have identified a number of putative risk loci for alcohol dependence (AD). However, only a few loci have replicated and these replicated variants only explain a small proportion of AD risk. Using an innovative approach, the goal of this study was to generate hypotheses about potentially causal variants for AD that can be explored further through functional studies. Methods: We employed targeted capture of 71 candidate loci and flanking regions followed by next-generation deep sequencing (mean coverage 78X) in 806 European Americans. Regions included in our targeted capture library were genes identified through published GWAS of alcohol, all human alcohol and aldehyde dehydrogenases, reward system genes including dopaminergic and opioid receptors, prioritized candidate genes based on previous associations, and genes involved in the absorption, distribution, metabolism, and excretion of drugs. We performed single-locus tests to determine if any single variant was associated with AD symptom count. Sets of variants that overlapped with biologically meaningful annotations were tested for association in aggregate. Results: No single, common variant was significantly associated with AD in our study. We did, however, find evidence for association with several variant sets. Two variant sets were significant at the q-value <0.10 level: a genic enhancer for ADHFE1 (p=1.47x10(-5); q=0.019), an alcohol dehydrogenase, and ADORA1 (p=5.29x10(-5); q=0.035), an adenosine receptor that belongs to a G-protein-coupled receptor gene family. Conclusions: To our knowledge, this is the first sequencing study of AD to examine variants in entire genes, including flanking and regulatory regions. We found that in addition to protein coding variant sets, regulatory variant sets may play a role in AD. From these findings, we have generated initial functional hypotheses about how these sets may influence AD.
32
6
4
This paper presents a new approach to infer worldwide malware-infected machines by solely analyzing their generated probing activities. In contrary to other adopted methods, the proposed approach does not rely on symptoms of infection to detect compromised machines. This allows the inference of malware infection at very early stages of contamination. The approach aims at detecting whether the machines are infected or not as well as pinpointing the exact malware type/family. The latter insights allow network security operators of diverse organizations, Internet service providers and backbone networks to promptly detect their clients' compromised machines in addition to effectively providing them with tailored anti-malware/patch solutions. To achieve the intended goals, the proposed approach exploits the darknet Internet space and initially filters out misconfiguration traffic targeting such space using a probabilistic model. Subsequently, the approach employs statistical methods to infer large-scale probing activities as perceived by the dark space. Consequently, such activities are correlated with malware samples by leveraging fuzzy hashing and entropy based techniques. The proposed approach is empirically evaluated using a recent 60 GB of real darknet traffic and 65 thousand real malware samples. The results concur that the rationale of exploiting probing activities for worldwide early malware infection detection is indeed very promising. Further, the results, which were validated using publically available data resources, demonstrate that the extracted inferences exhibit noteworthy accuracy and can generate significant cyber security insights that could be used for effective mitigation. (C) 2015 Elsevier B.V. All rights reserved.
2
0
2
Research on intergroup contact has grown exponentially over the past decade. Such research has typically extolled the benefits of positive interaction between members of historically divided communities, particularly on outcomes related to prejudice reduction. Emerging work in the field, however, has qualified this optimistic picture by identifying three gaps in the existing literature. First, in everyday life, contact may be construed as a negative experience that increases rather than decreases responses such as prejudice, anxiety, and avoidance. Second, in real-life settings, contact is often circumscribed by informal practices of (re)segregation that are easily overlooked if researchers rely primarily on examining structured contact and explicit processes using primarily laboratory and questionnaire methods. Third, positive contact may have ironic effects on the political attitudes and behaviors of the historically disadvantaged, undermining their recognition of social injustice and decreasing their willingness to engage in collective action to challenge the status quo. Although it is now a truism that intergroup contact can reduce intergroup prejudice, these developments emphasize the importance of maintaining a critical perspective on the contact hypothesis as a model for promoting social change in historically divided and unequal societies. They also lay the foundations for future developments in the field.
8
2
0
Chronic health conditions of the elderly lead to limitations in physical activity with disability, anxiety, and increased need for medical care and assisted living conditions. Physical performance tests are used to screen for pending loss of mobility and can serve as endpoints to monitor the effectiveness of intervention measures. Since limited mobility is associated with the physical and mental health of a person, evaluation of this in preclinical aging studies in mice will provide a translational approach for testing new intervention strategies. We assessed physiological parameters in 4, 12, 20 and 28 month old C57BL/6 and CB6F1 male mice using a rotating rod, a free running wheel, and a photo beam activity field, designed to determine changes in coordinated walking ability, self-motivated running distance, and anxiety response to a novel environment, respectively. Older mice showed decreased coordinated walking times and decreased running distances, predictive of physical performance ability and motivation in the elderly. Changes in both lateral and vertical movements were observed in a novel cage environment suggesting different levels of anxiety. Because the genetic background of the two mouse strains influenced test results in an age-dependent manner, it is imperative to recognize that diverse genetic backgrounds in mice may yield different data in preclinical studies and would need to be interpreted individually for translational applications. (C) 2017 Elsevier Inc. All rights reserved.
27
5
4
This research examined how affective and cognitive responses to culture fusion, a specific type of culture mixing that features the blending of different cultures or parts thereof into a new entity, are influenced by individual differences in Need for Closure (NFC). Two studies showed that individuals high (vs. low) in NFC felt less favorable toward culture fusion (i.e., the affective response), both at an abstract level (i.e., society structure models; Study 1, N = 191) and at a more concrete level (i.e., food stimuli; Study 2, N = 257). In addition, high NFC individuals tended to assign culturally fused stimuli to one discrete culture, rather than acknowledging them as culturally hybrid (i.e., the cognitive response). Furthermore, mediation analyses showed that the relationships between NFC and responses toward culture fusion were mediated by Right-Wing Authoritarianism. These findings are interpreted in terms of the threat to epistemic security needs posed by culture fusion.
8
2
0
With the advances of stem cell research, development of intelligent biomaterials and three-dimensional biofabrication strategies, highly mimicked tissue or organs can be engineered. Among all the biofabrication approaches, bioprinting based on inkjet printing technology has the promises to deliver and create biomimicked tissue with high throughput, digital control, and the capacity of single cell manipulation. Therefore, this enabling technology has great potential in regenerative medicine and translational applications. The most current advances in organ and tissue bioprinting based on the thermal inkjet printing technology are described in this review, including vasculature, muscle, cartilage, and bone. In addition, the benign side effect of bioprinting to the printed mammalian cells can be utilized for gene or drug delivery, which can be achieved conveniently during precise cell placement for tissue construction. With layer-by-layer assembly, three-dimensional tissues with complex structures can be printed using converted medical images. Therefore, bioprinting based on thermal inkjet is so far the most optimal solution to engineer vascular system to the thick and complex tissues. Collectively, bioprinting has great potential and broad applications in tissue engineering and regenerative medicine. The future advances of bioprinting include the integration of different printing mechanisms to engineer biphasic or triphasic tissues with optimized scaffolds and further understanding of stem cell biology.
7
1
4
Social pressure exerted by urban development, the increase in erosion on many coastal stretches, and the rise in sea level due to climate change over the last few decades have led governments to increase investment in coastal protection. In turn, a reduction in costs and increases in ease of construction and rate of implementation have led to sand-filled geotextile elements, such as bags, tubes, and containers, becoming an alternative or supplement to traditional coastal defence materials, such as rubble mounds, concrete, and so on. Not all coastal zones are appropriate for sand-filled geotextile structures as coastal defences. This article analyses suitable zones for locating geotextile bag revetments to protect coasts from storm erosion and concludes that the least suitable zones are the surf zone (on an open coast and on a slightly protected coast) and deep water (on an open coast), except if a suitable reinforcement is carried out when the demand makes it necessary this build this kind of defence.
19
4
1
This study focuses on four textile industries (DH-GEDA, NOYA, ALMHADI, and ALSAR) established between 2005 and 2008 in the peri-urban areas of Dukem and Gelan. The objectives of the study were to generate baseline information regarding the concentration levels of selected pollutants and to analyze their effects on biophysical environments. This study also attempts to explore the level of exposure that humans and livestock have to polluted effluents and the effects thereof. The findings of this study are based on data empirically collected from two sources: laboratory analysis of sample effluents from the four selected textile plants and quantitative as well as qualitative socioeconomic data collection. As part of the latter, a household survey and focus group discussions (FGDs) with elderly and other focal persons were employed in the towns of Dukem and Gelan. The results of the study show that large concentrations of biological oxygen demand (BOD5), chemical oxygen demand (COD), total suspended solids (TSS), and pH were found in all the observed textile industries, at levels beyond the permissible discharge limit set by the national Environmental Protection Authority (EPA). Furthermore, sulfide (S-2), R-phosphate (R-PO43), and Zn were found in large concentrations in DH-GEDA and ALMHADI, while high concentrations were also identified in samples taken from ALSAR and ALMHADI. In spite of the clear-cut legal tools, this study shows that the local environment, people, and their livestock are exposed to highly contaminated effluents. We therefore recommend that the respective federal and regional government bodies should reexamine the compliance to and actual implementation of the existing legal procedures and regulations and respond appropriately.
22
4
4
Dense motion field estimation is a key computer vision problem. Many solutions have been proposed to compute small or large displacements, narrow or wide baseline stereo disparity, or non-rigid surface registration, but a unified methodology is still lacking. The authors introduce a general framework that robustly combines direct and feature-based matching. The feature-based cost is built around a novel robust distance function that handles keypoints and weak features such as segments. It allows us to use putative feature matches to guide dense motion estimation out of local minima. The authors' framework uses a robust direct data term. It is implemented with a powerful second-order regularisation with external and self-occlusion reasoning. Their framework achieves state-of-the-art performance in several cases (standard optical flow benchmarks, wide-baseline stereo and non-rigid surface registration). Their framework has a modular design that customises to specific application needs.
0
0
0
Background: The Generation Scotland: Scottish Family Health Study (GS: SFHS) is a family-based population cohort with DNA, biological samples, socio-demographic, psychological and clinical data from approximately 24,000 adult volunteers across Scotland. Although data collection was cross-sectional, GS: SFHS became a prospective cohort due to of the ability to link to routine Electronic Health Record (EHR) data. Over 20,000 participants were selected for genotyping using a large genome-wide array. Methods: GS: SFHS was analysed using genome-wide association studies (GWAS) to test the effects of a large spectrum of variants, imputed using the Haplotype Research Consortium (HRC) dataset, on medically relevant traits measured directly or obtained from EHRs. The HRC dataset is the largest available haplotype reference panel for imputation of variants in populations of European ancestry and allows investigation of variants with low minor allele frequencies within the entire GS: SFHS genotyped cohort. Results: Genome-wide associations were run on 20,032 individuals using both genotyped and HRC imputed data. We present results for a range of well-studied quantitative traits obtained from clinic visits and for serum urate measures obtained from data linkage to EHRs collected by the Scottish National Health Service. Results replicated known associations and additionally reveal novel findings, mainly with rare variants, validating the use of the HRC imputation panel. For example, we identified two new associations with fasting glucose at variants near to Y_RNA and WDR4 and four new associations with heart rate at SNPs within CSMD1 and ASPH, upstream of HTR1F and between PROKR2 and GPCPD1. All were driven by rare variants (minor allele frequencies in the range of 0.08-1%). Proof of principle for use of EHRs was verification of the highly significant association of urate levels with the well-established urate transporter SLC2A9. Conclusions: GS: SFHS provides genetic data on over 20,000 participants alongside a range of phenotypes as well as linkage to National Health Service laboratory and clinical records. We have shown that the combination of deeper genotype imputation and extended phenotype availability make GS: SFHS an attractive resource to carry out association studies to gain insight into the genetic architecture of complex traits.
32
6
4
Purpose The application of organic and inorganic fertilizers to soil can result in increased gaseous emissions, such as NH3, N2O, CO2, and CH4, as well as nitrate leaching, contributing to climate warming and ground and surface water pollution, particularly in regions with hot climates, where high temperatures and high soil nitrification rates often occur. The use of nitrification inhibitors (NIs) has been shown to effectively decrease nitrogen (N) losses from the soil-plant system. Materials and methods Non-disruptive laboratory incubation experiments were conducted to assess the extent to which temperature (20 and 30 degrees C) and nutrient source (mineral and organic fertilizers) influence the rate of carbon (C)- and N-related microbial processes in soil in response to the NI 3,4-dimethylpyrazole phosphate (DMPP). Furthermore, short-term changes in the ability of microbes to degrade C substrates were evaluated in disruptive soil microcosms using microbial community-level physiological profiling and the abundance of the bacterial 16S rRNA gene as a measure of total bacterial population size. Results and discussion DMPP reduced net nitrification after 2 and 4 weeks of incubation at 30 and 20 degrees C by an average of 78.3 and 84.5 %, respectively, and with similar dynamics for mineral or organic fertilization. The addition of labile organic matter with cattle effluent led to a rapid increase in C mineralization that was significantly reduced by DMPP at both temperatures, whereas no changes could be detected after the addition of mineral fertilizer. The culturable heterotrophic microorganisms showed metabolic diversification in the oxidation of C sources, with organic fertilizer playing a major role in the substrate utilization patterns during the first week of incubation and the DMPP effects prevailing from day 14 until day 28. Furthermore, the copy number of the bacterial 16S rRNA gene was reduced by the application of DMPP and organic fertilizer after 28 days. Conclusions Our results show the marked efficiency of DMPP as an NI at elevated temperatures of incubation and when associated with both mineral and organic fertilization, providing support for its use as a tool to mitigate N losses in Mediterranean ecosystems. However, we also observed impaired C respiration rates and bacterial abundances, as well as shifts in communitylevel physiological profiles in soil, possibly indicating a short-term effect of DMPP and organic fertilizers on non-target Crelated processes and microorganisms.
22
4
4
The pathophysiological changes associated with Alzheimer's Disease (AD) begin decades before the emergence of clinical symptoms. Understanding the early mechanisms associated with AD pathology is, therefore, especially important for identifying disease-modifying therapeutic targets. While the majority of AD clinical trials to date have focused on anti-amyloid-beta (A beta) treatments, other therapeutic approaches may be necessary. The ability to monitor changes in cellular networks that include both A beta and non-A beta pathways is essential to advance our understanding of the etiopathogenesis of AD and subsequent development of cognitive symptoms and dementia. Metabolomics is a powerful tool that detects perturbations in the metabolome, a pool of metabolites that reflects changes downstream of genomic, transcriptomic and proteomic fluctuations, and represents an accurate biochemical profile of the organism in health and disease. The application of metabolomics could help to identify biomarkers for early AD diagnosis, to discover novel therapeutic targets, and to monitor therapeutic response and disease progression. Moreover, given the considerable parallel between mouse and human metabolism, the use of metabolomics provides ready translation of animal research into human studies for accelerated drug design. In this review, we will summarize current progress in the application of metabolomics in both animal models and in humans to further understanding of the mechanisms involved in AD pathogenesis. This article is part of a Special Issue entitled: Misfolded Proteins, Mitochondrial Dysfunction, and Neurodegenerative Diseases. (C) 2013 Elsevier B.V. All rights reserved.
30
6
2
During the last decades photogrammetric computer vision systems have been well established in scientific and commercial applications. Recent developments in image-based 3D reconstruction systems have resulted in an easy way of creating realistic, visually appealing and accurate 3D models. We present a fully automated processing pipeline for metric and geo-accurate 3D reconstructions of complex geometries supported by an online feedback method for user guidance during image acquisition. Our approach is suited for seamlessly matching and integrating images with different scales, from different view points (aerial and terrestrial), and with different cameras into one single reconstruction. We evaluate our approach based on different datasets for applications in mining, archaeology and urban environments and thus demonstrate the flexibility and high accuracy of our approach. Our evaluation includes accuracy related analyses investigating camera self-calibration, georegistration and camera network configuration. (C) 2016 Elsevier Inc. All rights reserved.
0
0
0