Improved Techniques for Grid Mapping With Rao-Blackwellized Particle Filters
Recently, Rao-Blackwellized particle filters (RBPF) have been introduced as an effective means to solve the simultaneous localization and mapping problem. This approach uses a particle filter in which each particle carries an individual map of the environment. Accordingly, a key question is how to reduce the number of particles. In this paper, we present adaptive techniques for reducing this number in a RBPF for learning grid maps. We propose an approach to compute an accurate proposal distribution, taking into account not only the movement of the robot, but also the most recent observation. This drastically decreases the uncertainty about the robot's pose in the prediction step of the filter. Furthermore, we present an approach to selectively carry out resampling operations, which seriously reduces the problem of particle depletion. Experimental results carried out with real mobile robots in large-scale indoor, as well as outdoor, environments illustrate the advantages of our methods over previous approaches.
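To make the selective-resampling idea concrete, a common criterion (one plausible reading of the approach, not necessarily the paper's exact rule) is the effective sample size N_eff = 1 / sum_i(w_i^2): resampling is carried out only when N_eff drops below a fraction of the particle count. A minimal Python sketch under that assumption:

import random

def effective_sample_size(weights):
    # N_eff = 1 / sum(w_i^2) for normalized importance weights
    return 1.0 / sum(w * w for w in weights)

def selective_resample(particles, weights, threshold_ratio=0.5):
    # Resample only when N_eff falls below threshold_ratio * N, which
    # limits the particle depletion caused by unnecessary resampling.
    n = len(particles)
    if effective_sample_size(weights) >= threshold_ratio * n:
        return particles, weights            # keep the current particle set
    resampled = random.choices(particles, weights=weights, k=n)
    return resampled, [1.0 / n] * n          # uniform weights after resampling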
Effect of superficial collagen patterns and fibrillation of femoral articular cartilage on knee joint mechanics-a 3D finite element analysis.
Collagen fibrils of articular cartilage have specific depth-dependent orientations and the fibrils bend in the cartilage surface to exhibit split-lines. Fibrillation of superficial collagen takes place in osteoarthritis. We aimed to investigate the effect of superficial collagen fibril patterns and collagen fibrillation of cartilage on stresses and strains within a knee joint. A 3D finite element model of a knee joint with cartilage and menisci was constructed based on magnetic resonance imaging. The fibril-reinforced poroviscoelastic material properties with depth-dependent collagen orientations and split-line patterns were included in the model. The effects of joint loading on stresses and strains in cartilage with various split-line patterns and medial collagen fibrillation were simulated under axial impact loading of 1000 N. In the model, the collagen fibrils resisted strains along the split-line directions. This increased also stresses along the split-lines. On the contrary, contact and pore pressures were not affected by split-line patterns. Simulated medial osteoarthritis increased tissue strains in both medial and lateral femoral condyles, and contact and pore pressures in the lateral femoral condyle. This study highlights the importance of the collagen fibril organization, especially that indicated by split-line patterns, for the weight-bearing properties of articular cartilage. Osteoarthritic changes of cartilage in the medial femoral condyle created a possible failure point in the lateral femoral condyle. This study provides further evidence on the importance of the collagen fibril organization for the optimal function of articular cartilage.
Virtual Worlds and Augmented Reality in Cultural Heritage Applications
Mixed Realities (Milgram & Kishino 1994) and their concept of cyber-real space interplay invoke interactive digital narratives that promote new patterns of understanding. However, the "narrative" part, which refers to a set of events happening during a certain period of time and providing aesthetic, dramaturgical and emotional elements, objects and attitudes (Nandi & Marichal 2000, Tamura et al 2001), is still an early topic of research. Mixing such aesthetic ambiences with virtual character augmentations (Cavazza et al 2003) and adding dramatic tension has recently developed these narrative patterns into an exciting new edutainment medium (Lindt 2003). Until recently, AR systems had various difficulties managing such time-travel in a fully interactive manner, due to hardware and software complexities in AR 'Enabling Technologies' (Azuma et al 2001). Generally, the setup of such systems was operational only in specific places (indoors or outdoors) or with specific objects used for training purposes, rendering them not easily applicable to different sites. Furthermore, almost none of these systems feature full real-time virtual human simulation. With our approach, based on an efficient real-time tracking system that requires only a small pre-recorded sequence as a database, we can set up the AR experience with animated virtual humans anywhere, quickly. Through the interplay of a modern real-time framework for integrated interactive virtual character simulation, we can enhance the experience with full virtual character simulations. Even if the environmental conditions are drastically altered, causing problems for the real-time camera tracker, we can re-train the tracker so that it continues to operate. The proposed set of algorithms and methodologies aims to extend the "AR Enabling Technologies" to further support real-time, mobile, dramaturgical and behavioural Mixed Reality simulations, as opposed to static annotations or rigid geometrical objects. Fig. 1 depicts fully simulated virtual humans (skin, clothes, face, body) augmenting a cultural heritage site.
Sensorimotor synchronization: a review of recent research (2006-2012).
Sensorimotor synchronization (SMS) is the coordination of rhythmic movement with an external rhythm, ranging from finger tapping in time with a metronome to musical ensemble performance. An earlier review (Repp, 2005) covered tapping studies; two additional reviews (Repp, 2006a, b) focused on music performance and on rate limits of SMS, respectively. The present article supplements and extends these earlier reviews by surveying more recent research in what appears to be a burgeoning field. The article comprises four parts, dealing with (1) conventional tapping studies, (2) other forms of moving in synchrony with external rhythms (including dance and nonhuman animals' synchronization abilities), (3) interpersonal synchronization (including musical ensemble performance), and (4) the neuroscience of SMS. It is evident that much new knowledge about SMS has been acquired in the last 7 years.
System supporting money laundering detection
Criminal analysis is a complex process involving information gathered from different sources, mainly of quantitative character, such as billings or bank account transactions, but also of qualitative character, such as eyewitnesses' testimonies. Due to the massive nature of this information, operational and investigative activities can be vastly improved when supported by dedicated techniques and tools. A system supporting the police analyst in the process of detecting money laundering is presented in this paper. The main parts of the system are a data importer and analyzing algorithms, such as a transaction mining algorithm and frequent pattern mining algorithms. The results obtained with these algorithms can be visualized, so that they can be easily explored by the police analyst. The transactions found can be treated as suspected operations. The frequent patterns found are mainly used to identify the roles of suspected entities. For the transaction mining algorithm, a performance study is also presented.
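As an illustration of the frequent-pattern side, the toy sketch below counts co-occurring account pairs across transactions and keeps those above a support threshold; it is a generic stand-in, since the abstract does not spell out the system's actual algorithms.

from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support=3):
    # transactions: iterable of collections of entity/account identifiers.
    counts = Counter()
    for parties in transactions:
        for pair in combinations(sorted(set(parties)), 2):
            counts[pair] += 1
    # pairs of entities that repeatedly transact together may hint at roles
    return {pair: c for pair, c in counts.items() if c >= min_support}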
RADICALIZATION AS A REACTION TO FAILURE : AN ECONOMIC MODEL OF ISLAMIC EXTREMISM
This paper views Islamist radicals as self-interested political revolutionaries and builds on a general model of political extremism developed in a previous paper (Ferrero, 2002), where extremism is modelled as a production factor whose effect on expected revenue is initially positive and then turns negative, and whose level is optimally chosen by a revolutionary organization. The organization is bound by a free-access constraint and hence uses the degree of extremism as a means of indirectly controlling its level of membership with the aim of maximizing expected per capita income of its members, like a producer co-operative. The gist of the argument is that radicalization may be an optimal reaction to perceived failure (a widespread perception in the Muslim world) when political activists are, at the margin, relatively strongly averse to effort but not so averse to extremism, a configuration that is at odds with secular, Western-style revolutionary politics but seems to capture well the essence of Islamic revolutionary politics, embedded as it is in a doctrinal framework.
Enriching Wayfinding Instructions with Local Landmarks
Navigation services communicate optimal routes to users by providing sequences of instructions for these routes. Each single instruction guides the wayfinder from one decision point to the next. The instructions are based on geometric data from the street network, which is typically the only dataset available. This paper addresses the question of enriching such wayfinding instructions with local landmarks. We propose measures to formally specify the landmark saliency of a feature. Values for these measures are subject to hypothesis tests in order to define and extract landmarks from datasets. The extracted landmarks are then integrated in the wayfinding instructions. A concrete example from the city of Vienna demonstrates the applicability and usefulness of the method.
Convex Principal Feature Selection
A popular approach for dimensionality reduction and data analysis is principal component analysis (PCA). A limiting factor with PCA is that it does not tell us which of the original features are important. There has been recent interest in sparse PCA (SPCA): by applying an L1 regularizer to PCA, a sparse transformation is achieved. However, true feature selection may not be achieved, as non-sparse coefficients may be distributed over several features. Feature selection is an NP-hard combinatorial optimization problem. This paper relaxes and re-formulates the feature selection problem as a convex continuous optimization problem that minimizes a mean-squared reconstruction error (a criterion optimized by PCA) and takes feature redundancy into account (an important property in PCA and feature selection). We call this new method Convex Principal Feature Selection (CPFS). Experiments show that CPFS performed better than SPCA in selecting features that maximize variance or minimize the mean-squared reconstruction error.
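One plausible form of such a convex relaxation (an illustrative reading, not the paper's exact objective) minimizes the reconstruction error ||X - XA||_F^2 plus a group penalty on the rows of A, so that surviving rows mark selected features. A proximal-gradient sketch in Python:

import numpy as np

def cpfs_sketch(X, lam=1.0, iters=500):
    # Minimize ||X - X A||_F^2 + lam * sum_i ||A[i, :]||_2 by proximal
    # gradient descent; rows of A that survive the group penalty mark
    # selected features. Hypothetical formulation for illustration.
    n, d = X.shape
    A = np.zeros((d, d))
    G = X.T @ X
    lr = 1.0 / (2.0 * np.linalg.norm(G, 2))       # step from the Lipschitz constant
    for _ in range(iters):
        A -= lr * 2.0 * (G @ A - G)               # gradient of the smooth term
        norms = np.linalg.norm(A, axis=1, keepdims=True)
        A *= np.maximum(0.0, 1.0 - lr * lam / np.maximum(norms, 1e-12))  # row soft-threshold
    return np.where(np.linalg.norm(A, axis=1) > 1e-8)[0]  # selected feature indices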
Graph Models for Knowledge Representation and Reasoning for Contemporary and Emerging Needs – A survey
Reasoning is a fundamental capability that requires knowledge. Various graph models have proven to be very valuable in knowledge representation and reasoning. Recently, explosive data generation and accumulation capabilities have paved the way for Big Data and Data Intensive Systems. Knowledge representation and reasoning with large and growing data is extremely challenging but crucial for businesses to predict trends and support decision making. Any contemporary, reasonably complex knowledge-based system will have to consider this onslaught of data and use appropriate and sufficient reasoning for semantic processing of information by machines. This paper surveys graph-based knowledge representation and reasoning: various graph models such as Conceptual Graphs, Concept Graphs, Semantic Networks, Inference Graphs and Causal Bayesian Networks used for representation and reasoning; common and recent research uses of these graph models, typically in Big Data environments; and the near-future needs and challenges for graph-based KRR in computing systems. Observations are presented in a table highlighting the suitability of the surveyed graph models for contemporary scenarios.
Dominated Colorings of Graphs
In this paper, we introduce and study a new coloring problem of a graph called the dominated coloring. A dominated coloring of a graph G is a proper vertex coloring of G such that each color class is dominated by at least one vertex of G. The minimum number of colors among all dominated colorings is called the dominated chromatic number, denoted by χdom(G). In this paper, we establish the close relationship between the dominated chromatic number χdom(G) and the total domination number γt(G); and the equivalence for triangle-free graphs. We study the complexity of the problem by proving its NP-completeness for arbitrary graphs having χdom(G) ≥ 4 and by giving a polynomial time algorithm for recognizing graphs having χdom(G) ≤ 3. We also give some bounds for planar and star-free graphs and exact values for split graphs.
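The definition is easy to verify by brute force on small graphs. The sketch below checks both conditions and searches for χdom, using the open-neighborhood convention suggested by the link to total domination (an assumption on our part):

from itertools import product

def is_dominated_coloring(adj, coloring):
    # adj: dict mapping each vertex to the set of its neighbors.
    # (1) proper: adjacent vertices receive different colors.
    if any(coloring[u] == coloring[v] for u in adj for v in adj[u]):
        return False
    # (2) dominated: every color class lies in some vertex's neighborhood.
    classes = {}
    for v, c in coloring.items():
        classes.setdefault(c, set()).add(v)
    return all(any(cls <= adj[v] for v in adj) for cls in classes.values())

def dominated_chromatic_number(adj):
    # Exhaustive search for chi_dom on small graphs: try k = 1, 2, ...
    verts = sorted(adj)
    for k in range(1, len(verts) + 1):
        for assignment in product(range(k), repeat=len(verts)):
            if is_dominated_coloring(adj, dict(zip(verts, assignment))):
                return k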
Skin Aging and Photoaging Alter Fatty Acids Composition, Including 11,14,17-eicosatrienoic Acid, in the Epidermis of Human Skin
We investigated the alterations of major fatty acid components in the epidermis caused by natural aging and photoaging processes, and by acute ultraviolet (UV) irradiation, in human skin. Interestingly, we found that 11,14,17-eicosatrienoic acid (ETA), one of the omega-3 polyunsaturated acids, was significantly increased in photoaged human epidermis in vivo and also in acutely UV-irradiated human skin in vivo, while it was significantly decreased in intrinsically aged human epidermis. The increased ETA content in the epidermis of photoaged human skin and acutely UV-irradiated human skin is associated with enhanced expression of human elongase 1 and calcium-independent phosphodiesterase A(2). We demonstrated that ETA inhibited matrix metalloproteinase (MMP)-1 expression after UV irradiation, and that inhibition of ETA synthesis using EPTC and NA-TCA, which are elongase inhibitors, increased MMP-1 expression. Therefore, our results suggest that UV irradiation increases ETA levels, which may have a photoprotective effect in human skin.
Brahmastra: Driving Apps to Test the Security of Third-Party Components
We present an app automation tool called Brahmastra for helping app stores and security researchers test third-party components in mobile apps at runtime. The main challenge is that call sites that invoke third-party code may be deeply embedded in the app, beyond the reach of traditional GUI testing tools. Our approach uses static analysis to construct a page transition graph and discover execution paths to invoke third-party code. We then perform binary rewriting to "jump start" the third-party code by following the execution path, efficiently pruning out undesired executions. Compared with the state-of-the-art GUI testing tools, Brahmastra is able to successfully analyse third-party code in 2.7× more apps and decrease test duration by a factor of 7. We use Brahmastra to uncover interesting results for two use cases: 175 out of 220 children's apps we tested display ads that point to web pages that attempt to collect personal information, which is a potential violation of the Children's Online Privacy Protection Act (COPPA); and 13 of the 200 apps with the Facebook SDK that we tested are vulnerable to a known access token attack.
AN ANALYSIS FOR CREDIT RATING AND MOMENTUM STRATEGY
In recent years, investors have become increasingly concerned about whether credit risk affects the return on investment. This paper discusses the relationship between credit rating and momentum investment strategy. The research period is from January 2005 to December 2010, and the sample is the ordinary shares of companies listed on the Taiwan Stock Exchange (TSE). By calculating the cumulative returns of the investment portfolio over the holding period, and grouping the research samples by credit rating, this paper tests the relationship between credit rating and momentum investment strategy in Taiwan's stock market. Second, in exploring the factors affecting credit rating and stock returns, this paper uses variables including firm size, financial leverage, turnover rate, company age and industry to analyze the impact of factors such as information asymmetry and industry on the investment strategy. Moreover, this paper probes into the impact of the January effect and the business cycle on credit rating. The empirical results reveal that Taiwan's stock market does not have the momentum effect, although there is a reverse investment strategy. In other words, the returns of stocks in portfolios with better credit ratings are higher than those with poorer credit ratings, and the results are reversed if the reverse investment strategy is applied. The empirical results are not affected by adding variables such as firm size, financial leverage, turnover rate, company age and industry. Hence, the momentum investment strategy in Taiwan's stock market is not affected by credit rating.
Interactive Narrative Personalization with Deep Reinforcement Learning
Data-driven techniques for interactive narrative generation are the subject of growing interest. Reinforcement learning (RL) offers significant potential for devising data-driven interactive narrative generators that tailor players’ story experiences by inducing policies from player interaction logs. A key open question in RL-based interactive narrative generation is how to model complex player interaction patterns to learn effective policies. In this paper we present a deep RL-based interactive narrative generation framework that leverages synthetic data produced by a bipartite simulated player model. Specifically, the framework involves training a set of Q-networks to control adaptable narrative event sequences with long short-term memory network-based simulated players. We investigate the deep RL framework’s performance with an educational interactive narrative, CRYSTAL ISLAND. Results suggest that the deep RL-based narrative generation framework yields effective personalized interactive narratives.
Fracture toughness and fatigue crack growth characteristics of nanotwinned copper
Recent studies have shown that nanotwinned copper (NT Cu) exhibits a combination of high strength and moderate ductility. However, most engineering and structural applications would also require materials to have superior fracture toughness and prolonged subcritical fatigue crack growth life. The current study investigates the effect of twin density on the crack initiation toughness and stable fatigue crack propagation characteristics of NT Cu. Specifically, we examine the effects of tailored density of nanotwins, incorporated into a fixed grain size of ultrafine-grained (UFG) copper with an average grain size of 450 nm, on the onset and progression of subcritical fracture under quasi-static and cyclic loading at room temperature. We show here that processing-induced, initially coherent nanoscale twins in UFG copper lead to a noticeable improvement in damage tolerance under conditions of plane stress. This work strongly suggests that an increase in twin density, at a fixed grain size, is beneficial not only for desirable combinations of strength and ductility but also for enhancing damage tolerance characteristics such as fracture toughness, threshold stress intensity factor range for fatigue fracture and subcritical fatigue crack growth life. Possible mechanistic origins of these trends are discussed, along with issues and challenges in the study of damage tolerance in NT Cu.
Facebook in context(s): Measuring emotional responses across time and space
This article advances a contextual approach to understanding the emotional and social outcomes of Facebook use. In doing so, we address the ambiguity of previously reported relationships between Facebook use and well-being. We test temporal (shorter vs longer time spans) and spatial (at home vs away from home) dimensions of Facebook activity using an innovative approach. By triggering smartphone surveys in response to users' naturalistic Facebook posting, we captured the immediate context of both mobile and desktop activities during daily life. Findings indicated positive, yet fleeting, emotional experiences up to 10 minutes after active posting and higher arousal for 30 minutes following posting at home. Nonetheless, Facebook activities predicted no changes in aggregate mood over 2 weeks, despite showing positive relationships to bridging social capital during the same period. Our results call attention to fleeting experiences.
Where to publish and find ontologies? A survey of ontology libraries
One of the key promises of the Semantic Web is its potential to enable and facilitate data interoperability. The ability of data providers and application developers to share and reuse ontologies is a critical component of this data interoperability: if different applications and data sources use the same set of well defined terms for describing their domain and data, it will be much easier for them to "talk" to one another. Ontology libraries are the systems that collect ontologies from different sources and facilitate the tasks of finding, exploring, and using these ontologies. Thus ontology libraries can serve as a link in enabling diverse users and applications to discover, evaluate, use, and publish ontologies. In this paper, we provide a survey of the growing, and surprisingly diverse, landscape of ontology libraries. We highlight how the varying scope and intended use of the libraries affect their features, content, and potential exploitation in applications. From reviewing eleven ontology libraries, we identify a core set of questions that ontology practitioners and users should consider in choosing an ontology library for finding ontologies or publishing their own. We also discuss the research challenges that emerge from this survey, for the developers of ontology libraries to address.
Prevention of relapse/recurrence in major depression by mindfulness-based cognitive therapy.
This study evaluated mindfulness-based cognitive therapy (MBCT), a group intervention designed to train recovered recurrently depressed patients to disengage from dysphoria-activated depressogenic thinking that may mediate relapse/recurrence. Recovered recurrently depressed patients (n = 145) were randomized to continue with treatment as usual or, in addition, to receive MBCT. Relapse/recurrence to major depression was assessed over a 60-week study period. For patients with 3 or more previous episodes of depression (77% of the sample), MBCT significantly reduced risk of relapse/recurrence. For patients with only 2 previous episodes, MBCT did not reduce relapse/recurrence. MBCT offers a promising cost-efficient psychological approach to preventing relapse/recurrence in recovered recurrently depressed patients.
Phase changes during hygroscopic cycles of mixed organic/inorganic model systems of tropospheric aerosols.
A correct description of the aerosol's phases is required to determine its gas/particle partitioning, its reactivity and its water uptake and release. In this study, we investigate organic/electrolyte interactions of ammonium sulfate, nitrate and sodium chloride with substances containing carboxylic acids (COOH) and hydroxyl (OH) functional groups. As organic model compounds, we chose polyols with different OH/CHn (n = 0-3) ratios-namely, glycerol, 1,4-butanediol, and 1,2-hexanediol-as well as PEG 400 and a mixture of dicarboxylic acids consisting of malic, malonic, maleic, glutaric, and methylsuccinic acid. Bulk solubility and water activity measurements of these model systems together with a survey of literature data showed that NaCl is a salting-out agent for alcohols and organic acids whereas ammonium nitrate and sulfate exhibited salting-in and salting-out tendencies depending on the nature and number of functional groups as well as on the concentration of the solution. All investigated salts induce a liquid-liquid phase separation in the 1,2-hexanediol/water system. Considering the composition of the tropospheric aerosol, such phase separations might indeed occur frequently when particles in the atmosphere are exposed to varying relative humidity. To complement the bulk experiments, we investigated single particles consisting of ammonium sulfate and dicarboxylic acids as well as of ammonium sulfate and PEG 400 in an electrodynamic balance. Whereas the relative humidities of total deliquescence as well as the water uptake and release of the fully deliquesced particles are in good agreement with the bulk results and represent thermodynamic equilibrium, the water uptake before full deliquescence shows significant deviations. These deviations may be caused by morphological effects.
Challenges in a future IP/Ethernet-based in-car network for real-time applications
In current vehicles, a large number of control units are connected by several automotive-specific communication buses, facilitating innovative distributed applications. At the same time, computers and entertainment devices use IP and commodity communications technology like Ethernet to connect to the Internet, allowing for innovative solutions and maintaining fast innovation cycles. Today, one can see the first applications of Ethernet for in-vehicle communication in contemporary cars. In next-generation vehicles, many innovative applications could benefit from the increased bandwidth Ethernet can offer. Therefore, an examination of Ethernet usage for additional in-vehicle communication use cases is needed. In this paper, we show simulation results of promising use cases for in-car Ethernet, while looking at different realistic topologies, types of traffic, and configurations.
DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs
In this work we address the task of semantic image segmentation with Deep Learning and make three main contributions that are experimentally shown to have substantial practical merit. First, we highlight convolution with upsampled filters, or 'atrous convolution', as a powerful tool in dense prediction tasks. Atrous convolution allows us to explicitly control the resolution at which feature responses are computed within Deep Convolutional Neural Networks. It also allows us to effectively enlarge the field of view of filters to incorporate larger context without increasing the number of parameters or the amount of computation. Second, we propose atrous spatial pyramid pooling (ASPP) to robustly segment objects at multiple scales. ASPP probes an incoming convolutional feature layer with filters at multiple sampling rates and effective fields of view, thus capturing objects as well as image context at multiple scales. Third, we improve the localization of object boundaries by combining methods from DCNNs and probabilistic graphical models. The commonly deployed combination of max-pooling and downsampling in DCNNs achieves invariance but has a toll on localization accuracy. We overcome this by combining the responses at the final DCNN layer with a fully connected Conditional Random Field (CRF), which is shown both qualitatively and quantitatively to improve localization performance. Our proposed "DeepLab" system sets the new state of the art at the PASCAL VOC-2012 semantic image segmentation task, reaching 79.7 percent mIOU in the test set, and advances the results on three other datasets: PASCAL-Context, PASCAL-Person-Part, and Cityscapes. All of our code is made publicly available online.
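The parameter-free enlargement of the field of view is easy to see in code: a 3x3 convolution with dilation r covers a (2r+1)x(2r+1) window while keeping nine weights. The PyTorch sketch below assembles an ASPP-like module; the rates, channel sizes, and sum fusion are illustrative placeholders rather than the published configuration.

import torch
import torch.nn as nn

class ASPPSketch(nn.Module):
    # Parallel dilated 3x3 convolutions probe the same feature map at
    # several effective fields of view; their responses are fused by
    # summation here (one option among several).
    def __init__(self, in_ch, out_ch, rates=(6, 12, 18, 24)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=r, dilation=r)
            for r in rates)                     # padding=r preserves spatial size

    def forward(self, x):
        return torch.stack([b(x) for b in self.branches]).sum(dim=0)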
A graphical foundation for interleaving in game semantics
In 2007, Harmer, Hyland and Mellies gave a formal mathematical foundation for game semantics using a notion they called a ⊸-schedule, and the similar notion of ⊗-schedule, both structures describing interleavings of plays in games. Their definition was combinatorial in nature, but researchers often draw pictures when describing schedules in practice. Moreover, several proofs of key properties, such as that the composition of ⊸-schedules is associative, involve cumbersome combinatorial detail, whereas in terms of pictures the proof is straightforward, reflecting the geometry of the plane. Here, we give a geometric formulation of ⊸-schedules and ⊗-schedules, prove that they are isomorphic to Harmer et al.'s definitions, and illustrate their value by giving such geometric proofs. Harmer et al.'s notions may be combined to describe plays in multi-component games, and researchers have similarly developed intuitive graphical representations of plays in these games. We give a characterisation of these diagrams and explicitly describe how they relate to the underlying schedules, finally using this relation to provide new, intuitive proofs of key categorical properties.
Cuckoo: A Computation Offloading Framework for Smartphones
Offloading computation from smartphones to remote cloud resources has recently been rediscovered as a technique to enhance the performance of smartphone applications, while reducing the energy usage. In this paper we present the first practical implementation of this idea for Android: the Cuckoo framework, which simplifies the development of smartphone applications that benefit from computation offloading and provides a dynamic runtime system, that can, at runtime, decide whether a part of an application will be executed locally or remotely. We evaluate the framework using two real life applications.
A DISCOURSE-STYLISTIC ANALYSIS OF MOOD STRUCTURES IN SELECTED POEMS OF
From the available literature, J.P. Clark-Bekederemo's poetry has not been extensively studied from a linguistic perspective. Previous studies on the poet's work have concentrated on the literary and thematic features of the texts. The present study, therefore, examines mood structures (i.e. a grammatical category that pertains to the clause) in the poetry, in order to determine how language is used to express the manner of speaking of interlocutors, and their roles, judgments and attitudes in specific discourse contexts. With the aid of Halliday's Systemic Functional Grammar, particularly the tenor aspect of the interpersonal 'metafunction' (the other metafunctions being ideational and textual), the study highlights the nature of dialogue (i.e. mood structures) between interactants in the poetry, in relation to social contexts. The discourse-stylistic approach adopted for the study enables us to examine what is communicated (i.e. discourse) and how it is communicated (i.e. stylistics).
Managing Technical Debt in Enterprise Software Packages
We develop an evolutionary model and theory of software technical debt accumulation to facilitate a rigorous and balanced analysis of its benefits and costs in the context of a large commercial enterprise software package. Our theory focuses on the optimization problem involved in managing technical debt, and illustrates the different tradeoff patterns between software quality and customer satisfaction under early and late adopter scenarios at different lifecycle stages of the software package. We empirically verify our theory utilizing a ten year longitudinal data set drawn from 69 customer installations of the software package. We then utilize the empirical results to develop actionable policies for managing technical debt in enterprise software product adoption.
3D Object Modeling and Recognition Using Affine-Invariant Patches and Multi-View Spatial Constraints
This paper presents a novel representation for three-dimensional objects in terms of affine-invariant image patches and their spatial relationships. Multi-view constraints associated with groups of patches are combined with a normalized representation of their appearance to guide matching and reconstruction, allowing the acquisition of true three-dimensional affine and Euclidean models from multiple images and their recognition in a single photograph taken from an arbitrary viewpoint. The proposed approach does not require a separate segmentation stage and is applicable to cluttered scenes. Preliminary modeling and recognition results are presented.
Patients' perceptions of information and support received from the nurse specialist during HCV treatment.
AIM To identify patients' perceptions of support received from the nurse specialist during Hepatitis C virus (HCV) treatment. BACKGROUND HCV is a worldwide health problem. However, it is a treatable disease and treatment success rates are high. Unfortunately, treatment comes with a multitude of adverse side effects and patients require informational and psychological support from specialist nurses while on treatment. To date, there is little nursing research on support received from this specialist nursing care. DESIGN This study used a quantitative descriptive design. METHOD A 59-item questionnaire collected data from 106 patients with a diagnosis of HCV attending an HCV outpatient clinic. RESULTS Overall, patients were very satisfied with support received. Advice on contraception was well received. However, many patients did not feel supported with regard to advice on sleep management. There were no statistically significant differences between overall satisfaction and gender, age, genotype and risk factor. However, there were significant correlations found between support received and reported genotype. Those patients presenting with genotype 1, who are mostly infected through blood or blood products, indicated that they require more support in relation to information on side effects of treatment, quality of life and support groups. Specific approaches to support and advice for this cohort may need to be incorporated into current services. CONCLUSION Results of this study reinforce the need for the ongoing use of specialist nurse services and development of this service where no such facilities exist. In addition, the service may need to further recognise and support the information and psychological needs of patients with differing modes of HCV infection. RELEVANCE TO CLINICAL PRACTICE Findings provide information to practising nurse specialists about patients' views of information and support received from nurse specialists in HCV treatment centres and identify where deficits exist.
E-mail Header Injection Vulnerabilities
E-mail header injection is a class of vulnerability that can occur in web applications that use user input to construct e-mail messages. E-mail header injection is possible when the mailing script fails to check for the presence of e-mail headers in user input (either form fields or URL parameters). The vulnerability exists in the reference implementation of the built-in mail functionality in popular languages such as PHP, Java, Python, and Ruby. With the proper injection string, this vulnerability can be exploited to inject additional headers, modify existing headers, and alter the content of the e-mail.
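The root cause is that a CR/LF smuggled into a user-controlled field terminates the current header and lets the attacker append new ones. A minimal Python guard (illustrative, not taken from any particular language's reference implementation):

def safe_header_value(value):
    # Refuse user input containing newline characters before it reaches a
    # header such as Subject or From; this closes the injection channel.
    if "\r" in value or "\n" in value:
        raise ValueError("newline characters are not allowed in headers")
    return value

# Vulnerable pattern: headers = "Subject: %s\r\n" % user_subject
# Guarded pattern:    headers = "Subject: %s\r\n" % safe_header_value(user_subject)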
PEDOT:PSS Films with Metallic Conductivity through a Treatment with Common Organic Solutions of Organic Salts and Their Application as a Transparent Electrode of Polymer Solar Cells.
A transparent electrode is an indispensable component of optoelectronic devices, and there has been a search for substitutes for indium tin oxide (ITO) as the transparent electrode. Poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS) is a conducting polymer that is very promising as the next generation of materials for the transparent electrode if it can reach conductivity as high as that of ITO. Here, we report the treatment of PEDOT:PSS with organic solutions to significantly enhance its conductivity. Common organic solvents like dimethylformamide and γ-butyrolactone and common organic salts like methylammonium iodide and methylammonium bromide are used for the organic solutions. The conductivity of pristine PEDOT:PSS films is only ∼0.2 S/cm, and it can be increased to higher than 2100 S/cm. The conductivity enhancement is much more significant than in control treatments of PEDOT:PSS films with neat organic solvents or aqueous solutions of the organic salts. The mechanism for the conductivity enhancement is the synergetic effect of both the organic salts and organic solvents on the microstructure and composition of PEDOT:PSS: they induce the segregation of some PSSH chains from PEDOT:PSS. Highly conductive PEDOT:PSS films were studied as the transparent electrode of polymer solar cells. The photovoltaic efficiency is comparable to that with an ITO transparent electrode.
Incremental Bloom Filters
A Bloom filter is a randomized data structure for performing approximate membership queries. It is being increasingly used in networking applications, ranging from security to routing in peer-to-peer networks. In order to meet a given false positive rate, the amount of memory required by a Bloom filter is a function of the number of elements in the set. We consider the problem of minimizing the memory requirements in cases where the number of elements in the set is not known in advance but the distribution or moment information of the number of elements is known. We show how to exploit such information to minimize the expected amount of memory required for the filter. We also show how this approach can significantly reduce the memory requirement when Bloom filters are constructed for multiple sets in parallel. We show analytically, as well as through experiments on synthetic and trace data, that our approach leads to a one-to-three order of magnitude reduction in memory compared to a standard Bloom filter.
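For reference, the standard sizing that makes memory a function of the set size n is m = -n ln p / (ln 2)^2 bits with k = (m/n) ln 2 hash functions. The sketch below applies it to a distribution over n by taking a naive expectation; the paper's optimization over distribution and moment information is more refined than this.

import math

def bloom_parameters(n, p):
    # Standard Bloom filter sizing for n elements at false-positive rate p.
    m = math.ceil(-n * math.log(p) / (math.log(2) ** 2))   # bits
    k = max(1, round((m / n) * math.log(2)))               # hash functions
    return m, k

def expected_bits(dist, p):
    # dist: {n: probability}, e.g. {1000: 0.7, 10000: 0.3}.
    return sum(prob * bloom_parameters(n, p)[0] for n, prob in dist.items())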
Software Architecture as a Set of Architectural Design Decisions
Software architectures have high costs for change, are complex, and erode during evolution. We believe these problems are partially due to knowledge vaporization. Currently, almost all the knowledge and information about the design decisions the architecture is based on are implicitly embedded in the architecture, but lack a first-class representation. Consequently, knowledge about these design decisions disappears into the architecture, which leads to the aforementioned problems. In this paper, a new perspective on software architecture is presented, which views software architecture as a composition of a set of explicit design decisions. This perspective makes architectural design decisions an explicit part of a software architecture. Consequently, knowledge vaporization is reduced, thereby alleviating some of the fundamental problems of software architecture.
A De Broglie-Like Wave in the Planetary Systems
In this work we carry out an "interpolation" of Scardigli's theory of a quantum-like description of the planetary system that reproduces the remarkable Titius-Bode-Richardson rule. More precisely, instead of the simple, approximate, Bohr-like theory, or the accurate, Schrödinger-like theory, considered by Scardigli, we originally suggest a semi-accurate, de Broglie-like description of the planetary system. In particular, we propose de Broglie-like waves in planetary systems. In distinction from Scardigli (who postulated the absence of interference phenomena at planet orbits), we prove that, roughly speaking, planet orbits equal sums of natural numbers of two types, large and small, of the de Broglie-like waves. This is similar to the well-known situation in atomic physics, where Bohr's momentum quantization postulate is interpreted via the de Broglie relation.
Hypothyroidism: an update.
Hypothyroidism is a clinical disorder commonly encountered by the primary care physician. Untreated hypothyroidism can contribute to hypertension, dyslipidemia, infertility, cognitive impairment, and neuromuscular dysfunction. Data derived from the National Health and Nutrition Examination Survey suggest that about one in 300 persons in the United States has hypothyroidism. The prevalence increases with age, and is higher in females than in males. Hypothyroidism may occur as a result of primary gland failure or insufficient thyroid gland stimulation by the hypothalamus or pituitary gland. Autoimmune thyroid disease is the most common etiology of hypothyroidism in the United States. Clinical symptoms of hypothyroidism are nonspecific and may be subtle, especially in older persons. The best laboratory assessment of thyroid function is a serum thyroid-stimulating hormone test. There is no evidence that screening asymptomatic adults improves outcomes. In the majority of patients, alleviation of symptoms can be accomplished through oral administration of synthetic levothyroxine, and most patients will require lifelong therapy. Combination triiodothyronine/thyroxine therapy has no advantages over thyroxine monotherapy and is not recommended. Among patients with subclinical hypothyroidism, those at greater risk of progressing to clinical disease, and who may be considered for therapy, include patients with thyroid-stimulating hormone levels greater than 10 mIU per L and those who have elevated thyroid peroxidase antibody titers.
Lymphomatoid papulosis and associated lymphomas: a retrospective case series of 84 patients.
AIMS To determine incidence and risk factors for developing lymphoma in patients with lymphomatoid papulosis (LyP), and to identify putative triggers amenable to treatment. METHODS The prognostic effect of severity of LyP, viral infection by history or serology, beta-2-microglobulin level, lactic dehydrogenase (LDH) level, and CD4 : CD8 ratio were evaluated using logistic regression models. Responses to prophylactic or palliative treatment were assessed. RESULTS In total, 84 patients (38 men, 46 women, median age at diagnosis 48.5 years) were identified. Of these, 34 (40%) were also diagnosed with one or more lymphomas: 16 (19%) had mycosis fungoides, 15 (17%) had primary cutaneous anaplastic large-cell lymphoma (pcALCL), 2 had both MF and pcALCL, and 1 had MF with large cell transformation and 1 had peripheral T-cell lymphoma. Of the 61 people presenting with LyP alone, only 11 (18%) subsequently developed lymphoma, with a median onset of 17.6 years (95% CI: 4, not obtained). Men were 2.5 times more likely than women to develop lymphoma (P = 0.04). Exposure to Epstein-Barr virus (EBV) was associated with an increase in incidence of 4.8 times (P = 0.16). Treatment for a putative infectious trigger resulted in improvement for 15 of 24 patients (63%). CONCLUSION Referral bias may explain the higher (40%) incidence of lymphoma in this population of LyP patients, compared with the 10-20% incidence commonly cited in the literature. In the subset of patients presenting with LyP alone, only 18% later developed lymphoma. Male patients or patients with prior EBV infection may have a higher risk for developing lymphoma, and some patients improved with treatment of putative infectious triggers.
SpiderMAV: Perching and stabilizing micro aerial vehicles with bio-inspired tensile anchoring systems
Whilst Micro Aerial Vehicles (MAVs) possess a variety of promising capabilities, their high energy consumption severely limits applications where flight endurance is of high importance. Reducing energy usage is one of the main challenges in advancing aerial robot utility. To address this bottleneck in the development of unmanned aerial vehicle applications, this work proposes a bio-inspired mechanical approach and develops an aerial robotic system for greater endurance enabled by low-power station-keeping. The aerial robotic system consists of a multirotor MAV and anchoring modules capable of launching multiple tensile anchors to fixed structures in its operating envelope. The resulting tensile perch provides a mechanically stabilized mode for high-accuracy operation in a 3D workspace. We explore generalised geometric and static modelling of the stabilisation concept using screw theory. Following the analytical modelling of the integrated robotic system, the tensile anchoring modules employing high-pressure gas actuation are designed, prototyped and then integrated into a quadrotor platform. The presented design is validated with experimental tests, demonstrating the stabilization capability even in a windy environment.
Understanding optimal caching and opportunistic caching at "the edge" of information-centric networks
A formal framework is presented for the characterization of cache allocation models in Information-Centric Networks (ICN). The framework is used to compare the performance of optimal caching everywhere in an ICN with opportunistic caching of content only near its consumers. This comparison is made using the independent reference model adopted in all prior studies, as well as a new model that captures non-stationary reference locality in space and time. The results obtained analytically and from simulations show that optimal caching throughout an ICN and opportunistic caching at the edge routers of an ICN perform comparably. In addition, caching content opportunistically only near its consumers is shown to outperform the traditional on-path caching approach assumed in most ICN architectures in an unstructured network with arbitrary topology represented as a random geometric graph.
When symptoms persist: choosing among alternative somatic treatments for schizophrenia.
Many patients with schizophrenia continue to have significant disabling symptoms despite adequate trials of different types and doses of traditional neuroleptics. Clinicians treating these neuroleptic-resistant patients must look to other treatments in the hope of providing some relief. The literature on many of the alternative treatments is too scanty for firm conclusions. We offer criteria for deciding which treatments may warrant consideration. We review the evidence for the eight treatments we found to meet these criteria and discuss clinical points salient to their use in this population. Although not always conclusive, the data do offer clues for treatment guidelines and an approach to choosing among the available treatments is suggested.
Framing Public Opinion in Competitive Democracies
What is the effect of democratic competition on the power of elites to frame public opinion? We address this issue first by defining the range of competitive contexts that might surround any debate over a policy issue. We then offer a theory that predicts how audiences, messages, and competitive environments interact to influence the magnitude of framing effects. These hypotheses are tested using experimental data gathered on the opinions of adults and college students toward two policy issues: the management of urban growth and the right of an extremist group to conduct a rally. Our results indicate that framing effects depend more heavily on the qualities of frames than on their frequency of dissemination and that competition alters but does not eliminate the influence of framing. We conclude by discussing the implications of these results for the study of public opinion and democratic political debate.
Tropospheric response to stratospheric perturbations in a relatively simple general circulation model
The sensitivity of the tropospheric extratropical circulation to thermal perturbations of the polar stratosphere is examined in a dry primitive equation general circulation model with zonally symmetric forcing and boundary conditions. For sufficiently strong cooling of the polar winter stratosphere, the winter-hemisphere tropospheric jet shifts polewards and strengthens markedly at the surface; this is accompanied by a drop in surface pressure at high latitudes in the same hemisphere. In addition, this extratropical tropospheric response is found to be very similar to the model's leading pattern of internal variability. These results are tested for robustness at several horizontal and vertical resolutions, and the same tropospheric response is observed at all but the lowest resolution tested. The behavior of this relatively simple model is broadly consistent with recent observational and modeling studies of trends in extratropical atmospheric variability.
Dining Cryptographers Revisited
Dining cryptographers networks (or DC-nets) are a privacy-preserving primitive devised by Chaum for anonymous message publication. A very attractive feature of the basic DC-net is its non-interactivity. Subsequent to key establishment, players may publish their messages in a single broadcast round, with no player-to-player communication. This feature is not possible in other privacy-preserving tools like mixnets. A drawback to DC-nets, however, is that malicious players can easily jam them, i.e., corrupt or block the transmission of messages from honest parties, and may do so without being traced. Several researchers have proposed valuable methods of detecting cheating players in DC-nets. This is usually at the cost, however, of multiple broadcast rounds, even in the optimistic case, and often of high computational and/or communications overhead, particularly for fault recovery. We present new DC-net constructions that simultaneously achieve noninteractivity and high-probability detection and identification of cheating players. Our proposals are quite efficient, imposing a basic cost that is linear in the number of participating players. Moreover, even in the case of cheating in our proposed system, just one additional broadcast round suffices for full fault recovery. Among other tools, our constructions employ bilinear maps, a recently popular cryptographic technique for reducing communication complexity.
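The non-interactive broadcast round of the basic scheme is simple to state: every pair of players shares a random pad, each player broadcasts the XOR of its pads (the sender also XORs in the message), and the pads cancel in the combined broadcast. The Python sketch below models only this baseline, not the paper's cheater-detection constructions.

import secrets

def dc_net_round(n_players, sender, message):
    # One broadcast round of a basic XOR DC-net (Chaum's original scheme).
    length = len(message)
    pads = {(i, j): secrets.token_bytes(length)          # pairwise shared pads
            for i in range(n_players) for j in range(i + 1, n_players)}

    def player_output(i):
        out = bytearray(length)
        for (a, b), pad in pads.items():
            if i in (a, b):
                out = bytearray(x ^ y for x, y in zip(out, pad))
        if i == sender:                                  # sender XORs in the message
            out = bytearray(x ^ y for x, y in zip(out, message))
        return bytes(out)

    combined = bytearray(length)
    for i in range(n_players):
        combined = bytearray(x ^ y for x, y in zip(combined, player_output(i)))
    return bytes(combined)                               # pads cancel pairwise

assert dc_net_round(5, sender=2, message=b"hi") == b"hi"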
Scalable Big Data Architecture
Scalable Big Data Architecture covers real-world, concrete industry use cases that leverage complex distributed applications, which involve web applications, RESTful APIs, and high throughput of large amounts of data stored in highly scalable NoSQL data stores such as Couchbase and Elasticsearch. This book demonstrates how data processing can be done at scale, from the usage of NoSQL datastores to the combination of big data distribution.
MICROSURGICAL DEVICES BY POP-UP BOOK MEMS
The small scale of microsurgery poses significant challenges for developing robust and dexterous tools to grip, cut, and join sub-millimeter structures such as vessels and nerves. The main limitation is that traditional manufacturing techniques are not optimized to create smart, articulating structures in the 0.1-10 mm scale. Pop-up book MEMS is a new fabrication technology that promises to overcome this challenge and enable the monolithic fabrication of complex, articulated structures with an extensive catalog of materials, embedded electrical components, and automated assembly with feature sizes down to 20 microns. In this paper, we demonstrate a proof-of-concept microsurgical gripper and evaluate its performance at the component and device level to characterize its strength and robustness. 1-DOF flexible hinge joints that constrain motion and allow for out-of-plane actuation were found to resist torsional loads of 22.8±2.15 N·mm per mm of hinge width. Adhesive lap joints that join individual layers in the laminate structure demonstrated a shear strength of 26.8±0.53 N/mm. The laminate structures were also shown to resist peel loads of 0.72±0.10 N/mm. Various flexible hinge and adhesive lap components were then designed into an 11-layered structure which 'pops up' to realize an articulating microsurgical gripper that includes a cable-driven mechanism for gripping actuation and a flexural return spring to passively open the gripper. The gripper prototype, with a final weight of 200 mg, overall footprint of 18 mm by 7.5 mm, and features as small as 200 microns, is able to deftly manipulate objects 100 times its own weight at the required scale, thus demonstrating its potential for use in microsurgery. INTRODUCTION Small joint surgery, such as that in the wrist or fingers, presents a number of significant challenges due to the limited maneuverable workspace and the presence of many delicate structures that must be avoided, including sensitive cartilage surfaces and tendons [1]. Current commercial small-joint surgical instruments are limited to straight, simple tools without any distal articulation which would allow for greater access and dexterity inside the joint [2]. In addition, the robust electromechanical surgical tools at the sub-mm scale required for these procedures are either impossible or commercially impractical to make with existing manufacturing techniques such as surface/bulk micromachining [3], wire-EDM [4], microinjection molding, or micromilling/lathing [5]. It is our goal to apply an emerging micromachining and assembly technique that we have developed to enable robust, dexterous, and practical microsurgical instruments for small joint repair. We have developed a novel micro-manufacturing technique known as Pop-Up Book MEMS ('Pop-Ups') that allows for the fabrication of complex, multi-functional electromechanical devices on the 0.1-10 mm scale [6] [7]. Pop-Up technology enables the ability to create 3-D, multi-material, monolithic meso and micro-structures using purely 2-D planar
So-Called Non-Subsective Adjectives
The interpretation of adjective-noun pairs plays a crucial role in tasks such as recognizing textual entailment. Formal semantics often places adjectives into a taxonomy which should dictate adjectives' entailment behavior when placed in adjective-noun compounds. However, we show experimentally that the behavior of subsective adjectives (e.g. red) versus non-subsective adjectives (e.g. fake) is not as cut-and-dried as often assumed. For example, inferences are not always symmetric: while ID is generally considered to be mutually exclusive with fake ID, fake ID is considered to entail ID. We discuss the implications of these findings for automated natural language understanding.
Noise cancellation of memristive neural networks
This paper investigates the noise cancellation problem for memristive neural networks. Based on reproducible gradual resistance tuning in bipolar mode, a first-order voltage-controlled memristive model with asymmetric voltage thresholds is employed. Since memristive devices are tiny enough to be densely packed in crossbar-like structures and possess the long-term memory needed by neuromorphic synapses, this paper shows how to approximate the behavior of synapses in neural networks using this memristive device. Certain templates of memristive neural networks are also established to implement the noise cancellation.
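A generic first-order voltage-controlled model with asymmetric thresholds can be integrated with simple Euler steps, as in the sketch below; the thresholds and rate constants are illustrative placeholders, not fitted device parameters from the paper.

def memristor_state_step(x, v, dt, v_on=-0.8, v_off=0.9, k_on=-50.0, k_off=40.0):
    # The internal state x in [0, 1] (the synaptic weight analogue) drifts
    # only when the applied voltage v crosses one of the asymmetric thresholds.
    if v <= v_on:
        dxdt = k_on * (v_on - v)     # k_on < 0: state decreases under strong negative bias
    elif v >= v_off:
        dxdt = k_off * (v - v_off)   # k_off > 0: state increases under strong positive bias
    else:
        dxdt = 0.0                   # no drift between the thresholds (memory retention)
    return min(1.0, max(0.0, x + dt * dxdt))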
The Interdisciplinary Curriculum for Oncology Palliative Care Education (iCOPE): meeting the challenge of interprofessional education.
Background: Interprofessional education is necessary to prepare students of the health professions for successful practice in today's health care environment. Because of its expertise in interdisciplinary practice and team-based care, palliative care should be leading the way in creating educational opportunities for students to learn the skills for team practice and provision of quality patient-centered care. Multiple barriers exist that can discourage those desiring to create and implement truly interdisciplinary curriculum. DESIGN An interdisciplinary faculty team planned and piloted a mandatory interdisciplinary palliative oncology curriculum and responded to formative feedback. SETTING/SUBJECTS The project took place at a large public metropolitan university. Medical, nursing, and social work students and chaplains completing a clinical pastoral education internship participated in the curriculum. MEASUREMENTS Formative feedback was received via the consultation of an interdisciplinary group of palliative education experts, focus groups from students, and student evaluations of each learning modality. RESULTS Multiple barriers were experienced and successfully addressed by the faculty team. Curricular components were redesigned based on formative feedback. Openness to this feedback coupled with flexibility and compromise enabled the faculty team to create an efficient, sustainable, and feasible interdisciplinary palliative oncology curriculum. CONCLUSION Interdisciplinary palliative education can be successful if faculty teams are willing to confront challenges, accept feedback on multiple levels, and compromise while maintaining focus on desired learner outcomes.
BigDataScript: a scripting language for data pipelines
MOTIVATION The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. RESULTS We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. AVAILABILITY AND IMPLEMENTATION BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript.
A method of driving fatigue detection based on eye location
For a driver fatigue monitoring system, one of the most important problems to solve is eye location. This paper presents a method of eye location based on the Active Appearance Model (AAM) and a method of driver fatigue detection based on an improved calculation of PERCLOS. The AAM couples an appearance model with a fitting algorithm, providing an efficient way to extract facial feature points. The AAM is first established; the fitting algorithm is then used to detect the position of the face and locate the eyes; finally, the improved PERCLOS measure is calculated to determine whether the driver's current state is fatigued or not.
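The abstract's improved PERCLOS formula is not given, so the sketch below implements only the conventional definition: the fraction of frames within a sliding window in which the eye is at least 80% closed. The window length, closure threshold, and fatigue cutoff are assumptions.

```python
from collections import deque

PERCLOS_WINDOW = 900        # frames in the sliding window (e.g., 30 s at 30 fps)
CLOSURE_THRESH = 0.8        # eye counted as closed when at least 80% closed
FATIGUE_THRESH = 0.15       # assumed PERCLOS level that flags fatigue

closed_flags = deque(maxlen=PERCLOS_WINDOW)

def update_perclos(eye_openness):
    """eye_openness in [0, 1], e.g., eyelid distance from AAM landmarks
    normalized by its calibrated open-eye value."""
    closed_flags.append(eye_openness < (1.0 - CLOSURE_THRESH))
    perclos = sum(closed_flags) / len(closed_flags)
    return perclos, perclos > FATIGUE_THRESH

# Example: a stream of openness values with a long blink episode.
for v in [1.0] * 100 + [0.1] * 50:
    p, drowsy = update_perclos(v)
print(round(p, 3), drowsy)
```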
Secure Session Mobility using Hierarchical Authentication Key Management in Next Generation Networks
In this paper we propose a novel authentication mechanism for session mobility in Next Generation Networks (NGNs) named Hierarchical Authentication Key Management (HAKM). The design objectives of HAKM are twofold: i) to minimize the authentication latency in NGNs; ii) to provide protection against an assortment of attacks such as denial-of-service attacks, man-in-the-middle attacks, guessing attacks, and node-capture attacks. In order to achieve these objectives, we combine the Session Initiation Protocol (SIP) with Hierarchical Mobile IPv6 (HMIPv6) to perform local authentication for session mobility. The concept of group keys and pairwise keys with a one-way hash function is employed to make HAKM robust against the aforesaid attacks. The performance analysis and numerical results demonstrate that HAKM outperforms the existing approaches in terms of latency and protection against the abovementioned attacks.
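The paper's message flow is not detailed in the abstract; as a generic illustration of combining a group key, pairwise keys, and a one-way hash, here is a Python sketch in which the node identifiers and key-derivation layout are assumptions.

```python
import hashlib
import os

def h(*parts: bytes) -> bytes:
    """One-way hash used for key derivation."""
    return hashlib.sha256(b"||".join(parts)).digest()

group_key = os.urandom(32)   # distributed out of band within the hierarchy

def pairwise_key(gk, id_a, id_b, nonce):
    """Derive a pairwise key bound to both identities and a fresh nonce,
    so a captured node cannot replay another pair's traffic."""
    lo, hi = sorted([id_a, id_b])
    return h(gk, lo, hi, nonce)

nonce = os.urandom(16)
k_ab = pairwise_key(group_key, b"MN-17", b"MAP-3", nonce)  # hypothetical IDs
print(k_ab.hex())
```

Because the hash is one-way, exposing a derived pairwise key does not reveal the group key, and the nonce blocks straightforward replay.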
New approach for liveness detection in fingerprint scanners based on valley noise analysis
Recently, research has shown that it is possible to spoof a variety of fingerprint scanners using some simple techniques with molds made from plastic, clay, Play-Doh, silicone or gelatin materials. To protect against spoofing, methods of liveness detection measure physiological signs of life from fingerprints ensuring only live fingers are captured for enrollment or authentication. In this paper, a new liveness detection method is proposed which is based on noise analysis along the valleys in the ridge-valley structure of fingerprint images. Unlike live fingers which have a clear ridge-valley structure, artificial fingers have a distinct noise distribution due to the material’s properties when placed on a fingerprint scanner. Statistical features are extracted in multiresolution scales using wavelet decomposition technique. Based on these features, liveness separation (live/non-live) is performed using classification trees and neural networks. We test this method on the dataset which contains about 58 live, 80 spoof (50 made from Play-Doh and 30 made from gelatin), and 25 cadaver subjects for 3 different scanners. Also, we test this method on a second dataset which contains 28 live and 28 spoof (made from silicone) subjects. Results show that we can get approximately 90.9-100% classification of spoof and live fingerprints. The proposed liveness detection method is purely software based and application of this method can provide anti-spoofing protection for fingerprint scanners.
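A sketch of the feature-extraction stage using PyWavelets: per-level statistics of wavelet detail coefficients stand in for the paper's valley-noise statistics (locating valley pixels is omitted), and the wavelet family, depth, and chosen statistics are assumptions.

```python
import numpy as np
import pywt

def valley_noise_features(img, levels=3, wavelet="db4"):
    """Multiresolution statistics of a fingerprint image's detail bands.
    In the paper the statistics are gathered along valley pixels; here
    whole-image detail coefficients serve as a stand-in."""
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=levels)
    feats = []
    for cH, cV, cD in coeffs[1:]:          # skip the approximation band
        for band in (cH, cV, cD):
            feats += [band.mean(), band.std(), np.abs(band).mean()]
    return np.array(feats)

# A live finger shows low valley noise; spoof materials leave a distinct
# noise distribution that a classification tree or neural net can separate.
img = np.random.rand(256, 256)             # placeholder for a scanner image
print(valley_noise_features(img).shape)
```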
What Shall We Tell the Children? The Press Encounters Columbus
NATIONS OUTGROW HEROES, the way children must learn to live without Santa. Lenin is being toppled in Russia. The Federal government removed Custer's name from his national park. It is now Christopher Columbus's turn at the chopping block. Journalists, editorial writers, and talk show hosts suddenly see in Columbus a hot prospect, now that he has become "controversial." This article looks at some of the opening skirmishes of this surprising Quincentenary year conflict over a hero, as presented in the press, with a special focus on how this conflict has become the bugbear of right-wing commentators. James Axtell, in 1987, sounded one of the first scholarly warnings that all was not well with how events surrounding 1492 are presented in American history textbooks. His reading of the most popular college texts revealed gross distortions of the reality of the "discovery." He advised school boards and teachers to stop adopting textbooks "that are hopelessly outdated, stylistically painful, and cratered with crucial omissions." The academic community was not prepared to make an adequate response to his critique. Perhaps too many specialists in America and Europe saw in the upcoming anniversary of the first Atlantic crossing an opportunity to gain funding for big projects such as getting the Nuova Raccolta Colombiana translated, producing a fresh twelve-volume collection of the major
Sub-nanosecond avalanche transistor drivers for low impedance pulsed power applications
Ultra compact, short pulse, high voltage, high current pulsers are needed for a variety of non-linear electrical and optical applications. With a fast risetime and short pulse width, these drivers are capable of producing sub-nanosecond electrical and thus optical pulses by gain switching semiconductor laser diodes. Gain-switching of laser diodes requires a sub-nanosecond pulser capable of driving a low output impedance (5 Ω or less). Optical pulses obtained had risetimes as fast as 20 ps. The designed pulsers also could be used for triggering photo-conductive semiconductor switches (PCSS), gating high speed optical imaging systems, and providing electrical and optical sources for fast transient sensor applications. Building on concepts from Lawrence Livermore National Laboratory, the development of pulsers based on solid state avalanche transistors was adapted to drive low impedances. As each successive stage is avalanched in the circuit, the amount of overvoltage increases, increasing the switching speed and improving the turn on time of the output pulse at the final stage. The output of the pulser is coupled into the load using a Blumlein configuration.
Analytical techniques for quantification of amorphous/crystalline phases in pharmaceutical solids.
The existence of different solid-state forms such as polymorphs, solvates, hydrates, and amorphous form in pharmaceutical drug substances and excipients, along with their downstream consequences in drug products and biological systems, is well documented. Out of these solid states, amorphous systems have attracted considerable attention of formulation scientists for their specific advantages, and their presence, either by accident or design is known to incorporate distinct properties in the drug product. Identification of different solid-state forms is crucial to anticipate changes in the performance of the material upon storage and/or handling. Quantitative analysis of physical state is imperative from the viewpoint of both the manufacturing and the regulatory control aimed at assuring safety and efficacy of drug products. Numerous analytical techniques have been reported for the quantification of amorphous/crystalline phase, and implicit in all quantitative options are issues of accuracy, precision, and suitability. These quantitative techniques mainly vary in the properties evaluated, thus yielding divergent values of crystallinity for a given sample. The present review provides a compilation of the theoretical and practical aspects of existing techniques, thereby facilitating the selection of an appropriate technique to accomplish various objectives of quantification of amorphous systems.
Fine-Grained Land Use Classification at the City Scale Using Ground-Level Images
We perform fine-grained land use mapping at the city scale using ground-level images. Mapping land use is considerably more difficult than mapping land cover and is generally not possible using overhead imagery as it requires close-up views and seeing inside buildings. We postulate that the growing collections of georeferenced, ground-level images suggest an alternate approach to this geographic knowledge discovery problem. We develop a general framework that uses Flickr images to map 45 different land-use classes for the City of San Francisco. Individual images are classified using a novel convolutional neural network containing two streams, one for recognizing objects and another for recognizing scenes. This network is trained in an end-to-end manner directly on the labeled training images. We propose several strategies to overcome the noisiness of our user-generated data including search-based training set augmentation and online adaptive training. We derive a ground truth map of San Francisco in order to evaluate our method. We demonstrate the effectiveness of our approach through geo-visualization and quantitative analysis. Our framework achieves over 29% recall at the individual land parcel level which represents a strong baseline for the challenging 45-way land use classification problem especially given the noisiness of the image data.
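The paper trains its own two-stream network end to end; the PyTorch sketch below only illustrates the architecture pattern (an object stream and a scene stream fused before a 45-way classifier), with the backbone choice and fusion-by-concatenation as assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

class TwoStreamLandUse(nn.Module):
    """Object stream + scene stream, concatenated, 45 land-use classes."""
    def __init__(self, num_classes=45):
        super().__init__()
        self.object_stream = models.resnet18(weights=None)  # assumed backbone
        self.scene_stream = models.resnet18(weights=None)
        feat = self.object_stream.fc.in_features
        self.object_stream.fc = nn.Identity()   # keep features, drop heads
        self.scene_stream.fc = nn.Identity()
        self.classifier = nn.Linear(2 * feat, num_classes)

    def forward(self, x):
        z = torch.cat([self.object_stream(x), self.scene_stream(x)], dim=1)
        return self.classifier(z)

model = TwoStreamLandUse()
logits = model(torch.randn(2, 3, 224, 224))     # a batch of two image crops
print(logits.shape)                             # torch.Size([2, 45])
```

In practice the two streams would start from object- and scene-pretrained weights respectively, which is what gives the fusion its complementary signal.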
Psychological testing and assessment in the 21st century.
As spin-offs of the current revolution in the cognitive and neurosciences, clinical neuropsychologists in the 21st century will be using biological tests of intelligence and cognition that record individual differences in brain functions at the neuromolecular, neurophysiologic, and neurochemical levels. Assessment of patients will focus more on better use of still intact functions, as well as rehabilitating or bypassing impaired functions, than on diagnosis, as is the focus today. Better developed successors to today's scales for assessing personal competency and adaptive behavior, as well as overall quality of life, also will be in wide use in clinical settings. With more normal individuals, use of new generations of paper-and-pencil inventories, as well as biological measures for assessing differences in interests, attitudes, personality styles, and predispositions, is predicted.
Hardware Implementation of the Time-Triggered Ethernet Controller
Time-triggered (TT) Ethernet is a novel communication system that integrates real-time and non-real-time traffic into a single communication architecture. A TT Ethernet system consists of a set of nodes interconnected by a specific switch called the TT Ethernet switch. A node consists of a TT Ethernet communication controller that executes the TT Ethernet protocol and a host computer that executes the user application. The protocol distinguishes between event-triggered (ET) and time-triggered (TT) Ethernet traffic. Time-triggered traffic is scheduled and transmitted with a predictable transmission delay, whereas event-triggered traffic is transmitted on a best-effort basis. The event-triggered traffic in TT Ethernet is handled in conformance with the existing Ethernet standards of the IEEE. This paper presents the design of the TT Ethernet communication controller optimized for implementation in hardware. The paper describes a prototypical implementation using a custom-built hardware platform and presents the results of evaluation experiments.
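As a toy illustration of the dispatching behavior described above (not the controller's actual design), the Python sketch below sends TT frames at fixed slots of a cyclic schedule and fills the remaining slots with queued ET frames; the schedule layout is assumed.

```python
from collections import deque

CYCLE = 10                                     # slots per cluster cycle
tt_schedule = {0: "TT-msg-A", 5: "TT-msg-B"}   # assumed static TT schedule
et_queue = deque()

def dispatch(slot, wire):
    """Called once per time slot; TT traffic preempts, ET fills the gaps."""
    phase = slot % CYCLE
    if phase in tt_schedule:
        wire.append((slot, tt_schedule[phase]))   # predictable delay
    elif et_queue:
        wire.append((slot, et_queue.popleft()))   # best-effort
    # else: idle slot

wire = []
et_queue.extend(["ET-frame-1", "ET-frame-2"])
for slot in range(12):
    dispatch(slot, wire)
print(wire)   # TT frames always leave at phases 0 and 5, ET in between
```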
Students' perceptions of Facebook for academic purposes
Facebook is the most popular Social Network Site (SNS) among college students. Despite the popularity and extensive use of Facebook by students, its use has not made significant inroads into classroom usage. In this study, we seek to examine why this is the case and whether it would be worthwhile for faculty to invest the time to integrate Facebook into their teaching. To this end, we decided to undertake a study with a sample of 214 undergraduate students at the University of Huelva (Spain). We applied the structural equation model specifically designed by Mazman and Usluel (2010) to identify the factors that may motivate these students to adopt and use social network tools, specifically Facebook, for educational purposes. According to our results, Social Influence is the most important factor in predicting the adoption of Facebook; students are influenced to adopt it to establish or maintain contact with other people with whom they share interests. Regarding the purposes of Facebook usage, Social Relations is perceived as the most important factor among all of the purposes collected. Our findings also revealed that the educational use of Facebook is explained directly by its purposes of usage and indirectly by its adoption.
Two-factor authentication: too little, too late
Two-factor authentication isn't our savior. It won't defend against phishing. It's not going to prevent identity theft. It's not going to secure online accounts from fraudulent transactions. It solves the security problems we had 10 years ago, not the security problems we have today. The problem with passwords is that it is too easy to lose control of them. People give their passwords to other people. People write them down, and other people read them. People send them in email, and that email is intercepted. People use them to log into remote servers, and their communications are eavesdropped on. Passwords are also easy to guess. And once any of that happens, the password no longer works as an authentication token because you can never be sure who is typing in that password. Two-factor authentication mitigates this problem. If your password includes a number that changes every minute, or a unique reply to a random challenge, then it's difficult for someone else to intercept. You can't write down the ever-changing part. An intercepted password won't be usable the next time it's needed. And a two-factor password is more difficult to guess. Sure, someone can always give his password and token to his secretary, but no solution is foolproof. These tokens have been around for at least two decades, but it's only recently that they have received mass-market attention. AOL is rolling them out. Some banks are issuing them to customers, and even more are talking about doing it. It seems that corporations are finally recognizing the fact that passwords don't provide adequate security, and are hoping that two-factor authentication will fix their problems. Unfortunately, the nature of attacks has changed over those two decades. Back then, the threats were all passive: eavesdropping and offline password guessing. Today, the threats are more active: phishing and Trojan horses. Two new active attacks we're starting to see include: Man-in-the-Middle Attack. An attacker puts up a fake bank Web site and entices a user to that Web site. The user types in his password, and the attacker in turn uses it to access the bank's real Web site. Done correctly, the user will never realize that he isn't at the bank's Web site. Then the attacker either disconnects the user and makes any fraudulent transactions he wants, or passes along the user's banking transactions while making his own transactions at the same …
Modeling Satire in English Text for Automatic Detection
According to the Merriam-Webster dictionary, satire is a trenchant wit, irony, or sarcasm used to expose and discredit vice or folly. Though it is an important language aspect used in everyday communication, the study of satire detection in natural text is often ignored. In this paper, we identify key value components and features for automatic satire detection. Our experiments have been carried out on three datasets, namely, tweets, product reviews and newswire articles. We examine the impact of a number of state-of-the-art features as well as new generalized textual features. By using these features, we outperform the state of the art by a significant 6% margin.
Visible Machine Learning for Biomedicine
A major ambition of artificial intelligence lies in translating patient data to successful therapies. Machine learning models face particular challenges in biomedicine, however, including handling of extreme data heterogeneity and lack of mechanistic insight into predictions. Here, we argue for "visible" approaches that guide model structure with experimental biology.
Cerebral neoplastic enhancing lesions: multicenter, randomized, crossover intraindividual comparison between gadobutrol (1.0M) and gadoterate meglumine (0.5M) at 0.1 mmol Gd/kg body weight in a clinical setting.
OBJECTIVE Two macrocyclic extracellular contrast agents, one-molar neutral gadobutrol and ionic gadoterate meglumine, were compared to determine the overall preference for one or the other in a clinical setting. MATERIALS AND METHODS Multicenter, randomized, single-blind, intra-individually controlled, comparison study with a corresponding blinded read. Efficacy analysis was based on 136 patients who underwent identical MRI examinations: group A first received 1.0M gadobutrol followed by 0.5M gadoterate meglumine 48 h to 7 days later; group B had a reversed administration order. Three independent blinded readers assessed off-site their overall diagnostic preference (primary efficacy parameter) based on a matched pairs approach. RESULTS Superiority of gadobutrol over gadoterate meglumine was demonstrated for the qualitative assessment of overall preference across all readers by a statistically significant difference between both contrast agents for this primary endpoint. Preferences in lesion enhancement (secondary endpoint) were also found significantly in favor of gadobutrol. For preference in lesion delineation from surrounding tissue/edema and for internal structure only a trend towards a higher proportion for gadobutrol was found (except for internal structure reported by one reader, which showed a result of statistical significance). Lesion contrast and relative lesion enhancement (quantitative parameters) were statistically significantly higher for gadobutrol compared to gadoterate meglumine. CONCLUSION Contrast-enhanced MRI of neoplastic brain lesions at a dose of 0.1 mmol Gd/kg body weight, assessed in a standardized off-site blinded reading, results in a significantly higher qualitative and quantitative preference for gadobutrol compared to gadoterate meglumine.
All work and no play: Measuring fun, usability, and learning in software for children
This paper describes an empirical study of fun, usability, and learning in educational software. Twenty-five children aged 7 and 8 from an English primary school participated. The study involved three software products that were designed to prepare children for government initiated science tests. Pre and post tests were used to measure the learning effect, and observations and survey methods were used to assess usability and fun. The findings from the study demonstrate that in this instance learning was not correlated with fun or usability, that observed fun and observed usability were correlated, and that children of this age appeared to be able to differentiate between the constructs used to describe software quality. The Fun Sorter appears to be an effective tool for evaluating products with children. The authors discuss the implications of the results, offer some thoughts on designing experiments with children, and propose some ideas for future work.
Prevalence of Peri-implantitis in Medically Compromised Patients and Smokers: A Systematic Review.
PURPOSE To verify whether the diversity of systemic medical conditions and smoking act as biologic associated factors for peri-implantitis. MATERIALS AND METHODS The PICO question was: "In patients with osseointegrated dental implants, does the presence of smoking habits or a compromised medical status influence the occurrence of peri-implantitis compared with the presence of good general health?" Smoking and systemic conditions such as type 2 diabetes mellitus, cardiovascular diseases, rheumatoid arthritis, lung diseases, obesity, cancer, deep depression, and osteoporosis were screened. Selection criteria included at least 10 patients per condition, 1 year of follow-up after implant loading, and strict cutoff levels (probing pocket depth [PPD], bleeding on probing [BOP] and/or pus, marginal bone loss) to define peri-implantitis. RESULTS From the 1,136 records initially retrieved, 57 were selected after title and abstract analyses. However, only six papers were considered for qualitative evaluation. No randomized controlled clinical trial was found. Smoking was associated with peri-implantitis in only one out of four studies. Poorly controlled type 2 diabetes accentuated only PPD and radiographic marginal bone level prevalence rates in peri-implant patients (one study). Cardiovascular disease was considered a risk (one out of two studies). The chance of peri-implant patients harboring the Epstein-Barr virus was threefold in one report. No associations were found for rheumatoid arthritis. CONCLUSION Data from existing studies point to smoking and diabetes as biologic associated factors for peri-implantitis. However, the body of evidence is still immature, and the specific contribution of general health problems to peri-implantitis requires additional robust epidemiologic and clinical investigations.
SAC: G: 3-D Cellular Automata based PRNG
Random numbers are critical in many areas like security, simulation, gaming and even gambling. Random numbers are basically of two types: true random numbers and pseudorandom numbers. In this paper we propose a three-dimensional cellular automata (3-D CA) based pseudo-random number generator (PRNG). Cellular automata (CA) are used in pseudo-random number generators to produce high-rate random numbers. However, the randomness of such numbers directly depends upon the CA rules and the number of neighbor cells. Two-dimensional (2-D) CA have several limitations, such as difficulty in finding the best CA rules, boundary cell problems, etc. To address the problems existing in 2-D CA, we propose a random number generator based on 3-D cellular automata. The proposed generator is based on the rule numbers 43, 85, 170 and 201, and on incremental boundary conditions. The output bits are then passed to the well-known Diehard, ENT and NIST test suites to test their randomness. The results reveal that the bit stream generated by the proposed scheme passes all the tests present in the test suites.
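The specific 3-D CA rules 43, 85, 170 and 201 and the incremental boundary conditions are not reproduced here; the Python sketch below substitutes a generic XOR-of-face-neighbors rule on a periodic 3-D lattice and taps one cell per update, purely to illustrate the structure of a 3-D CA based PRNG.

```python
import numpy as np

def step(grid):
    """One 3-D CA update: each cell becomes the XOR of its six face
    neighbors (a stand-in for the paper's rules 43/85/170/201)."""
    nxt = np.zeros_like(grid)
    for axis in range(3):
        nxt ^= np.roll(grid, 1, axis) ^ np.roll(grid, -1, axis)
    return nxt

def prng_bits(seed_grid, n_bits):
    """Tap a fixed cell after each update to emit one bit."""
    grid, out = seed_grid.copy(), []
    for _ in range(n_bits):
        grid = step(grid)
        out.append(int(grid[0, 0, 0]))
    return out

rng = np.random.default_rng(1)
seed = rng.integers(0, 2, size=(8, 8, 8), dtype=np.uint8)
print(prng_bits(seed, 16))
```

A real design would, as the abstract notes, validate the emitted stream with Diehard, ENT and the NIST suite rather than by inspection.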
Raman spectroscopy investigations of TiBxCyNz coatings deposited by low pressure chemical vapor deposition
TiBxCyNz coatings have been prepared applying LPCVD and characterized using SEM/EDX, XRD, and Raman micro-spectroscopy. It has been shown that first-order, defect-induced Raman spectra of good quality can be obtained from TiBxCyNz coatings, even if buried within a multilayer stack. The Raman peak assignments fit well with previous work on TiC1−xNx. Even small changes in the B:C:N ratio result in systematic shifts of the Raman peaks. With increasing nitrogen content, the acoustical phonons shift to lower frequencies. A high correlation of the Raman shifts with lattice constants derived from XRD has also been found. Additionally, intensity and FWHM of the Raman peaks also change going from carbon- to nitrogen-rich coatings. The sensitivity of the TA peak Raman shifts to changes of the investigated basic coating properties is largest for N-rich coatings. Looking at the full range of coatings, the dependence of the Raman shifts is slightly nonlinear. The present work establishes Raman microscopy as a complementary non-destructive technique to XRD for studying coatings like TiBxCyNz. Structural, optical and chemical properties can be determined with considerably higher spatial resolution.
Breast cancer presentation and diagnostic delays in young women.
BACKGROUND Young women may experience delays in diagnosis of breast cancer, and these delays may contribute to poorer outcomes. METHODS In a prospective, multicenter cohort study, women recently diagnosed with breast cancer at age ≤40 years were surveyed regarding their initial signs or symptoms of cancer and delays in diagnosis. Self delay was defined as ≥90 days between the first sign or symptom and a patient's first visit to consult a health care provider. Care delay was defined as ≥90 days between that first visit and the diagnosis of breast cancer. In a medical record review, tumor characteristics were assessed, including disease stage. Univariate and multivariate models were used to assess for predictors of self delay, care delay, and advanced stage in the self-detected subset. RESULTS In 585 eligible participants, the first sign or symptom of cancer was a self-detected breast abnormality for 80%, a clinical breast examination abnormality for 6%, an imaging abnormality for 12%, and a systemic symptom for 1%. Among women with self-detected cancers, 17% reported a self delay, and 12% reported a care delay. Self delays were associated with poorer financial status (P = 0.01). Among young women with self-detected breast cancers, care delay was associated at trend level (P = .06) with higher stage in multivariate modeling. CONCLUSIONS Most young women detect their own breast cancers, and most do not experience long delays before diagnosis. Women with fewer financial resources are more likely to delay seeking medical attention for a self-detected breast abnormality.
Cross versus Within-Company Cost Estimation Studies: A Systematic Review
The objective of this paper is to determine under what circumstances individual organizations would be able to rely on cross-company-based estimation models. We performed a systematic review of studies that compared predictions from cross-company models with predictions from within-company models based on analysis of project data. Ten papers compared cross-company and within-company estimation models; however, only seven presented independent results. Of those seven, three found that cross-company models were not significantly different from within-company models, and four found that cross-company models were significantly worse than within-company models. Experimental procedures used by the studies differed, making it impossible to undertake formal meta-analysis of the results. The main trend distinguishing study results was that studies with small within-company data sets (i.e., ≤20 projects) that used leave-one-out cross validation all found that the within-company model was significantly different (better) from the cross-company model. The results of this review are inconclusive. It is clear that some organizations would be ill-served by cross-company models whereas others would benefit. Further studies are needed, but they must be independent (i.e., based on different data bases or at least different single company data sets) and should address specific hypotheses concerning the conditions that would favor cross-company or within-company models. In addition, experimenters need to standardize their experimental procedures to enable formal meta-analysis, and recommendations are made in Section 3.
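To make the leave-one-out protocol concrete, here is a Python sketch of within-company LOOCV for a toy effort-estimation model; the data, features, and linear model are made up.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

# Made-up within-company data: (size in function points, team size) -> effort.
X = np.array([[120, 4], [300, 6], [80, 3], [450, 9], [200, 5],
              [150, 4], [500, 10], [90, 3], [260, 6], [330, 7]], float)
y = np.array([30, 80, 20, 130, 55, 40, 150, 25, 70, 95], float)

# Leave-one-out: each project is predicted by a model fit on the other nine.
abs_err = []
for train, test in LeaveOneOut().split(X):
    model = LinearRegression().fit(X[train], y[train])
    abs_err.append(abs(model.predict(X[test])[0] - y[test][0]))
print("within-company mean absolute error:", round(np.mean(abs_err), 1))

# A cross-company comparison would instead fit once on another
# organization's projects and evaluate on all of this company's projects.
```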
SSD as SQLite Engine
As a proof-of-concept for the vision "SSD as SQL Engine" (SaS in short), we demonstrate that SQLite [4], a popular mobile database engine, in its entirety can run inside a real SSD development platform. By turning the storage device into a database engine, SaS allows applications to directly interact with a full SQL database server running inside the storage device. In SaS, the SQL language itself, not the traditional dummy block interface, is provided as the new interface between applications and the storage device. In addition, since SaS plays the role of the unified platform of database computing node and storage node, the host and the storage need not be segregated any more as separate physical computing components.
A robust self-tuning scheme for PI- and PD-type fuzzy controllers
We propose a simple but robust model-independent self-tuning scheme for fuzzy logic controllers (FLCs). Here, the output scaling factor (SF) is adjusted on-line by fuzzy rules according to the current trend of the controlled process. The rule base for tuning the output SF is defined on error (e) and change of error (Δe) of the controlled variable using the most natural and unbiased membership functions (MFs). The proposed self-tuning technique is applied to both PI- and PD-type FLCs to conduct simulation analysis for a wide range of different linear and nonlinear second-order processes, including a marginally stable system where even the well-known Ziegler-Nichols tuned conventional PI or PID controllers fail to provide an acceptable performance due to excessively large overshoot. Performances of the proposed self-tuning FLCs are compared with those of their corresponding conventional FLCs in terms of several performance measures, such as peak overshoot, settling time, rise time, integral absolute error (IAE) and integral-of-time-multiplied absolute error (ITAE), in addition to the responses due to step set-point change and load disturbance; in each case, the proposed scheme shows a remarkably improved performance over its conventional counterpart.
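A Python sketch of the outer tuning loop: an incremental PI-type controller whose output SF is scaled by a gain factor computed from e and Δe. The gain surface below is a crude linear stand-in for the paper's fuzzy rule base, and the toy process is assumed.

```python
def gain_factor(e, de):
    """Crude stand-in for the fuzzy rule base that tunes the output SF:
    push the gain up when error is large or growing, down near setpoint."""
    drive = abs(e) + 0.5 * abs(de)
    return min(1.0, 0.1 + drive)          # gain factor in [0.1, 1]

def fuzzy_pi_step(e, de, Gu):
    """Incremental PI-type controller: du = alpha * Gu * f(e, de),
    where f() is a linearized, saturated FLC output in normalized units."""
    f = max(-1.0, min(1.0, 0.6 * e + 0.4 * de))
    return gain_factor(e, de) * Gu * f

# Toy first-order process under the self-tuning FLC.
y, u, sp, Gu = 0.0, 0.0, 1.0, 0.5
e_prev = sp - y
for _ in range(30):
    e = sp - y
    u += fuzzy_pi_step(e, e - e_prev, Gu)
    e_prev = e
    y += 0.1 * (u - y)                    # process: dy/dt = u - y
print(round(y, 3))
```

The point of the scheme is that the effective output gain shrinks automatically near the setpoint, trading aggressiveness for damping without retuning the base controller.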
Weather Forecasting using Incremental K-means Clustering
Clustering is a powerful tool which has been used in several forecasting works, such as time series forecasting, real time storm detection, flood forecasting and so on. In this paper, a generic methodology for weather forecasting is proposed with the help of an incremental K-means clustering algorithm. Weather forecasting plays an important role in day-to-day applications. The forecasting in this paper is based on the incremental air pollution database of West Bengal for the years 2009 and 2010. Typical K-means clustering is first applied to the main air pollution database, and a list of weather categories is developed based on the maximum mean values of the clusters. Then, as new data arrive, incremental K-means is used to group those data into the clusters whose weather categories have already been defined. This builds up a strategy for predicting the weather of the upcoming days from the incoming data. The forecasting database is based entirely on the weather of West Bengal, and the methodology is developed to mitigate the impacts of air pollution and to launch focused modeling computations for prediction and forecasts of weather events. The accuracy of this approach is also measured.
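A sketch of the incremental step in Python: fit K-means once on the base air-pollution data, attach assumed weather categories to the clusters, then fold each new observation into the nearest centroid with a running-mean update.

```python
import numpy as np
from sklearn.cluster import KMeans

X = np.random.rand(200, 3)                 # placeholder pollutant readings
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
counts = np.bincount(km.labels_, minlength=3).astype(float)
labels = {0: "good", 1: "moderate", 2: "poor"}   # assumed weather categories

def incremental_assign(x):
    """Assign a new day's readings to the nearest centroid and update it
    with a running mean, so the model tracks the growing database."""
    d = np.linalg.norm(km.cluster_centers_ - x, axis=1)
    c = int(d.argmin())
    counts[c] += 1
    km.cluster_centers_[c] += (x - km.cluster_centers_[c]) / counts[c]
    return labels[c]

print(incremental_assign(np.array([0.2, 0.1, 0.3])))
```

The running-mean update avoids re-clustering the entire database each time new readings arrive, which is the appeal of the incremental variant.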
Late antibody-mediated rejection after heart transplantation: Mortality, graft function, and fulminant cardiac allograft vasculopathy.
BACKGROUND Late antibody-mediated rejection (AMR) after heart transplantation is suspected to be associated with a poor short-term prognosis. METHODS A retrospective single-center observational study was performed. Late AMR was defined as AMR occurring at least 1 year after heart transplantation. The study included all consecutive patients with proven and treated late acute AMR at the authors' institution between November 2006 and February 2013. The aim was to analyze the prognosis after late AMR, including mortality, recurrence of AMR, left ventricular ejection fraction, and cardiac allograft vasculopathy (CAV). Selected endomyocardial biopsy specimens obtained before AMR were also blindly reviewed to identify early histologic signs of AMR. RESULTS The study included 20 patients treated for late AMR. Despite aggressive immunosuppressive therapies (100% of patients received intravenous methylprednisolone, 90% received intravenous immunoglobulin [IVIg], 85% received plasmapheresis, 45% received rituximab), the prognosis remained poor. Survival after late AMR was 80% at 1 month, 60% at 3 months, and 50% at 1 year. All early deaths (<3 months, n = 8) were directly attributable to graft dysfunction or to complication of the intense immunosuppressive regimen. Among survivors at 3 months (n = 12), histologic persistence or recurrence of AMR, persistent left ventricular dysfunction, and fulminant CAV were common (33%, 33%, and 17% of patients). Microvascular inflammation was detected in at least 1 biopsy specimen obtained before AMR in 13 patients (65%). CONCLUSIONS Prognosis after late AMR is poor despite aggressive immunosuppressive therapies. Fulminant CAV is a common condition in these patients. Microvascular inflammation is frequent in endomyocardial biopsy specimens before manifestation of symptomatic AMR.
Virtual interference study for FMCW and PMCW radar
Mutual interference of radar systems has been identified as one of the major challenges for future automotive radar systems. In this work the interference of frequency (FMCW) and phase modulated continuous wave (PMCW) systems is investigated by means of simulations. All twofold combinations of the aforementioned systems are considered. The interference scenario follows a typical use-case from the well-known MOre Safety for All by Radar Interference Mitigation (MOSARIM) study. The investigated radar systems operate with similar system parameters to guarantee a certain comparability, but with different waveform durations, and chirps with different slopes and different phase code sequences, respectively. Since the effects in perfect synchrony are well understood, we focus on the cases where both systems exhibit a certain asynchrony. It is shown that the energy received from interferers can cluster in certain Doppler bins in the range-Doppler plane when systems exhibit a slight asynchrony.
Stretching exercises vs manual therapy in treatment of chronic neck pain: a randomized, controlled cross-over trial.
OBJECTIVE To compare the effects of manual therapy and stretching exercise on neck pain and disability. DESIGN An examiner-blinded randomized cross-over trial. PATIENTS A total of 125 women with non-specific neck pain. METHODS PATIENTS were randomized into 2 groups. Group 1 received manual therapy twice weekly and Group 2 performed stretching exercises 5 times a week. After 4 weeks the treatments were changed. The follow-up times were after 4 and 12 weeks. Neck pain (visual analogue scale) and disability indices were measured. RESULTS Mean value (standard deviation) for neck pain was 50 mm (22) and 49 mm (19) at baseline in Group 1 and Group 2, respectively, and decreased during the first 4 weeks by 26 mm (95% Confidence Interval 20-33) and 19 mm (12-27), respectively. There was no significant difference between groups. Neck and shoulder pain and disability index decreased significantly more in Group 1 after manual therapy (p=0.01) as well as neck stiffness (p=0.01). CONCLUSION Both stretching exercise and manual therapy considerably decreased neck pain and disability in women with non-specific neck pain. The difference in effectiveness between the 2 treatments was minor. Low-cost stretching exercises can be recommended in the first instance as an appropriate therapy intervention to relieve pain, at least in the short-term.
Platelet adenylyl cyclase signaling remains unaltered in children undergoing hemodialysis treatment
Patients with chronic renal failure exhibit multiple endocrine, gastrointestinal and cardiovascular abnormalities, many of which may be explained by alterations of adenylyl cyclase (AC) responsiveness and/or G-protein expression. Since such alterations were previously reported, e.g., for platelets of adult chronic renal failure patients undergoing hemodialysis treatment (HD), we have investigated whether children with chronic renal failure undergoing HD exhibit similar alterations. Eleven uremic children undergoing HD were compared with 11 age-matched healthy controls. Platelet AC activity was determined in the absence (basal) and presence of a receptor agonist, direct G-protein activators and direct AC stimulators. G-protein α-subunits were measured by quantitative immunoblotting. Basal and stimulated platelet AC and immunoreactivity for platelet G-protein α-subunits did not significantly differ between HD and control children. We conclude that HD in children is associated with much smaller, if any, abnormalities of blood cell signal transduction than in adult patients. We speculate that quality of dialysis, age, and underlying disease might differentially influence blood cell signal transduction cascades.
CS principles goes to middle school: learning how to teach "Big Data"
Spurred by evidence that students' future studies are highly influenced during middle school, recent efforts have seen a growing emphasis on introducing computer science to middle school learners. This paper reports on the in-progress development of a new middle school curricular module for Big Data, situated as part of a new CS Principles-based middle school curriculum. Big Data is of widespread societal importance and holds increasing implications for the computer science workforce. It also has appeal as a focus for middle school computer science because of its rich interplay with other important computer science principles. This paper examines three key aspects of a Big Data unit for middle school: its alignment with emerging curricular standards; the perspectives of middle school classroom teachers in mathematics, science, and language arts; and student feedback as explored during a middle school pilot study with a small subset of the planned curriculum. The results indicate that a Big Data unit holds great promise as part of a middle school computer science curriculum.
Clinically Significant Drug Interactions with Antacids
One may consider that drug-drug interactions (DDIs) associated with antacids are an obsolete topic because antacids are prescribed less frequently by medical professionals due to the advent of drugs that more effectively suppress gastric acidity (i.e. histamine H2-receptor antagonists [H2RAs] and proton pump inhibitors [PPIs]). Nevertheless, the use of antacids by ambulant patients may be ever increasing, because they are freely available as over-the-counter (OTC) drugs. Antacids consisting of weak basic substances coupled with polyvalent cations may alter the rate and/or the extent of absorption of concomitantly administered drugs via different mechanisms. Polyvalent cations in antacid formulations may form insoluble chelate complexes with drugs and substantially reduce their bioavailability. Clinical studies demonstrated that two classes of antibacterials (tetracyclines and fluoroquinolones) are susceptible to clinically relevant DDIs with antacids through this mechanism. Countermeasures against this type of DDI include spacing out the dosing interval: taking the antacid either 4 hours before or 2 hours after administration of these antibacterials. Bisphosphonates may be susceptible to DDIs with antacids by the same mechanism, as described in the prescription information of most bisphosphonates, but no quantitative data about the DDIs are available. For drugs with solubility critically dependent on pH, neutralization of gastric fluid by antacids may alter the dissolution of these drugs and the rate and/or extent of their absorption. However, the magnitude of DDIs elicited by antacids through this mechanism is less than that produced by H2RAs or PPIs; therefore, the clinical relevance of such DDIs is often obscure. Magnesium ions contained in some antacid formulas may increase gastric emptying, thereby accelerating the rate of absorption of some drugs. However, the clinical relevance of this is unclear in most cases because the difference in plasma drug concentration observed after dosing shortly disappears. Recent reports have indicated that some of the molecular-targeting agents such as the tyrosine kinase inhibitors dasatinib and imatinib, and the thrombopoietin receptor agonist eltrombopag may be susceptible to DDIs with antacids. Finally, the recent trend of developing OTC drugs as combination formulations of an antacid and an H2RA is a concern because these drugs will increase the risk of DDIs by dual mechanisms, i.e. a gastric pH-dependent mechanism by H2RAs and a cation-mediated chelation mechanism by antacids.
Fluid management of the neurological patient: a concise review.
Maintenance fluids in critically ill brain-injured patients are part of routine critical care. Both the amounts of fluid volumes infused and the type and tonicity of maintenance fluids are relevant in understanding the impact of fluids on the pathophysiology of secondary brain injuries in these patients. In this narrative review, current evidence on routine fluid management of critically ill brain-injured patients and use of haemodynamic monitoring is summarized. Pertinent guidelines and consensus statements on fluid management for brain-injured patients are highlighted. In general, existing guidelines indicate that fluid management in these neurocritical care patients should be targeted at euvolemia using isotonic fluids. A critical appraisal is made of the available literature regarding the appropriate amount of fluids, haemodynamic monitoring and which types of fluids should be administered or avoided and a practical approach to fluid management is elaborated. Although hypovolemia is bound to contribute to secondary brain injury, some more recent data have emerged indicating the potential risks of fluid overload. However, it is acknowledged that many factors govern the relationship between fluid management and cerebral blood flow and oxygenation and more research seems warranted to optimise fluid management and improve outcomes.
The Role of Trust in Customer Online Shopping Behavior: Perspective of the Technology Acceptance Model
Online shopping, different from traditional shopping behavior, is characterized by uncertainty, anonymity, lack of control and potential opportunism. Therefore, trust is an important factor to facilitate online transactions. The purpose of this study is to explore the role of trust in consumer online purchase behavior. This study undertook a comprehensive survey of online customers with e-shopping experience in Taiwan, and we received 1258 valid questionnaires. The empirical results, using structural equation modeling, indicated that perceived ease of use and perceived usefulness have a significant impact on trust in e-commerce. Trust also has a significant influence on attitude towards online purchase. However, there is no significant impact from trust on the intention of online purchase.
Online handwriting recognition for Tamil
A system for online recognition of handwritten Tamil characters is presented. A handwritten character is constructed by executing a sequence of strokes. A structure- or shape-based representation of a stroke is used in which a stroke is represented as a string of shape features. Using this string representation, an unknown stroke is identified by comparing it with a database of strokes using a flexible string matching procedure. A full character is recognized by identifying all the component strokes. Character termination is determined using a finite state automaton. Development of similar systems for other Indian scripts is outlined.
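The flexible string matching can be read as an edit distance that charges less for substituting similar shape symbols; the Python sketch below assumes a toy feature alphabet, cost values, and database entries.

```python
# Assumed shape-feature alphabet: L=line, C=clockwise curve, A=anticlockwise curve.
SIMILAR = {("C", "A"), ("A", "C")}          # near-miss substitutions cost less

def stroke_distance(s, t):
    """Edit distance between two shape-feature strings, with a reduced
    substitution cost for similar shapes (the 'flexible' part)."""
    m, n = len(s), len(t)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s[i - 1] == t[j - 1]:
                sub = 0.0
            elif (s[i - 1], t[j - 1]) in SIMILAR:
                sub = 0.5
            else:
                sub = 1.0
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + sub)
    return d[m][n]

database = {"stroke_ka_1": "LCCA", "stroke_ka_2": "LAC"}   # hypothetical entries
unknown = "LCA"
print(min(database, key=lambda k: stroke_distance(unknown, database[k])))
```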
Business Intelligence in the Cloud
Business Intelligence (BI) deals with integrated approaches to management support. Currently, several constraints hold back BI adoption and motivate a new era of analytic data management for business intelligence: the integrated infrastructures on which BI rests have become complex, costly, and inflexible; considerable effort is required to consolidate and cleanse enterprise data; and existing IT infrastructure is often inadequate or suffers performance impacts. In this paper, cloud computing is examined as a possible remedy for these issues. We present a new environment for business intelligence that offers the ability to shorten BI implementation windows, reduced cost for BI programs compared with traditional on-premise BI software, the ability to add environments for testing, proof-of-concept and upgrades, and the potential for faster deployments and increased flexibility for users. Moreover, cloud computing enables organizations to analyze terabytes of data faster and more economically than ever before. Business intelligence in the cloud can be like a big puzzle: users can jump in and put together small pieces of the puzzle, but until the whole thing is complete they will lack an overall view of the big picture. In this paper, reading each section fills in a piece of the puzzle.
Neural Circuits Involved in the Recognition of Actions Performed by Nonconspecifics: An fMRI Study
Functional magnetic resonance imaging was used to assess the cortical areas active during the observation of mouth actions performed by humans and by individuals belonging to other species (monkey and dog). Two types of actions were presented: biting and oral communicative actions (speech reading, lip-smacking, barking). As a control, static images of the same actions were shown. Observation of biting, regardless of the species of the individual performing the action, determined two activation foci (one rostral and one caudal) in the inferior parietal lobule and an activation of the pars opercularis of the inferior frontal gyrus and the adjacent ventral premotor cortex. The left rostral parietal focus (possibly BA 40) and the left premotor focus were very similar in all three conditions, while the right side foci were stronger during the observation of actions made by conspecifics. The observation of speech reading activated the left pars opercularis of the inferior frontal gyrus, the observation of lip-smacking activated a small focus in the pars opercularis bilaterally, and the observation of barking did not produce any activation in the frontal lobe. Observation of all types of mouth actions induced activation of extrastriate occipital areas. These results suggest that actions made by other individuals may be recognized through different mechanisms. Actions belonging to the motor repertoire of the observer (e.g., biting and speech reading) are mapped on the observer's motor system. Actions that do not belong to this repertoire (e.g., barking) are essentially recognized based on their visual properties. We propose that when the motor representation of the observed action is activated, the observer gains knowledge of the observed action in a personal perspective, while this perspective is lacking when there is no motor activation.
Complete set of electromagnetic corrections to strongly interacting systems
We show how to obtain a complete set of electromagnetic corrections to a given nonperturbative model of strong interactions based on integral equations. The gauge invariance of these corrections is a consequence of their completeness.
Business process outsourcing studies: a critical review and research directions
Organizations are increasingly sourcing their business processes through external service providers, a practice known as Business Process Outsourcing (BPO). Worldwide, the current BPO market could be as much as $279 billion and is predicted to continue growing at 25% annually. Academic researchers have been studying this market for about 15 years and have produced findings relevant to practice. The entire body of BPO research has never been reviewed, and this paper fills that gap. We filtered the total studies and reviewed 87 empirically robust BPO articles published between 1996 and 2011 in 67 journals to answer three research questions: What has the empirical academic literature found about BPO decisions and outcomes? How do BPO findings compare with Information Technology Outsourcing (ITO) empirical research? What are the gaps in knowledge to consider in future BPO research? Employing a proven method that Lacity et al. (2010) used to review the empirical ITO literature, we encapsulated this empirical literature on BPO in a way that is concise, meaningful, and helpful to researchers. We coded 43 dependent variables, 152 independent variables, and 615 relationships between independent and dependent variables. By extracting the best evidence, we developed two models of BPO: one model addresses BPO decisions and one model addresses BPO outcomes. The model of BPO decisions includes independent variables associated with motives to outsource, transaction attributes, and client firm characteristics. The model of BPO outcomes includes independent variables associated with contractual and relational governance, country characteristics, and client and supplier capabilities. Overall, BPO researchers have a broad and deep understanding of BPO. However, the field continues to evolve as clients and suppliers on every inhabited continent participate actively in the global sourcing community. There is still much research yet to be done. We propose nine future paths of research pertaining to innovation effects, retained capabilities, environmental influences, global destinations, supplier capabilities, pricing models, business analytics, emerging models, and grounded theory development.
Emotional intelligence: A meta-analytic investigation of predictive validity and nomological net
This study used meta-analytic techniques to examine the relationship between emotional intelligence (EI) and performance outcomes. A total of 69 independent studies were located that reported correlations between EI and performance or other variables such as general mental ability (GMA) and the Big Five factors of personality. Results indicated that, across criteria, EI had an operational validity of .23 (k = 59, N = 9522). Various moderating influences such as the EI measure used, dimensions of EI, scoring method and criterion were evaluated. EI correlated .22 with general mental ability (k = 19, N = 4158) and .23 (Agreeableness and Openness to Experience; k = 14, N = 3306) to .34 (Extraversion; k = 19, N = 3718) with the Big Five factors of personality. Results of various subgroup analyses are presented and implications and future directions are provided.
Inferior quantitative and qualitative immune responses to pneumococcal conjugate vaccine in infants with nasopharyngeal colonization by Streptococcus pneumoniae during the primary series of immunization.
BACKGROUND Heightened immunogenicity, measured one month after the primary series of pneumococcal conjugate vaccine (PCV), in African children was previously hypothesized to be due to increased rates of nasopharyngeal pneumococcal colonization during early infancy. METHODS We analyzed the effect of selected vaccine-serotype (6B, 19F and 23F) nasopharyngeal colonization prior to the first PCV dose or when colonized for the first time prior to the second or third (2nd/3rd) PCV dose on serotype quantitative and qualitative antibody responses. RESULTS Colonization prior to receiving the first PCV was associated with lower geometric mean antibody concentrations (GMCs) one month after the third dose of PCV and six months later to the colonizing-serotype. Colonized infants also had lower geometric mean titers (GMTs) on opsonophagocytosis activity assay (OPA) and a lower proportion had titers ≥ 8 against the colonizing serotypes (19F and 23F) post vaccination. Colonization occurring only prior to the 2nd/3rd PCV dose was also associated with lower GMCs and OPA GMTs to the colonizing-serotype. The effect of colonization with serotypes 19F and 23F prior to PCV vaccination had a greater effect on a lower proportion of colonized infants having OPA titers ≥ 8 than the effect of colonization on the lower proportion with antibody ≥ 0.35 μg/ml. CONCLUSION Infant nasopharyngeal colonization at any stage before completing the primary series of PCV vaccination was associated with inferior quantitative and qualitative antibody responses to the colonizing-serotype.
Accuracy trumps accent in children's endorsement of object labels.
Past research provides evidence that children use at least 2 potentially competing strategies when choosing informants: they attend to informants' past accuracy and to their social identity (e.g., their status as native- vs. foreign-accented speakers). We explore how children reconcile these 2 strategies when they are put in conflict and whether children's response changes across development. In Experiment 1 (N = 61), 3-, 4-, and 5-year-old children watched a native- and a foreign-accented English speaker label novel objects with novel names. All 3 age groups preferred the names provided by the native speaker. Next, 1 of the 2 speakers named familiar objects accurately, whereas the other speaker named them inaccurately. In a subsequent series of test trials, again with novel objects, 4- and 5-year-olds, but not 3-year-olds, were likely to endorse the names provided by the accurate speaker, regardless of her accent. In Experiment 2 (N = 72) 4-year-olds first watched a native- and a foreign-accented speaker name familiar objects, but the relative accuracy of the 2 speakers varied across conditions (100% vs. 0% correct; 75% vs. 25% correct). Subsequently, the 2 speakers provided novel names for novel objects. In each condition, 4-year-olds endorsed the names provided by the more accurate speaker, regardless of her accent. We propose that during the preschool years, children increasingly rely on past reliability when selecting informants.
A Hybrid Data Association Framework for Robust Online Multi-Object Tracking
Global optimization algorithms have shown impressive performance in data-association-based multi-object tracking, but handling online data remains a difficult hurdle to overcome. In this paper, we present a hybrid data association framework with a min-cost multi-commodity network flow for robust online multi-object tracking. We build local target-specific models interleaved with global optimization of the optimal data association over multiple video frames. More specifically, in the min-cost multi-commodity network flow, the target-specific similarities are online learned to enforce the local consistency for reducing the complexity of the global data association. Meanwhile, the global data association taking multiple video frames into account alleviates irrecoverable errors caused by the local data association between adjacent frames. To ensure the efficiency of online tracking, we give an efficient near-optimal solution to the proposed min-cost multi-commodity flow problem, and provide the empirical proof of its sub-optimality. The comprehensive experiments on real data demonstrate the superior tracking performance of our approach in various challenging situations.
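NetworkX has no multi-commodity solver, so the sketch below reduces the idea to a single-commodity min-cost flow over two frames: detections become nodes, association costs become arc weights, and the optimal flow is the track assignment. All costs are made up.

```python
import networkx as nx

# Two tracks flowing through detections over two frames; integer costs are
# made-up negative-log association scores (lower = more similar).
G = nx.DiGraph()
G.add_node("S", demand=-2)                  # two units of flow = two targets
G.add_node("T", demand=2)
for det, cost in [("a1", 1), ("a2", 2)]:    # frame-1 detections
    G.add_edge("S", det, capacity=1, weight=cost)
links = [("a1", "b1", 1), ("a1", "b2", 4), ("a2", "b1", 5), ("a2", "b2", 1)]
for u, v, cost in links:                    # frame-to-frame similarities
    G.add_edge(u, v, capacity=1, weight=cost)
for det in ("b1", "b2"):                    # frame-2 detections
    G.add_edge(det, "T", capacity=1, weight=0)

flow = nx.min_cost_flow(G)
print([(u, v) for u in flow for v, f in flow[u].items() if f > 0])
```

In the paper's framework the arc weights would come from the online-learned target-specific similarities, and the optimization would span several frames per commodity rather than the two-frame toy shown here.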
Robust design of electrostatic separation processes
The aim of this paper is to analyze the robustness of the electrostatic separation process control. The objective was to reduce variation in the process outcome by finding operating conditions (high-voltage level, roll speed), under which uncontrollable variation in the noise factors (granule size, composition of the material to be separated) has minimal impact on the quantity (and the quality) of the recovered products. The experiments were carried out on a laboratory roll-type electrostatic separator, provided with a corona electrode and a tubular electrode, both connected to a dc high-voltage supply. The samples of processed material were prepared from genuine chopped electric wire wastes (granule size >1 mm and <5 mm) containing various proportions of copper and PVC. The design and noise factors were combined into one single experimental design, based on Taguchi's approach, and a regression model of the process was fitted. The impact of the noise factors could be estimated, as well as the interactions between the design and noise factors. The conditions of industry application of Taguchi's methodology are discussed, as well as the possibility of adapting it to other electrostatic processes.
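As a worked illustration of the Taguchi analysis underlying such a design: for a larger-the-better response such as recovered product mass, the signal-to-noise ratio is S/N = -10 log10(mean(1/y^2)), computed per design-factor setting across the noise-factor runs. The numbers below are invented.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi S/N ratio (dB) for a larger-the-better response."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical copper recovery (g) for two high-voltage settings, each run
# across four noise conditions (granule size x material composition).
runs = {"U = 25 kV": [81, 85, 78, 84], "U = 30 kV": [90, 70, 95, 68]}
for setting, y in runs.items():
    print(setting, round(sn_larger_is_better(y), 2), "dB")
# The robust choice maximizes S/N, not just the mean response: the 25 kV
# setting wins here because its output varies less across noise conditions.
```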
Pathogen recognition by the innate immune system.
Microbial infection initiates complex interactions between the pathogen and the host. Pathogens express several signature molecules, known as pathogen-associated molecular patterns (PAMPs), which are essential for survival and pathogenicity. PAMPs are sensed by evolutionarily conserved, germline-encoded host sensors known as pathogen recognition receptors (PRRs). Recognition of PAMPs by PRRs rapidly triggers an array of anti-microbial immune responses through the induction of various inflammatory cytokines, chemokines and type I interferons. These responses also initiate the development of pathogen-specific, long-lasting adaptive immunity through B and T lymphocytes. Several families of PRRs, including Toll-like receptors (TLRs), RIG-I-like receptors (RLRs), NOD-like receptors (NLRs), and DNA receptors (cytosolic sensors for DNA), are known to play a crucial role in host defense. In this review, we comprehensively review the recent progress in the field of PAMP recognition by PRRs and the signaling pathways activated by PRRs.
Platelets enhance tissue factor protein and metastasis initiating cell markers, and act as chemoattractants increasing the migration of ovarian cancer cells
An increase in circulating platelets, or thrombocytosis, is recognized as an independent risk factor for poor prognosis and metastasis in patients with ovarian cancer; however, the complex role of platelets in tumor progression has not been fully elucidated. Platelet activation has been associated with an epithelial-to-mesenchymal transition (EMT), while Tissue Factor (TF) protein expression by cancer cells has been shown to correlate with a hypercoagulable state and metastasis. The aim of this work was to determine the effect of platelet-cancer cell interaction on TF and “Metastasis Initiating Cell (MIC)” marker levels and on migration in ovarian cancer cell lines and in cancer cells isolated from the ascitic fluid of ovarian cancer patients. With informed patient consent, ascitic-fluid-isolated ovarian cancer cells, cell lines and ovarian cancer spheres were co-cultivated with human platelets. TF, EMT and stem cell marker levels were determined by Western blotting, flow cytometry and RT-PCR. Cancer cell migration was determined by Boyden chambers and the scratch assay. The co-culture of patient-derived ovarian cancer cells with platelets causes: 1) a phenotypic change in cancer cells, 2) chemoattraction and cancer cell migration, 3) induction of MIC markers (EMT/stemness), 4) increased sphere formation and 5) increased TF protein levels and activity. We present the first evidence that platelets act as chemoattractants to cancer cells. Furthermore, platelets promote the formation of ovarian cancer spheres that express MIC markers and the metastatic protein TF. Our results suggest that platelet-cancer cell interaction plays a role in the formation of metastatic foci.
MV-YOLO: Motion Vector-Aided Tracking by Semantic Object Detection
Object tracking is the cornerstone of many visual analytics systems. While considerable progress has been made in this area in recent years, robust, efficient, and accurate tracking in real-world video remains a challenge. In this paper, we present a hybrid tracker that leverages motion information from the compressed video stream and a general-purpose semantic object detector acting on decoded frames to construct a fast and efficient tracking engine. The proposed approach is compared with several well-known recent trackers on the OTB tracking dataset. The results indicate advantages of the proposed method in terms of speed and/or accuracy. Other desirable features of the proposed method are its simplicity and deployment efficiency, which stem from the fact that it reuses resources and information that may already exist in the system for other reasons.
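A minimal sketch of the motion-vector-aided idea, under assumed interfaces: motion vectors inside the previous box propagate it to an approximate region of interest, a detector is run on that region, and the detection with the highest IoU against the propagated box is kept. The `detect` stub and all box formats here are our own assumptions, not the paper's implementation.

```python
# Sketch: propagate last box with motion vectors, then refine by detection.
# `detect` is a hypothetical stand-in for a semantic object detector.
import numpy as np

def propagate_box(box, motion_vectors):
    """Shift box (x1, y1, x2, y2) by the median motion vector inside it."""
    x1, y1, x2, y2 = box
    inside = [(dx, dy) for (mx, my, dx, dy) in motion_vectors
              if x1 <= mx <= x2 and y1 <= my <= y2]
    if not inside:
        return box
    dx, dy = np.median(np.array(inside), axis=0)
    return (x1 + dx, y1 + dy, x2 + dx, y2 + dy)

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def detect(frame, roi):
    """Hypothetical detector stub returning candidate boxes near the ROI."""
    return [roi]  # a real system would run a CNN detector on the crop

def track_step(frame, prev_box, motion_vectors):
    approx = propagate_box(prev_box, motion_vectors)
    candidates = detect(frame, approx)
    return max(candidates, key=lambda b: iou(b, approx))
```

The appeal of this structure, as the abstract notes, is that the motion vectors come free from the compressed bitstream, so the expensive detector only has to verify a small region per frame.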
Automatic Question Generation from Queries
With the increasing popularity of community-based question answering services such as Yahoo! Answers, huge amounts of user-generated questions are available. In this paper, we propose using these data along with search engine query logs to create a question generation shared task that aims to automatically generate questions given a query. At least two benefits could result from such a task. First, it can be used to rank candidate questions, which would enable search of question-answer archives. Second, it could serve as a more informative and richer form of query expansion to guide or aid search.
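As a simple baseline for the candidate-ranking use mentioned above (not the authors' method), the sketch below ranks archived questions against a query by TF-IDF cosine similarity with scikit-learn; the toy query and question archive are invented.

```python
# Baseline sketch: rank candidate questions for a query by TF-IDF cosine.
# The query and question archive below are toy examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

query = "battery life iphone"
questions = [
    "How can I extend the battery life of my iPhone?",
    "What is the best pizza in New York?",
    "Why does my phone battery drain so fast?",
]

vec = TfidfVectorizer()
M = vec.fit_transform(questions + [query])   # last row is the query
scores = cosine_similarity(M[-1], M[:-1]).ravel()

for q, s in sorted(zip(questions, scores), key=lambda t: -t[1]):
    print(f"{s:.3f}  {q}")
```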
A new approach to identify power transformer criticality and asset management decision based on dissolved gas-in-oil analysis
Dissolved gas analysis (DGA) of transformer oil is one of the most effective power transformer condition monitoring tools. There are many interpretation techniques for DGA results. However, all of these techniques rely on personnel experience more than on standard mathematical formulation, and a significant number of DGA results fall outside the proposed codes of the current methods and cannot be diagnosed by them. To overcome these limitations, this paper introduces a novel approach using Gene Expression Programming (GEP) to help standardize DGA interpretation techniques, identify transformer criticality ranking based on DGA results and propose a proper maintenance action. DGA was performed on 338 oil samples collected from transformers of different ratings and life spans. Traditional DGA interpretation techniques are used to analyze the results and measure their consistency. These data are then used to develop the new GEP model. Results show that the current traditional techniques do not necessarily lead to the same conclusion for the same oil sample. The new GEP-based approach is easy to implement, and it does not require expert personnel to interpret the DGA results or to provide a proper asset management decision on the transformer based on DGA analysis.
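For context on the ratio-code style of interpretation the paper sets out to standardize, here is a heavily simplified screening sketch in the spirit of the IEC 60599 three-ratio method. The thresholds and fault labels are abbreviated from memory and should be checked against the standard; this is not the paper's GEP model.

```python
# Simplified three-ratio DGA screening in the spirit of IEC 60599.
# Thresholds/labels are abbreviated for illustration; consult the standard.
def dga_screen(h2, ch4, c2h2, c2h4, c2h6):
    r1 = c2h2 / c2h4 if c2h4 else 0.0   # C2H2/C2H4
    r2 = ch4 / h2 if h2 else 0.0        # CH4/H2
    r3 = c2h4 / c2h6 if c2h6 else 0.0   # C2H4/C2H6
    if r1 > 1.0:
        return "arcing / high-energy discharge suspected"
    if r2 < 0.1:
        return "partial discharge suspected"
    if r3 > 4.0:
        return "high-temperature thermal fault suspected"
    if r3 > 1.0:
        return "thermal fault suspected"
    return "no dominant fault code (may need expert review)"

# Example (ppm values are invented):
print(dga_screen(h2=120, ch4=60, c2h2=2, c2h4=40, c2h6=30))
```

The "no dominant fault code" branch is exactly the gap the abstract points to: samples falling outside the published codes are where a learned model such as GEP can add value.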
Kant's critique of right
This article has two objectives: first, to bring to the fore Kant’s neglected distinction between ‘critique’ and ‘doctrine’ and, second, to relate this distinction to Kant’s notion of a philosophy of right. Kant’s culminating contribution to practical philosophy, the Metaphysics of Morals, contains a doctrine of right, and this ‘doctrine’ has received relatively little attention thus far in English-language writing on Kant. One of the reasons for this relative neglect is, I believe, the prevalent attention given to Kant’s practical critique at the expense of his practical doctrine. I aim to provide an account of Kant’s critique of right in order to give an understanding of Kant’s doctrine of right some initial orientation. I will suggest that this critique of right is presented in Toward Perpetual Peace: A Philosophical Sketch.
Peroxidase activity, isoenzymes and histological localisation in sapwood and heartwood of Scots pine (Pinus sylvestris L.)
Peroxidase activity and isoenzymes of fresh wood samples from the third shoot of 12-year-old trees and from the sapwood, transition zone and heartwood of c. 60-year-old stems of Scots pine (Pinus sylvestris L.) were investigated. Wood samples were ground at −30°C and extracted, and the extracts were concentrated c. 20-fold for peroxidase activity assays (guaiacol method) and for IEF-PAGE. At least 11 major isoenzymes could be found in the gels. Even the heartwood contained some peroxidase isoenzymes. Isoenzyme patterns of the juvenile wood did not change with the season. However, juvenile wood showed the highest peroxidase activity at the end of the growing season. Peroxidase activity decreased from the outer sapwood towards the heartwood. Thin sections of different wood zones stained for peroxidase revealed activity in ray parenchyma and resin canal epithelial cells. Intensive staining was localised in the bordered pits of vertical and ray tracheids, and in the end walls of ray parenchyma cells.
Inducing gene expression of cardiac antioxidant enzymes by dietary phenolic acids in rats.
An increase in oxidative stress is suggested to be intimately involved in the pathogenesis of heart failure. Phenolic acids are widespread in plant foods and possess important biological and pharmacological properties. This study evaluated the role of phenolic acids in the expression of antioxidant enzymes in the heart of male Sprague-Dawley rats. Gallic acid, ferulic acid and p-coumaric acid at a dosage of 100 mg kg⁻¹ body weight significantly increased the activities of cardiac superoxide dismutase, glutathione peroxidase (GPx) and catalase (CAT) as compared with control rats (P<.05). The changes in cardiac CuZnSOD, GPx and CAT mRNA levels induced by phenolic acids were similar to those noted in the enzyme activity levels. A significant (P<.05) increase in the GSH/GSSG ratio was observed in the hearts of phenolic acid-treated rats. Heart homogenates from rats administered phenolic acids displayed significant (P<.05) increases in oxygen radical absorbance capacity compared with control rats. Immunoblot analysis revealed an increased total cardiac level of Nrf2 in phenolic acid-treated rats. Interestingly, phenolic acid-mediated antioxidant enzyme expression was accompanied by up-regulation of heme oxygenase-1. This study demonstrates that antioxidant enzymes in rat cardiac tissue can be significantly induced by phenolic acids following oral administration.
Differential neural circuitry and self-interest in real vs hypothetical moral decisions
Classic social psychology studies demonstrate that people can behave in ways that contradict their intentions, especially within the moral domain. We measured brain activity while subjects decided between financial self-benefit (earning money) and preventing physical harm (applying an electric shock) to a confederate under both real and hypothetical conditions. We found a shared neural network associated with empathic concern for both types of decisions. However, hypothetical and real moral decisions also recruited distinct neural circuitry: hypothetical moral decisions mapped closely onto the imagination network, while real moral decisions elicited activity in the bilateral amygdala and anterior cingulate, areas essential for social and affective processes. Moreover, during real moral decision-making, distinct regions of the prefrontal cortex (PFC) determined whether subjects made selfish or pro-social moral choices. Together, these results reveal not only differential neural mechanisms for real and hypothetical moral decisions but also that the nature of real moral decisions can be predicted by dissociable networks within the PFC.
Evaluation of Electronic Cigarette Use (Vaping) Topography and Estimation of Liquid Consumption: Implications for Research Protocol Standards Definition and for Public Health Authorities’ Regulation
BACKGROUND Although millions of people are using electronic cigarettes (ECs) and research on this topic has intensified in recent years, the pattern of EC use has not been systematically studied. Additionally, no comparative measure of exposure and nicotine delivery between ECs and tobacco cigarettes or nicotine replacement therapies (NRTs) has been established. This is important, especially in the context of the proposal for a new Tobacco Product Directive issued by the European Commission. METHODS A second-generation EC device, consisting of a higher-capacity battery and tank atomiser design compared to smaller cigarette-like batteries and cartomizers, and a 9 mg/mL nicotine-concentration liquid were used in this study. Eighty subjects were recruited: 45 experienced EC users and 35 smokers. EC users were video-recorded when using the device (ECIG group), while smokers were recorded when smoking (SM-S group) and when using the EC (SM-E group) in a randomized cross-over design. Puff, inhalation and exhalation durations were measured. Additionally, the amount of EC liquid consumed by experienced EC users was measured at 5 min (similar to the time needed to smoke one tobacco cigarette) and at 20 min (similar to the time needed for a nicotine inhaler to deliver 4 mg nicotine). RESULTS Puff duration was significantly higher in ECIG (4.2 ± 0.7 s) compared to SM-S (2.1 ± 0.4 s) and SM-E (2.3 ± 0.5 s), while inhalation time was lower (1.3 ± 0.4 s, 2.1 ± 0.4 s and 2.1 ± 0.4 s, respectively). No difference was observed in exhalation duration. EC users took 13 puffs and consumed 62 ± 16 mg liquid in 5 min; they took 43 puffs and consumed 219 ± 56 mg liquid in 20 min. Nicotine delivery was estimated at 0.46 ± 0.12 mg after 5 min and 1.63 ± 0.41 mg after 20 min of use. Therefore, 20.8 mg/mL and 23.8 mg/mL nicotine-containing liquids would deliver 1 mg of nicotine in 5 min and 4 mg of nicotine in 20 min, respectively. Since the ISO method significantly underestimates nicotine delivery by tobacco cigarettes, it seems that liquids with a nicotine concentration even higher than 24 mg/mL would be comparable to one tobacco cigarette. CONCLUSIONS EC use topography is significantly different from smoking. Four-second puffs with 20-30 s interpuff intervals should be used when assessing EC effects in laboratory experiments, provided that the equipment used does not overheat. Based on the characteristics of the device used in this study, a 20 mg/mL nicotine-concentration liquid would be needed in order to deliver nicotine at amounts similar to the maximum allowable content of one tobacco cigarette (as measured by the ISO 3308 method). The results of this study do not support the statement of the European Commission Tobacco Product Directive that liquids with a nicotine concentration of 4 mg/mL are comparable to NRTs in the amount of nicotine delivered to the user.
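The nicotine-delivery estimates above follow from the liquid mass consumed and the liquid's nicotine concentration. The short calculation below reconstructs them, assuming a liquid density of about 1.2 g/mL (an assumption on our part; propylene-glycol/glycerol mixtures fall near this value, and the abstract does not state the figure used).

```python
# Reconstructing the nicotine-delivery estimate from liquid consumption.
# density is an assumed value (~1.2 g/mL for PG/VG liquids), not from the abstract.
density = 1.2          # g/mL, assumed
conc = 9.0             # mg nicotine per mL of liquid (study liquid)

def nicotine_mg(liquid_mg):
    """Nicotine delivered (mg) from a given mass (mg) of consumed liquid."""
    return (liquid_mg / 1000.0) / density * conc

print(nicotine_mg(62))    # ~0.47 mg in 5 min  (abstract: 0.46 +/- 0.12 mg)
print(nicotine_mg(219))   # ~1.64 mg in 20 min (abstract: 1.63 +/- 0.41 mg)

# Inverting: concentration needed to deliver a target dose in the same time.
def conc_needed(target_mg, liquid_mg):
    return target_mg / ((liquid_mg / 1000.0) / density)

print(conc_needed(1.0, 62))   # ~19.4 mg/mL for 1 mg in 5 min  (abstract: 20.8)
print(conc_needed(4.0, 219))  # ~21.9 mg/mL for 4 mg in 20 min (abstract: 23.8)
```

The small gaps between these back-of-envelope values and the abstract's 20.8 and 23.8 mg/mL figures presumably reflect rounding of the reported means and the study's actual density value.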
New Wideband Transition From Microstrip Line to Substrate Integrated Waveguide
A new wideband transition from microstrip line to substrate integrated waveguide (SIW) is introduced. Unlike most transitions that show reduced return loss over significant parts of a regular waveguide band, the presented configuration achieves return losses better than 30 dB in standard waveguide frequency bands from X to E. The new aspect of this transition is the addition of two vias to the widely used microstrip taper transition. Moreover, the influence of the substrate height is demonstrated. The results in each frequency band are compared with the data for the regular microstrip taper alone. A design formula for the placement of the vias and taper dimensions is presented and demonstrated to provide excellent results. The structures are simulated and optimized with CST Microwave Studio. Measurements performed on a Ku-band back-to-back prototype transition demonstrate a minimum return loss of 26.05 dB and maximum insertion loss of 0.821 dB over the entire Ku-band, thus validating the design approach.
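For orientation on SIW dimensioning, the sketch below converts a via-row spacing into an equivalent rectangular-waveguide width and the resulting TE10 cutoff frequency. It uses the widely cited empirical relation a_eff = a − d²/(0.95·p); this is a textbook formula, not the transition design formula of the paper, which the abstract does not reproduce, and the example numbers are invented.

```python
# Equivalent-width estimate for a substrate integrated waveguide (SIW),
# using the common empirical relation a_eff = a - d^2 / (0.95 * p).
from math import sqrt

c0 = 299_792_458.0          # speed of light, m/s

def siw_te10_cutoff(a, d, p, eps_r):
    """a: via-row separation (m), d: via diameter (m), p: via pitch (m)."""
    a_eff = a - d**2 / (0.95 * p)            # equivalent waveguide width
    return c0 / (2.0 * a_eff * sqrt(eps_r))  # TE10 cutoff frequency, Hz

# Example numbers (invented, roughly Ku-band on an eps_r = 2.2 substrate):
print(siw_te10_cutoff(a=11e-3, d=0.8e-3, p=1.5e-3, eps_r=2.2) / 1e9, "GHz")
# -> about 9.6 GHz, close to the 9.49 GHz cutoff of WR-62 Ku-band waveguide
```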
Dynamic Analysis of a Nonholonomic Two-Wheeled Inverted Pendulum Robot
As robots proliferate in various fields, the mechanical stability of specific robots has become an important subject of research. This study concerns the development of a two-wheeled inverted pendulum robot that can be applied to an intelligent, mobile home robot. This kind of robotic mechanism is inherently unstable, which makes stabilizing the robot’s body posture difficult. To analyze and control this mechanism, we derived its exact dynamics with the aid of 3-DOF modeling. Using the governing equations of motion, we analyzed the dynamics on an inclined surface as well as the effect of turning motion on the stability of the robot. For the experiments, the robot was constructed with various sensors. Its applicability to a two-dimensional floor environment was confirmed by experiments on balancing, rectilinear motion, and spinning motion.
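As a minimal planar sketch of why such a mechanism needs active stabilization (this is the familiar linearized cart-pole form, not the paper's full 3-DOF nonholonomic model), the code below builds the state-space model about the upright equilibrium and stabilizes it with pole-placement state feedback; all parameter values and pole locations are illustrative assumptions.

```python
# Minimal planar wheeled-inverted-pendulum sketch (linearized, cart-pole form).
# Parameters and pole locations are illustrative, not the paper's model.
import numpy as np
from scipy.signal import place_poles

M, m, l, g = 1.0, 0.2, 0.3, 9.81   # base mass, body mass (kg), COM height (m)

# State x = [position, velocity, tilt angle, tilt rate]; input u = drive force.
A = np.array([[0, 1, 0, 0],
              [0, 0, -m*g/M, 0],
              [0, 0, 0, 1],
              [0, 0, (M + m)*g/(M*l), 0]])
B = np.array([[0.0], [1.0/M], [0.0], [-1.0/(M*l)]])

# State feedback u = -K x via pole placement (pole choices are arbitrary).
K = place_poles(A, B, [-2.0, -2.5, -3.0, -3.5]).gain_matrix

# Forward-Euler simulation from a 0.1 rad initial tilt.
x, dt = np.array([0.0, 0.0, 0.1, 0.0]), 0.001
for _ in range(5000):
    u = float(-K @ x)
    x = x + dt * (A @ x + B.ravel() * u)
print(x)   # state decays toward the upright equilibrium [0, 0, 0, 0]
```

The open-loop A matrix has a positive eigenvalue (the tilt mode), which is the formal statement of the "inherently unstable" behavior the abstract describes; inclined surfaces and turning add further coupling terms that the paper's 3-DOF model captures.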
Double-blind, placebo-controlled, randomized phase II study of TJ-14 (Hangeshashinto) for infusional fluorinated-pyrimidine-based colorectal cancer chemotherapy-induced oral mucositis
Hangeshashinto (TJ-14, a Kampo medicine), which reduces the level of prostaglandin E2 and affects cyclooxygenase activity, alleviates chemotherapy-induced oral mucositis (COM). We conducted a double-blind, placebo-controlled, randomized comparative trial to investigate whether TJ-14 prevents and controls COM in patients with colorectal cancer. Ninety-three patients with colorectal cancer who developed moderate-to-severe COM (WHO grade ≥1) during any cycle of chemotherapy using FOLFOX, FOLFIRI, and/or XELOX treatment were randomly assigned to receive either TJ-14 (n = 46) or placebo (n = 47). Patients received placebo or TJ-14 for 2 weeks at the start of the next course of chemotherapy. Patients were assessed three times per week for safety and for COM incidence and severity using the WHO grading. Ninety eligible patients (TJ-14: 43; placebo: 47) were included in the per-protocol analysis after unblinding. Although the incidence of grade ≥2 oral mucositis was lower for patients treated with TJ-14 than for those treated with placebo, the difference was not significant (48.8 vs. 57.4 %; p = 0.41). The median duration of grade ≥2 mucositis was 5.5 versus 10.5 days (p = 0.018). No difference in other treatment toxicities was observed between the two groups, and patients exhibited high dosing compliance. The present study did not meet its primary endpoint. However, TJ-14 demonstrated a significant effect on the duration of grade ≥2 mucositis in patients with colorectal cancer compared to the placebo.