Dataset Preview
id (string) | title (string) | abstract (string) | keyphrases (json) | prmu (json)
Towards a NMR implementation of a quantum lattice gas algorithm
Recent theoretical results suggest that an array of quantum information processors communicating via classical channels can be used to solve fluid dynamics problems. Quantum lattice-gas algorithms (QLGA) running on such architectures have been shown to solve the diffusion equation and the nonlinear Burgers equations. In this report, we describe progress towards an ensemble nuclear magnetic resonance (NMR) implementation of a QLGA that solves the diffusion equation. The methods rely on NMR techniques to encode an initial mass density into an ensemble of two-qubit quantum information processors. Using standard pulse techniques, the mass density can then be manipulated and evolved through the steps of the algorithm. We provide the experimental results of our first attempt to realize the NMR implementation. The results qualitatively follow the ideal simulation, but the observed implementation errors highlight the need for improved control
[ "NMR implementation", "quantum lattice gas algorithm", "quantum information processors", "fluid dynamics problems", "diffusion equation", "nonlinear Burgers equations", "nuclear magnetic resonance", "two-qubit quantum information.processors" ]
[ "P", "P", "P", "P", "P", "P", "P", "M" ]
Banking on SMA funds [separately managed accounts]
From investment management to technology to back-office services, outsourcers are elbowing their way into the SMA business. Small banks are paying attention-and hoping to reap the rewards
[ "separately managed accounts", "investment management", "technology", "back-office services", "outsourcers", "small banks" ]
[ "P", "P", "P", "P", "P", "P" ]
Design methodology for diagnostic strategies for industrial systems
This paper presents a method for the construction of diagnostic systems for complex industrial applications. The approach has been explicitly developed to shorten the design cycle and meet some specific requirements, such as modularity, flexibility, and the possibility of merging many different sources of information. The method allows one to consider multiple simultaneous failures and is specifically designed to ease the coordination and simplification of local diagnostic algorithms developed by different teams
[ "design methodology", "diagnostic strategies", "industrial systems", "modularity", "local diagnostic algorithms" ]
[ "P", "P", "P", "P", "P" ]
A shy invariant of graphs
Moving from a well known result of P.L. Hammer et al. (1982), we introduce a new graph invariant, say lambda (G) referring to any graph G. It is a non-negative integer which is non-zero whenever G contains particular induced odd cycles or, equivalently, admits a particular minimum clique-partition. We show that lambda (G) can be efficiently evaluated and that its determination allows one to reduce the hard problem of computing a minimum clique-cover of a graph to an identical problem of smaller size and special structure. Furthermore, one has alpha (G) <or= theta (G) - lambda (G), where alpha (G) and theta (G) respectively denote the cardinality of a maximum stable set of G and of a minimum clique-partition of G
[ "graph invariant", "induced odd cycles", "minimum clique-partition", "minimum clique-cover", "cardinality", "maximum stable set" ]
[ "P", "P", "P", "P", "P", "P" ]
PacketVideo. One step ahead of the streaming wireless market
Go beyond the hype, however, and it's clear that PacketVideo is making strides in delivering streaming multimedia content to wireless devices. For one thing, its technology, based on the industry-standard Moving Picture Experts Group 4 (MPEG-4) video encoder/decoder, actually works as promised. Secondly, the company has forged a broad-based band of alliances that not only will eventually help it reach potential customers down the road, but also provide it with financial support until the company can ramp up sales. The list of PacketVideo's technology partners who are also investors-and who have pumped more than $121 million into the company-includes not just wireless device manufacturers, but content providers and semiconductor vendors, all of whom stand to benefit by increased sales of handheld wireless terminals
[ "PacketVideo", "wireless devices", "MPEG-4", "wireless device manufacturers", "content providers", "semiconductor vendors", "handheld wireless terminals", "multimedia content streaming" ]
[ "P", "P", "P", "P", "P", "P", "P", "R" ]
Universal simulation of Hamiltonian dynamics for quantum systems with finite-dimensional state spaces
What interactions are sufficient to simulate arbitrary quantum dynamics in a composite quantum system? Dodd et al. [Phys. Rev. A 65, 040301(R) (2002)] provided a partial solution to this problem in the form of an efficient algorithm to simulate any desired two-body Hamiltonian evolution using any fixed two-body entangling N-qubit Hamiltonian, and local unitaries. We extend this result to the case where the component systems are qudits, that is, have D dimensions. As a consequence we explain how universal quantum computation can be performed with any fixed two-body entangling N-qudit Hamiltonian, and local unitaries
[ "universal simulation", "Hamiltonian dynamics", "quantum systems", "quantum dynamics", "composite quantum system", "two-body Hamiltonian evolution", "fixed two-body entangling N-qubit Hamiltonian", "local unitaries", "universal quantum computation", "fixed two-body entangling N-qudit Hamiltonian", "finite- dimensional state spaces", "D-dimensional component systems" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "M" ]
H/sub 2/ optimization of the three-element type dynamic vibration absorbers
The dynamic vibration absorber (DVA) is a passive vibration control device which is attached to a vibrating body (called a primary system) subjected to exciting force or motion. In this paper, we will discuss an optimization problem of the three-element type DVA on the basis of the H/sub 2/ optimization criterion. The objective of the H/sub 2/ optimization is to reduce the total vibration energy of the system for overall frequencies; the total area under the power spectrum response curve is minimized in this criterion. If the system is subjected to random excitation instead of sinusoidal excitation, then the H/sub 2/ optimization is probably more desirable than the popular H/sub infinity / optimization. In the past decade there has been increasing interest in the three-element type DVA. However, most previous studies on this type of DVA were based on the H/sub infinity / optimization design, and no algebraic solution had yet been found. We found a closed-form exact solution for a special case where the primary system has no damping. Furthermore, the general case solution including the damped primary system is presented in the form of a numerical solution. The optimum parameters obtained here are compared to those of the conventional Voigt type DVA. They are also compared to other optimum parameters based on the H/sub infinity / criterion
[ "H/sub 2/ optimization", "three-element type dynamic vibration absorbers", "passive vibration control", "power spectrum response", "Voigt type dynamic vibration absorber" ]
[ "P", "P", "P", "P", "R" ]
The semi-algebraic theory of stochastic games
The asymptotic behavior of the min-max value of a finite-state zero-sum discounted stochastic game, as the discount rate approaches 0, has been studied in the past using the theory of real-closed fields. We use the theory of semi-algebraic sets and mappings to prove some asymptotic properties of the min-max value, which hold uniformly for all stochastic games in which the number of states and players' actions are predetermined to some fixed values. As a corollary, we prove a uniform polynomial convergence rate of the value of the N-stage game to the value of the nondiscount game, over a bounded set of payoffs
[ "asymptotic behavior", "min-max value", "finite-state zero-sum discounted stochastic game", "discount rate", "uniform polynomial convergence rate", "N-stage game", "semi-algebraic set theory", "two-player zero-sum finite-state stochastic games" ]
[ "P", "P", "P", "P", "P", "P", "R", "M" ]
Polarization of the RF field in a human head at high field: a study with a quadrature surface coil at 7.0 T
The RF field intensity distribution in the human brain becomes inhomogeneous due to wave behavior at high field. This is further complicated by the spatial distribution of RF field polarization that must be considered to predict image intensity distribution. An additional layer of complexity is involved when a quadrature coil is used for transmission and reception. To study such complicated RF field behavior, a computer modeling method was employed to investigate the RF field of a quadrature surface coil at 300 MHz. Theoretical and experimental results for a phantom and the human head at 7.0 T are presented. The results are theoretically important and practically useful for high-field quadrature coil design and application
[ "quadrature surface coil", "7.0 T", "RF field intensity distribution", "human brain", "spatial distribution", "RF field polarization", "image intensity distribution", "computer modeling", "300 MHz", "high field MRI", "high-field coil design", "whole-body MRI", "phantom samples", "segmented images", "3D multitissue head model", "gradient echo images", "finite difference time domain method", "Maxwell wave equations", "reception fields", "transmission fields" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "R", "U", "M", "M", "M", "M", "M", "M", "R", "R" ]
Construction of information retrieval thesaurus for family planning terms using CDS/ISIS
The thesaurus as a tool for information retrieval and as an alternative to the existing scheme of classifications in information retrieval is discussed. The paper considers the emergence of the information retrieval thesaurus and its definition. Family planning is a multidisciplinary subject covering socio-economic, cultural, psychological and medical fields. This necessitated the construction of a thesaurus for the Family Planning discipline. The construction is based on UNISIST, ISO 2788 and BS 5723 guidelines by using CDS/ISIS software
[ "information retrieval", "thesaurus", "family planning terms", "Family Planning", "classification", "culture", "psychology", "UNISIST", "ISO 2788", "BS 5723", "CDS/ISIS software", "bibliographic databases", "socio economic field", "medicine" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U", "R", "U" ]
Taking back control [SCADA system]
The most common way to implement a SCADA system is to go outside. However, in the author's opinion, to truly take control of a SCADA project, in-house personnel should handle as much of the job as possible. This includes design, equipment specification, installation, and programming. The more of these tasks one does in-house, the more control and ownership one has. To accomplish this, we first evaluated the existing SCADA system and investigated new technologies to establish a list of features the new system needed to incorporate
[ "SCADA", "supervisory control", "data acquisition", "in-house integration", "compatibility", "programmable logic controllers" ]
[ "P", "M", "U", "M", "U", "M" ]
Dot-Net makes slow progress
Microsoft's Windows .Net Enterprise Server Release Candidate 1, which was released at the end of last month, provides an early glimpse of the system that will eventually replace Windows 2000 Advanced Server. The software has been improved so that Active Directory is more flexible and easier to deploy; and security, scalability and management have also been enhanced
[ "Windows .Net Enterprise Server", "Active Directory", "security", "scalability" ]
[ "P", "P", "P", "P" ]
Distributed servers approach for large-scale secure multicast
In order to offer backward and forward secrecy for multicast applications (i.e., a new member cannot decrypt the multicast data sent before its joining and a former member cannot decrypt the data sent after its leaving), the data encryption key has to be changed whenever a user joins or leaves the system. Such a change has to be made known to all the current users. The bandwidth used for such re-key messaging can be high when the user pool is large. We propose a distributed servers approach to minimize the overall system bandwidth (and complexity) by splitting the user pool into multiple groups each served by a (logical) server. After presenting an analytic model for the system based on a hierarchical key tree, we show that there is an optimal number of servers to achieve minimum system bandwidth. As the underlying user traffic fluctuates, we propose a simple dynamic scheme with low overhead where a physical server adaptively splits and merges its traffic into multiple groups each served by a logical server so as to minimize its total bandwidth. Our results show that a distributed servers approach is able to substantially reduce the total bandwidth required as compared with the traditional single-server approach, especially for those applications with a large user pool, short holding time, and relatively low bandwidth of a data stream, as in the Internet stock quote applications
[ "distributed servers", "large-scale secure multicast", "forward secrecy", "multicast applications", "data encryption key", "re-key messaging", "system bandwidth", "hierarchical key tree", "user traffic", "short holding time", "Internet stock quote applications", "backward secrecy", "system complexity", "traffic merging", "dynamic split-and-merge scheme", "key management" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "M", "M" ]
Cooperative three- and four-player quantum games
A cooperative multi-player quantum game played by 3 and 4 players has been studied. A quantum superposed operator is introduced in this work which solves the non-zero sum difficulty in previous treatments. The role of quantum entanglement of the initial state is discussed in detail
[ "quantum superposed operator", "quantum entanglement", "initial state", "cooperative three-player quantum games", "cooperative four-player quantum games", "nonzero sum difficulty" ]
[ "P", "P", "P", "M", "R", "M" ]
A Blog in every law firm?
You don't know today what you'll want to know next year. Rather than trying to solve that problem, focus on providing simple tools to users that create valuable content across the firm. Individual contributions will be more visible, and you will have a searchable archive of your institutional memory and a simplified process for ensuring everyone is up to speed. Whether you embrace weblogs for their individual or institutional benefits, one thing is certain: They will become powerful tools for those who seek ways to more efficiently and intelligently manage information
[ "law firm", "institutional memory", "weblogs", "Web site" ]
[ "P", "P", "P", "U" ]
Grey-box model identification via evolutionary computing
This paper presents an evolutionary grey-box model identification methodology that makes the best use of a priori knowledge on a clear-box model with a global structural representation of the physical system under study, whilst incorporating accurate black-box models for immeasurable and local nonlinearities of a practical system. The evolutionary technique is applied to building dominant structural identification with local parametric tuning without the need of a differentiable performance index in the presence of noisy data. It is shown that the evolutionary technique provides an excellent fitting performance and is capable of accommodating multiple objectives such as to examine the relationships between model complexity and fitting accuracy during the model building process. Validation results show that the proposed method offers robust, uncluttered and accurate models for two practical systems. It is expected that this type of grey-box models will accommodate many practical engineering systems for a better modelling accuracy
[ "grey-box models", "system identification", "evolutionary algorithms", "genetic evolution", "multiobjective optimisation", "hydraulic system", "nonlinear system" ]
[ "P", "R", "M", "U", "U", "M", "R" ]
A genetic approach to the optimization of automatic generation control parameters for power systems
This paper presents a method based on genetic algorithm for the automatic generation control of power systems. The technique is applied to control a system, which includes two areas tied together through a power line. As a consequence of continuous load variation, the frequency of the power system changes with time. In conventional studies, frequency transients are minimized by using integral controllers and thus zero steady-state error is obtained. In this paper, integral controller gains and frequency bias factors are determined by using the genetic algorithm. The simulation results show that the genetic algorithm, which is easy to implement, finds the global optimum values of the control parameters
[ "genetic algorithm", "power line", "continuous load variation", "frequency transients", "integral controller gains", "frequency bias factors", "power systems automatic generation control parameters optimization", "control design", "interconnected power networks", "control simulation" ]
[ "P", "P", "P", "P", "P", "P", "R", "M", "M", "R" ]
Buying into the relationship [business software]
Choosing the right software to improve business processes can have a huge impact on a company's efficiency and profitability. While it is sometimes hard to get beyond vendor hype about software features and functionality and know what to realistically expect, it is even more difficult to determine if the vendor is the right vendor to partner with. Thus picking the right software is important, but companies have to realize that what they are really buying into is a relationship with the vendor
[ "business software", "functionality", "vendor relationship", "management", "software evaluation" ]
[ "P", "P", "R", "U", "M" ]
Multi-agent collaboration for B2B workflow monitoring
Business-to-business (B2B) application environments are exceedingly dynamic and competitive. This dynamism is manifested in the form of changing process requirements and time constraints. However, current workflow management technologies have difficulties trying to solve problems, such as: how to deal with the dynamic nature of B2B commerce processes, how to manage the distributed knowledge and resources, and how to reduce the transaction risk. In this paper, a collaborative multi-agent system is proposed. Multiple intelligent agents in our system can work together not only to identify the workflow problems, but also to solve such problems, by applying business rules, such as re-organizing the procurement and the transaction processes, and making necessary workflow process changes
[ "multi-agent collaboration", "B2B workflow monitoring", "changing process requirements", "time constraints", "workflow management", "transaction risk", "business rules", "Internet", "business-to-business applications", "electronic commerce" ]
[ "P", "P", "P", "P", "P", "P", "P", "U", "R", "M" ]
A synergic analysis for Web-based enterprise resources planning systems
As the central nervous system for managing an organization's mission and critical business data, Enterprise Resource Planning (ERP) system has evolved to become the backbone of e-business implementation. Since an ERP system is multimodule application software that helps a company manage its important business functions, it should be versatile enough to automate every aspect of business processes, including e-business
[ "synergic analysis", "Web-based enterprise resources planning", "Enterprise Resource Planning", "ERP", "e-business", "customer relationship management" ]
[ "P", "P", "P", "P", "P", "M" ]
SRP rolls out reliability and asset management initiative
Reliability planning analysis at the Salt River Project (SRP, Tempe, Arizona, US) prioritizes geographic areas for preventive inspections based on a cost benefit model. However, SRP wanted a new application system to prioritize inspections and to predict when direct buried cable would fail using the same cost benefit model. In the business cases, the represented type of kilowatt load (residential, commercial or critical circuit) determines the cost benefit per circuit. The preferred solution was to develop a geographical information system (GIS) application allowing for a circuit query for the specific geographic areas it crosses and the density of load points of a given type within those areas. The query returns results based on the type of equipment analysis execution: wood pole, preventive maintenance for a line or cable replacement. This differentiation ensures that all the facilities relevant to a specific analysis type influence prioritization of the geographic areas
[ "reliability planning analysis", "Salt River Project", "Tempe", "Arizona", "geographic areas", "preventive inspections", "cost benefit model", "direct buried cable", "geographical information system", "GIS", "equipment analysis execution", "wood pole", "cable replacement", "USA", "condition monitoring" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U", "U" ]
Multiecho segmented EPI with z-shimmed background gradient compensation (MESBAC) pulse sequence for fMRI
A MultiEcho Segmented EPI with z-shimmed BAckground gradient Compensation (MESBAC) pulse sequence is proposed and validated for functional MRI (fMRI) study in regions suffering from severe susceptibility artifacts. This sequence provides an effective tradeoff between spatial and temporal resolution and reduces image distortion and signal dropout. The blood oxygenation level-dependent (BOLD)-weighted fMRI signal can be reliably obtained in the region of the orbitofrontal cortex (OFC). To overcome physiological motion artifacts during prolonged multisegment EPI acquisition, two sets of navigator echoes were acquired in both the readout and phase-encoding directions. Ghost artifacts generally produced by single-shot EPI acquisition were eliminated by separately placing the even and odd echoes in different k-space trajectories. Unlike most z-shim methods that focus on increasing temporal resolution for event-related functional brain mapping, the MESBAC sequence simultaneously addresses problems of image distortion and signal dropout while maintaining sufficient temporal resolution. The MESBAC sequence will be particularly useful for pharmacological and affective fMRI studies in brain regions such as the OFC, nucleus accumbens, amygdala, para-hippocampus, etc
[ "multiecho segmented EPI", "z-shimmed background gradient compensation", "fMRI", "severe susceptibility artifacts", "temporal resolution", "image distortion", "signal dropout", "orbitofrontal cortex", "navigator echoes", "ghost artifacts", "event-related functional brain mapping", "gradient compensation pulse sequence", "BOLD-weighted signal", "spatial resolution" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "R" ]
Phase conditions for Schur polynomials
The rate of change of phase of a real or complex Schur polynomial, evaluated along the unit circle traversed counterclockwise, is strictly positive. For polynomials with real coefficients, this bound can be tightened. These and some other fundamental bounds on the rate of change of phase are derived here, using the Tchebyshev representation of the image of a real polynomial evaluated on the unit circle
[ "phase conditions", "Schur polynomial", "rate of change of phase", "real coefficients", "Tchebyshev representation", "phase monotonicity", "robust stability", "discrete-time control systems", "stabilization" ]
[ "P", "P", "P", "P", "P", "M", "U", "U", "U" ]
Four-point wavelets and their applications
Multiresolution analysis (MRA) and wavelets provide useful and efficient tools for representing functions at multiple levels of details. Wavelet representations have been used in a broad range of applications, including image compression, physical simulation and numerical analysis. In this paper, the authors construct a new class of wavelets, called four-point wavelets, based on an interpolatory four-point subdivision scheme. They are of local support, symmetric and stable. The analysis and synthesis algorithms have linear time complexity. Depending on different weight parameters w, the scaling functions and wavelets generated by the four-point subdivision scheme are of different degrees of smoothness. Therefore the user can select better wavelets relevant to the practice among the classes of wavelets. The authors apply the four-point wavelets in signal compression. The results show that the four-point wavelets behave much better than B-spline wavelets in many situations
[ "four-point wavelets", "multiresolution analysis", "wavelet representations", "image compression", "physical simulation", "numerical analysis", "interpolatory four-point subdivision scheme", "linear time complexity", "weight parameters", "scaling functions", "B-spline wavelets" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
Effect of multileaf collimator leaf width on physical dose distributions in the treatment of CNS and head and neck neoplasms with intensity modulated radiation therapy
The purpose of this work is to examine physical radiation dose differences between two multileaf collimator (MLC) leaf widths (5 and 10 mm) in the treatment of CNS and head and neck neoplasms with intensity modulated radiation therapy (IMRT). Three clinical patients with CNS tumors were planned with two different MLC leaf sizes, 5 and 10 mm, representing Varian-120 and Varian-80 Millennium multileaf collimators, respectively. Two sets of IMRT treatment plans were developed. The goal of the first set was radiation dose conformality in three dimensions. The goal for the second set was organ avoidance of a nearby critical structure while maintaining adequate coverage of the target volume. Treatment planning utilized the CadPlan/Helios system (Varian Medical Systems, Milpitas CA) for dynamic MLC treatment delivery. All beam parameters and optimization (cost function) parameters were identical for the 5 and 10 mm plans. For all cases the number of beams, gantry positions, and table positions were taken from clinically treated three-dimensional conformal radiotherapy plans. Conformality was measured by the ratio of the planning isodose volume to the target volume. Organ avoidance was measured by the volume of the critical structure receiving greater than 90% of the prescription dose (V/sub 90/). For three patients with squamous cell carcinoma of the head and neck (T2-T4 N0-N2c M0) 5 and 10 mm leaf widths were compared for parotid preservation utilizing nine coplanar equally spaced beams delivering a simultaneous integrated boost. Because modest differences in physical dose to the parotid were detected, a NTCP model based upon the clinical parameters of Eisbruch et al. was then used for comparisons. The conformality improved in all three CNS cases for the 5 mm plans compared to the 10 mm plans. For the organ avoidance plans, V/sub 90/ also improved in two of the three cases when the 5 mm leaf width was utilized for IMRT treatment delivery. 
In the third case, both the 5 and 10 mm plans were able to spare the critical structure with none of the structure receiving more than 90% of the prescription dose, but in the moderate dose range, less dose was delivered to the critical structure with the 5 mm plan. For the head and neck cases both the 5 and 10*2.5 mm beamlets dMLC sliding window techniques spared the contralateral parotid gland while maintaining target volume coverage. The mean parotid dose was modestly lower with the smaller beamlet size (21.04 Gy vs 22.36 Gy). The resulting average NTCP values were 13.72% for 10 mm dMLC and 8.24% for 5 mm dMLC. In conclusion, five mm leaf width results in an improvement in physical dose distribution over 10 mm leaf width that may be clinically relevant in some cases. These differences may be most pronounced for single fraction radiosurgery or in cases where the tolerance of the sensitive organ is less than or close to the target volume prescription
[ "multileaf collimator leaf width", "physical dose distributions", "head and neck neoplasms", "intensity modulated radiation therapy", "10 mm", "CNS tumors", "treatment planning", "conformal radiotherapy", "parotid preservation", "5 mm", "beamlet size", "21.04 Gy", "22.36 Gy", "single fraction radiosurgery", "CNS neoplasms", "optimization parameters", "acceptable tumor coverage", "minimal toxicity", "collimator rotation" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "M", "U", "M" ]
Production capacity of flexible manufacturing systems with fixed production ratios
Determining the production capacity of flexible manufacturing systems is a very important issue in the design of such systems. We propose an approach for determining the production capacity (i.e. the maximum production rate) of a flexible manufacturing system with several part types, dedicated pallets, and fixed production ratios among the different part types. We show that the problem reduces to the determination of a single parameter for which we propose an iterative procedure. Simulation or approximate analytical techniques can be used as the building block performance evaluation technique in the iterative procedure
[ "production capacity", "flexible manufacturing systems", "fixed production ratios", "maximum production rate", "dedicated pallets", "iterative procedure", "simulation", "approximate analytical techniques", "building block performance evaluation technique", "multiple part type", "single parameter determination", "stability condition", "numerical experiments" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "R", "U", "U" ]
Ventilation-perfusion ratio of signal intensity in human lung using oxygen-enhanced and arterial spin labeling techniques
This study investigates the distribution of ventilation-perfusion (V/Q) signal intensity (SI) ratios using oxygen-enhanced and arterial spin labeling (ASL) techniques in the lungs of 10 healthy volunteers. Ventilation and perfusion images were simultaneously acquired using the flow-sensitive alternating inversion recovery (FAIR) method as volunteers alternately inhaled room air and 100% oxygen. Images of the T/sub 1/ distribution were calculated for five volunteers for both selective (T/sub 1f/) and nonselective (T/sub 1/) inversion. The average T/sub 1/ was 1360 ms+or-116 ms, and the average T/sub 1f/ was 1012 ms+or-112 ms, yielding a difference that is statistically significant (P<0.002). Excluding large pulmonary vessels, the average V/Q SI ratios were 0.355+or-0.073 for the left lung and 0.371+or-0.093 for the right lung, which are in agreement with the theoretical V/Q SI ratio. Plots of the V/Q SI ratio are similar to the logarithmic normal distribution obtained by multiple inert gas elimination techniques, with a range of ratios matching ventilation and perfusion. This MRI V/Q technique is completely noninvasive and does not involve ionizing radiation. A limitation of this method is the nonsimultaneous acquisition of perfusion and ventilation data, with oxygen administered only for the ventilation data
[ "ventilation-perfusion ratio", "signal intensity", "human lung", "arterial spin labeling techniques", "perfusion images", "flow-sensitive alternating inversion recovery", "logarithmic normal distribution", "multiple inert gas elimination", "MRI", "nonsimultaneous acquisition", "oxygen-enhanced techniques", "ventilation images", "gas exchange efficiency", "pathomechanisms", "time delay", "pixel-by-pixel maps", "pulmonary embolism", "chronic obstructive pulmonary disease" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "M", "U", "U", "U", "M", "M" ]
Laser-based internal profile measurement system
An automatic laser-based system to measure the internal profiles of various structures has been developed. The system uses a point laser source through a rotating optical device fixed onto a laser measurement meter. A notebook computer with custom software is used to control the laser meter and rotating device to estimate the scanned profile shape and to determine the resulting cross-section area. The information provided by this system is essential to the construction industry, including window and door builders; the glass, panel, board, and floor tile manufacturers; carpet venders; and building contractors for cost estimation and production control. As a result, the lead time for delivering the customized windowpanes, woodwork, floor tiles, and ceilings can be reduced. Applications of this system for measuring the shapes of window frames and floor plans are described and demonstrated. The measurement accuracy is evaluated and analyzed. Results have indicated that the measurement accuracy can be achieved within 4% of the measurement distance, for typical window designs and floor patterns required by major window manufacturers. Recommendations to improve the system are also included
[ "laser-based internal profile measurement system", "internal profiles", "point laser source", "rotating optical device", "laser meter", "rotating device", "floor tile manufacturers", "carpet venders", "building contractors", "cost estimation", "production control", "customized windowpanes", "window frames" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
Quadratic programming algorithms for large-scale model predictive control
Quadratic programming (QP) methods are an important element in the application of model predictive control (MPC). As larger and more challenging MPC applications are considered, more attention needs to be focused on the construction and tailoring of efficient QP algorithms. In this study, we tailor and apply a new QP method, called QPSchur, to large MPC applications, such as cross directional control problems in paper machines. Written in C++, QPSchur is an object oriented implementation of a novel dual space, Schur complement algorithm. We compare this approach to three widely applied QP algorithms and show that QPSchur is significantly more efficient (up to two orders of magnitude) than the other algorithms. In addition, detailed simulations are considered that demonstrate the importance of the flexible, object oriented construction of QPSchur, along with additional features for constraint handling, warm starts and partial solution
[ "quadratic programming algorithms", "large-scale model predictive control", "QPSchur", "cross directional control problems", "paper machines", "object oriented implementation", "simulations", "constraint handling", "warm starts", "partial solution", "dual space Schur complement algorithm", "flexible object oriented construction" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
War games: The truth [network security]
With al Qaeda on the tip of tongues around the world, find out how terror groups could target your network. What are the dangers and how do you fight them?
[ "networks", "security", "malicious attacks", "employees" ]
[ "P", "P", "U", "U" ]
Transcripts: bane or boon? [law reporting]
Because judge-made law, by its very nature, is less immediately accessible than the law of codified, statutory systems, it calls for an efficient system of law reporting. Of necessity, any such system will be selective, the majority of decisions going unreported. Considerable power thereby comes to repose in the hands of the law reporters. The author shares his invaluable perception and extensive research on the difficulties which arise from the excess of access to judgments
[ "transcripts", "law reporting", "judge-made law", "judgments" ]
[ "P", "P", "P", "P" ]
FinancialContent. Credibility is king
If you went to a site named, you'd probably expect to find, well, financial content. Maybe stock prices or company earnings or market charts or economic statistics or corporate news reports. Well, you'd be partially correct. does deal in financial information, but its main objective is not to distribute its financial content to individual investors, but to distribute it through other Web sites. In other words, FinancialContent is a wholesaler, not a retailer. As an aggregator, FinancialContent provides partner sites with financial information that is tailored to that individual Web site
[ "", "financial information", "Web sites", "aggregator", "partner sites" ]
[ "P", "P", "P", "P", "P" ]
Senate to Powell: regulate more [FCC]
FCC Chairman Michael Powell pitched a six-step market-based recovery plan to the Senate last week, but two members of the Commerce Committee told him telecom's revival requires more reliance on regulation
[ "FCC", "recovery plan", "US Senate Commerce Committee", "telecom industry" ]
[ "P", "P", "M", "M" ]
What do you say? Open letters to women considering a computer science major
In the last decade we have both monitored with great interest the ratio of female to male computer science majors at our respective institutions. With each entering class, we think: "Surely, now is the time when the numbers will become more balanced." Logic tells us that this must eventually happen, because the opportunities in computing are simply too attractive for an entire segment of our population to routinely pass up. But each year we are again disappointed in the number of women students, as they continue to be woefully under-represented among computer science majors. So, what do you say to a young woman who is considering a college choice and a choice of major in order to make computer science a more attractive option? We have organized some thoughts on that subject into open letters
[ "women", "computer science majors", "female", "male", "computer science education", "gender issues" ]
[ "P", "P", "P", "P", "M", "U" ]
Robust stability analysis for current-programmed regulators
Uncertainty models for the three basic switch-mode converters: buck, boost, and buck-boost are given in this paper. The resulting models are represented by linear fractional transformations with structured dynamic uncertainties. Uncertainties are assumed for the load resistance R=R/sub O/(1+ delta /sub R/), inductance L=L/sub O/(1+ delta /sub L/), and capacitance C=C/sub O/(1+ delta /sub C/). The interest in these models is clearly motivated by the need to have models for switch-mode DC-DC converters that are compatible with robust control analysis, which requires a model structure consisting of a nominal model and a norm-bounded modeling uncertainty. Therefore, robust stability analysis can be realized using standard mu-tools. At the end of the paper, an illustrative example is given which shows the simplicity of the procedure
[ "robust stability analysis", "current-programmed regulators", "uncertainty models", "linear fractional transformations", "structured dynamic uncertainties", "load resistance", "inductance", "capacitance", "switch-mode DC-DC converters", "control analysis", "nominal model", "norm-bounded modeling uncertainty", "buck converters", "boost converters", "buck-boost converters" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
A modal logic for indiscernibility and complementarity in information systems
In this paper, we study indiscernibility relations and complementarity relations in information systems. The first-order characterization of indiscernibility and complementarity is obtained through a duality result between information systems and certain structures of relational type characterized by first-order conditions. The modal analysis of indiscernibility and complementarity is performed through a modal logic whose modalities correspond to indiscernibility relations and complementarity relations in information systems
[ "modal logic", "indiscernibility", "complementarity", "information systems", "first-order characterization", "duality result", "relational type", "first-order conditions" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
A min-max theorem on feedback vertex sets
We establish a necessary and sufficient condition for the linear system {x : Hx >or= e, x >or= 0} associated with a bipartite tournament to be totally dual integral, where H is the cycle-vertex incidence matrix and e is the all-one vector. The consequence is a min-max relation on packing and covering cycles, together with strongly polynomial time algorithms for the feedback vertex set problem and the cycle packing problem on the corresponding bipartite tournaments. In addition, we show that the feedback vertex set problem on general bipartite tournaments is NP-complete and approximable within 3.5 based on the min-max theorem
[ "min-max theorem", "feedback vertex sets", "linear system", "bipartite tournament", "cycle-vertex incidence matrix", "all-one vector", "covering cycles", "strongly polynomial time algorithms", "feedback vertex set problem", "cycle packing problem", "necessary sufficient condition", "totally dual integral system", "NP-complete problem", "graphs", "combinatorial optimization problems", "linear programming duality theory" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "U", "M", "M" ]
Optimization of cutting conditions for single pass turning operations using a deterministic approach
An optimization analysis, strategy and CAM software for the selection of economic cutting conditions in single pass turning operations are presented using a deterministic approach. The optimization is based on criteria typified by the maximum production rate and includes a host of practical constraints. It is shown that the deterministic optimization approach involving mathematical analyses of constrained economic trends and graphical representation on the feed-speed domain provides a clearly defined strategy that not only provides a unique global optimum solution, but also the software that is suitable for on-line CAM applications. A numerical study has verified the developed optimization strategies and software and has shown the economic benefits of using optimization
[ "single pass turning operations", "deterministic approach", "CAM software", "economic cutting conditions", "maximum production rate", "mathematical analyses", "constrained economic trends", "cutting conditions optimization", "process planning" ]
[ "P", "P", "P", "P", "P", "P", "P", "R", "U" ]
Development of a computer-aided manufacturing system for profiled edge lamination tooling
Profiled edge lamination (PEL) tooling is a promising rapid tooling (RT) method involving the assembly of an array of laminations whose top edges are simultaneously profiled and beveled based on a CAD model of the intended tool surface. To facilitate adoption of this RT method by industry, a comprehensive PEL tooling development system is proposed. The two main parts of this system are: (1) iterative tool design based on thermal and structural models; and (2) fabrication of the tool using a computer-aided manufacturing (CAM) software and abrasive water jet cutting. CAM software has been developed to take lamination slice data (profiles) from any proprietary RP software in the form of polylines and create smooth, kinematically desirable cutting trajectories for each tool lamination. Two cutting trajectory algorithms, called identical equidistant profile segmentation and adaptively vector profiles projection (AVPP), were created for this purpose. By comparing the performance of both algorithms with a benchmark part shape, the AVPP algorithm provided better cutting trajectories for complicated tool geometries. A 15-layer aluminum PEL tool was successfully fabricated using a 5-axis CNC AWJ cutter and NC code generated by the CAM software
[ "profiled edge lamination tooling", "rapid tooling", "abrasive water jet cutting", "CAM software", "cutting trajectory algorithms", "identical equidistant profile segmentation", "adaptively vector profiles projection", "computer aided manufacturing" ]
[ "P", "P", "P", "P", "P", "P", "P", "M" ]
Keen but confused [workflow & content management]
IT users find workflow, content and business process management software appealing but by no means straightforward to implement. Pat Sweet reports on our latest research
[ "workflow", "content management", "business process management software", "research", "survey", "market overview" ]
[ "P", "P", "P", "P", "U", "U" ]
CRM: approaching zenith
Looks at how manufacturers are starting to warm up to the concept of customer relationship management. CRM has matured into what is expected to be big business. As CRM software evolves to its second, some say third, generation, it's likely to be more valuable to holdouts in manufacturing and other sectors
[ "CRM", "manufacturers", "customer relationship management", "manufacturing" ]
[ "P", "P", "P", "P" ]
Server safeguards tax service
Peterborough-based tax consultancy IE Taxguard wanted real-time failover protection for important Windows-based applications. Its solution was to implement a powerful failover server from UK supplier Neverfail in order to provide real-time backup for three core production servers
[ "tax consultancy", "IE Taxguard", "failover server", "Neverfail", "backup" ]
[ "P", "P", "P", "P", "P" ]
Modeling dynamic objects in distributed systems with nested Petri nets
Nested Petri nets (NP-nets) is a Petri net extension, allowing tokens in a net marking to be represented by marked nets themselves. The paper discusses applicability of NP-nets for modeling task planning systems, multi-agent systems and recursive-parallel systems. A comparison of NP-nets with some other formalisms, such as OPNs of R. Valk (2000), recursive parallel programs of O. Kushnarenko and Ph. Schnoebelen (1997) and process algebras is given. Some aspects of decidability for object-oriented Petri net extensions are also discussed
[ "distributed systems", "nested Petri nets", "multi-agent systems", "recursive-parallel systems", "process algebras", "decidability", "object-oriented Petri net", "dynamic objects modelling" ]
[ "P", "P", "P", "P", "P", "P", "P", "R" ]
Adaptive image enhancement for retinal blood vessel segmentation
Retinal blood vessel images are enhanced by removing the nonstationary background, which is adaptively estimated based on local neighbourhood information. The result is a much better segmentation of the blood vessels with a simple algorithm and without the need to obtain a priori illumination knowledge of the imaging system
[ "adaptive image enhancement", "retinal blood vessel images", "local neighbourhood information", "nonstationary background removal", "image segmentation", "personal identification", "security applications" ]
[ "P", "P", "P", "R", "R", "U", "U" ]
Efficient allocation of knowledge in distributed business structures
Accelerated business processes demand new concepts and realizations of information systems and knowledge databases. This paper presents the concept of the collaborative information space (CIS), which supplies the necessary tools to transform individual knowledge into collective useful information. The creation of 'information objects' in the CIS allows an efficient allocation of information in all business process steps at any time. Furthermore, the specific availability of heterogeneous, distributed data is realized by a Web-based user interface, which enables effective search by a multidimensionally hierarchical composition
[ "distributed business structures", "accelerated business processes", "information systems", "knowledge databases", "collaborative information space", "information objects", "business process steps", "Web-based user interface", "multidimensionally hierarchical composition", "efficient knowledge allocation", "heterogeneous distributed data", "interactive system" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "M" ]
Applying BGL to computational geometry
The author applies the Boost Graph Library to the domain of computational geometry. First, he formulates a concrete problem in graph terms. Second, he develops a way to transform the output of an existing algorithm into an appropriate Boost Graph Library data structure. Finally, he implements two new algorithms for his Boost Graph Library graph. The first algorithm gets the job done, but could have been written in any programming language. The second algorithm, however, shows the power of the Boost Graph Library's generic programming approach. Graphs, graphics, and generic programming combine in this novel use of the Boost Graph Library
[ "computational geometry", "Boost Graph Library", "generic programming approach", "Boost libraries", "C++", "threads", "smart pointers", "graph-theoretic concepts", "directed graph", "file dependencies", "BGL graph" ]
[ "P", "P", "P", "R", "U", "U", "U", "U", "M", "U", "R" ]
Accelerating filtering techniques for numeric CSPs
Search algorithms for solving Numeric CSPs (Constraint Satisfaction Problems) make extensive use of filtering techniques. In this paper we show how those filtering techniques can be accelerated by discovering and exploiting some regularities during the filtering process. Two kinds of regularities are discussed, cyclic phenomena in the propagation queue and numeric regularities of the domains of the variables. We also present in this paper an attempt to unify numeric CSPs solving methods from two distinct communities, that of CSP in artificial intelligence, and that of interval analysis
[ "filtering techniques", "Numeric CSPs", "search algorithms", "Constraint Satisfaction Problems", "propagation", "artificial intelligence", "interval analysis", "CSPs-solving", "extrapolation methods", "pruning" ]
[ "P", "P", "P", "P", "P", "P", "P", "U", "M", "U" ]
Virtual Development Center
The Virtual Development Center of the Institute for Women and Technology seeks to significantly enhance the impact of women on technology. It addresses this goal by increasing the number of women who have input on created technology, enhancing the ways people teach and develop technology, and developing need-based technology that serves the community. Through activities of the Virtual Development Center, a pattern is emerging regarding how computing technologies do or do not satisfy the needs of community groups, particularly those communities serving women. This paper describes the Virtual Development Center program and offers observations on the impact of computing technology on non-technical communities
[ "Virtual Development Center", "women", "teaching", "community groups", "information technology", "gender issues", "computer science education" ]
[ "P", "P", "P", "P", "M", "U", "M" ]
Access matters
Discusses accessibility needs of people with disabilities, both from the perspective of getting the information from I&R programs (including accessible Web sites, TTY access, Braille, and other mechanisms) and from the perspective of being aware of accessibility needs when referring clients to resources. Includes information on ADA legislation requiring accessibility to public places and recommends several organizations and Web sites for additional information
[ "accessibility needs", "accessible Web sites", "TTY access", "Braille", "ADA legislation", "public places", "disabled people", "information and referral programs" ]
[ "P", "P", "P", "P", "P", "P", "R", "M" ]
Exploratory study of the adoption of manufacturing technology innovations in the USA and the UK
Manufacturing technologies, appropriately implemented, provide competitive advantage to manufacturers. The use of manufacturing technologies across countries is difficult to compare. One such comparison has been provided in the literature with a study of US and Japanese practices in advanced manufacturing technology use using a common questionnaire. The present study compares the use of 17 different technologies in similar industries in the USA (n=1025) and UK (n=166) using a common questionnaire. Largely, there are remarkable similarities between the two countries. This may partly correlate with the heavy traffic in foreign direct investment between the two nations. Notable differences are (1) across-the-board, US manufacturers are ahead of the UK firms in computerized integration with units inside and outside manufacturing organizations; (2) US manufacturers show higher labour productivity, which is consistent with macro-economic data, and (3) more UK manufacturers report the use of soft technologies such as just-in-time, total quality manufacturing and manufacturing cells. Hypotheses for future investigation are proposed
[ "manufacturing technology innovations", "USA", "UK", "competitive advantage", "foreign direct investment", "labour productivity", "macro-economic data", "soft technologies", "just-in-time", "total quality manufacturing", "manufacturing cells" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
Work in progress: Developing policies for access to government information in the New South Africa
Following South Africa's transition to democracy in 1994, the SA government has adopted policies supporting freedom of expression and freedom of access to information. The Bill of Rights in the new Constitution includes a constitutional right of access to information held by the state. Since 1994 various initiatives have been taken by government and other bodies to promote such access. These include moves to reorganize government printing and publishing, restructure the government's public information services, make government information available on the Internet, and extend telephony and Internet access to poor communities. SA's new Legal Deposit Act (1997) makes provision for the creation of official publications depositories. The Promotion of Access to Information Act (2000) was enacted to ensure access to information held by the state and public bodies. However, despite much activity, it has proved difficult to translate principles into practical and well-coordinated measures to improve access to government information. A specific concern is the failure of policy-makers to visualize a role for libraries
[ "government information", "South Africa", "freedom of expression", "freedom of access to information", "Bill of Rights", "constitutional right of access", "government printing", "public information services", "Internet", "official publications depositories", "public bodies", "libraries", "government publishing" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
Networking without wires
Several types of devices use radio transmitters to send data over thin air. Are WLANs, wireless local area networks, the end to all cables? Will Dalrymple weighs up the costs and benefits
[ "wireless local area networks", "costs", "benefits" ]
[ "P", "P", "P" ]
Entangling atoms in bad cavities
We propose a method to produce entangled spin squeezed states of a large number of atoms inside an optical cavity. By illuminating the atoms with bichromatic light, the coupling to the cavity induces pairwise exchange of excitations which entangles the atoms. Unlike most proposals for entangling atoms by cavity QED, our proposal does not require the strong coupling regime g/sup 2// kappa Gamma >>1, where g is the atom cavity coupling strength, kappa is the cavity decay rate, and Gamma is the decay rate of the atoms. In this work the important parameter is Ng/sup 2// kappa Gamma , where N is the number of atoms, and our proposal permits the production of entanglement in bad cavities as long as they contain a large number of atoms
[ "bad cavities", "entangled spin squeezed states", "optical cavity", "coupling", "pairwise exchange", "excitations", "cavity QED", "strong coupling regime", "atom cavity coupling strength", "cavity decay rate", "atom entanglement", "bichromatic light illumination" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
Use of natural language processing to translate clinical information from a database of 889,921 chest radiographic reports
The aim was to evaluate translation of chest radiographic reports using natural language processing and to compare the findings with those in the literature. A natural language processor coded 10 years of narrative chest radiographic reports from an urban academic medical center. Coding for 150 reports was compared with manual coding. Frequencies and cooccurrences of 24 clinical conditions (diseases, abnormalities, and clinical states) were estimated. The ratio of right to left lung mass, association of pleural effusion with other conditions, and frequency of bullet and stab wounds were compared with independent observations. The sensitivity and specificity of the system's pneumothorax coding were compared with those of manual financial coding. Internal and external validation in this study confirmed the accuracy of natural language processing for translating chest radiographic narrative reports into a large database of information
[ "natural language processing", "urban academic medical center", "pleural effusion", "stab wounds", "pneumothorax coding", "chest radiographic report database", "clinical information translation", "clinical condition frequency", "clinical condition cooccurrence", "right to left lung mass ratio", "bullet wounds" ]
[ "P", "P", "P", "P", "P", "R", "R", "R", "R", "R", "R" ]
Choice preferences without inferences: subconscious priming of risk attitudes
We present a procedure for subconscious priming of risk attitudes. In Experiment 1, we were reliably able to induce risk-seeking or risk-averse preferences across a range of decision scenarios using this priming procedure. In Experiment 2, we showed that these priming effects can be reversed by drawing participants' attention to the priming event. Our results support claims that the formation of risk preferences can be based on preconscious processing, as for example postulated by the affective primacy hypothesis, rather than rely on deliberative mental operations, as posited by several current models of judgment and decision making
[ "choice preferences", "subconscious priming", "risk attitudes", "risk-averse preferences", "decision scenarios", "preconscious processing", "affective primacy hypothesis", "deliberative mental operations", "risk-seeking preferences" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
A self-organizing context-based approach to the tracking of multiple robot trajectories
We have combined competitive and Hebbian learning in a neural network designed to learn and recall complex spatiotemporal sequences. In such sequences, a particular item may occur more than once or the sequence may share states with another sequence. Processing of repeated/shared states is a hard problem that occurs very often in the domain of robotics. The proposed model consists of two groups of synaptic weights: competitive interlayer and Hebbian intralayer connections, which are responsible for encoding respectively the spatial and temporal features of the input sequence. Three additional mechanisms allow the network to deal with shared states: context units, neurons disabled from learning, and redundancy used to encode sequence states. The network operates by determining the current and the next state of the learned sequences. The model is simulated over various sets of robot trajectories in order to evaluate its storage and retrieval abilities, its sequence sampling effects, its robustness to noise and its fault tolerance
[ "self-organizing context-based approach", "robot trajectories", "Hebbian learning", "complex spatiotemporal sequences", "shared states", "synaptic weights", "Hebbian intralayer connections", "context units", "sequence states", "retrieval abilities", "sequence sampling effects", "trajectories tracking", "competitive learning", "competitive interlayer connections", "unsupervised learning", "storage abilities", "fault tolerance" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "M", "R", "R" ]
Intensity based affine registration including feature similarity for spatial normalization
This paper presents a new spatial normalization with affine transformation. The quantitative comparison of brain architecture across different subjects requires a common coordinate system. For the analysis of a specific brain area, it is necessary to normalize and compare a region of interest and the global brain. The intensity based registration method matches the global brain well, but a region of interest may not be locally normalized compared to the feature based method. The method in this paper uses feature similarities of local regions as well as intensity similarities. The lateral ventricle and central gray nuclei of the brain, including the corpus callosum, which is used for features in schizophrenia detection, are appropriately normalized. Our method reduces the difference of feature areas such as the corpus callosum (7.7%, 2.4%) and lateral ventricle (8.2%, 13.5%) compared with mutual information and Talairach methods
[ "intensity based affine registration", "feature similarity", "spatial normalization", "affine transformation", "brain architecture", "common coordinate system", "region of interest", "global brain", "lateral ventricle", "central gray nuclei", "corpus callosum", "schizophrenia detection", "Talairach method", "feature similarities", "mutual information method" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
Digital stochastic realization of complex analog controllers
Stochastic logic is based on digital processing of a random pulse stream, where the information is codified as the probability of a high level in a finite sequence. This binary pulse sequence can be digitally processed exploiting the similarity between Boolean algebra and statistical algebra. Given a random pulse sequence, any Boolean operation among individual pulses will correspond to an algebraic expression among the variables represented by their respective average pulse rates. Subsequently, this pulse stream can be digitally processed to perform analog operations. In this paper, we propose a stochastic approach to the digital implementation of complex controllers using programmable devices as an alternative to traditional digital signal processors. As an example, a practical realization of nonlinear dissipative controllers for a series resonant converter is presented
[ "digital stochastic realization", "complex analog controllers", "stochastic logic", "random pulse stream", "pulse stream", "finite sequence", "binary pulse sequence", "Boolean algebra", "statistical algebra", "random pulse sequence", "Boolean operation", "average pulse rates", "stochastic approach", "programmable devices", "nonlinear dissipative controllers", "series resonant converter", "parallel resonant DC-to-DC converters", "series resonant DC-to-DC converters" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "M" ]
Comparison of push and pull systems with transporters: a metamodelling approach
Analyses push and pull systems with transportation consideration. A multiproduct, multiline, multistage production system was used to compare the two systems. The effects of four factors (processing time variation, demand variation, transporters, batch size) on throughput rate, average waiting time in the system and machine utilization were studied. The study uses metamodels to compare the two systems. They serve a dual purpose of expressing system performance measures in the form of a simple equation and reducing computational time when comparing the two systems. Research shows that the number of transporters used and the batch size have a significant effect on the performance measures of both systems
[ "pull systems", "transporters", "metamodelling approach", "processing time variation", "demand variation", "batch size", "throughput rate", "average waiting time", "machine utilization", "performance measures", "push systems", "multiproduct multiline multistage production system" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
Cat and class: what use are these skills to the new legal information professional?
This article looks at the cataloguing and classification skills taught on information studies courses and the use these skills are to new legal information professionals. The article is based on the opinions of nine new legal information professionals from both academic and law firm libraries
[ "legal information professional", "cataloguing", "classification", "information studies courses", "law firm libraries", "academic libraries" ]
[ "P", "P", "P", "P", "P", "R" ]
Resolution of a current-mode algorithmic analog-to-digital converter
Errors limiting the resolution of current-mode algorithmic analog-to-digital converters are mainly related to current mirror operation. While systematic errors can be minimized by proper circuit techniques, random sources are unavoidable. In this paper a statistical analysis of the resolution of a typical converter is carried out taking into account process tolerances. To support the analysis, a 4-bit ADC, realized in a 0.35 mu m CMOS technology, was exhaustively simulated. Results were found to be in excellent agreement with theoretical derivations
[ "resolution", "analog-to-digital converters", "circuit techniques", "statistical analysis", "CMOS technology", "current-mode ADC", "algorithmic ADC", "A/D converters", "error analysis", "tolerance analysis", "circuit analysis", "0.35 micron", "4 bit" ]
[ "P", "P", "P", "P", "P", "R", "R", "M", "R", "R", "R", "U", "U" ]
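For context on the converter class analysed above: an algorithmic (cyclic) ADC resolves one bit per cycle by comparing the residue with half the reference and then doubling it. The idealized behavioural model below deliberately ignores the current-mirror errors the paper actually studies:

```python
def algorithmic_adc(vin, vref, bits):
    """Ideal cyclic/algorithmic conversion: one bit per compare-and-double cycle."""
    code = 0
    residue = vin
    for _ in range(bits):
        code <<= 1
        if residue >= vref / 2:            # comparator decision
            code |= 1
            residue = 2 * residue - vref   # subtract reference, then double
        else:
            residue = 2 * residue
    return code

# A 4-bit conversion of 0.6*Vref should give the integer part of 0.6*16 = 9
code = algorithmic_adc(0.6, 1.0, 4)
```

In a current-mode realization each doubling is done by current mirrors, which is why mirror gain errors dominate the resolution analysis.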
Time-integration of multiphase chemistry in size-resolved cloud models
The existence of cloud drops leads to a transfer of chemical species between the gas and aqueous phases. Species concentrations in both phases are modified by chemical reactions and by this phase transfer. The model equations resulting from such multiphase chemical systems are nonlinear, highly coupled and extremely stiff. In the paper we investigate several numerical approaches for treating such processes. The droplets are subdivided into several classes. This decomposition of the droplet spectrum into classes is based either on droplet size or on the amount of scavenged material inside the drops. The very fast dissociations in the aqueous phase chemistry are treated as forward and backward reactions. The aqueous phase and gas phase chemistry, as well as the mass transfer among the different droplet classes and with the gas phase, are integrated in an implicit and coupled manner by the second order BDF method. For this part we apply a modification of the code LSODE with special linear system solvers. These direct sparse techniques exploit the special block structure of the corresponding Jacobian. Furthermore we investigate an approximate matrix factorization which is related to operator splitting at the linear algebra level. The sparse Jacobians are generated explicitly and stored in a sparse form. The efficiency and accuracy of our time-integration schemes are discussed for four multiphase chemistry systems of different complexity and for a different number of droplet classes
[ "multiphase chemistry", "size-resolved cloud models", "cloud drops", "chemical species", "chemical reactions", "multiphase chemical systems", "aqueous phase chemistry", "gas phase chemistry", "approximate matrix factorization", "operator splitting", "linear algebra", "sparse Jacobians", "time-integration schemes", "air pollution modelling" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M" ]
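The stiffness argument in this abstract can be illustrated on a scalar model problem: with a fast relaxation rate, an explicit step is unstable at step sizes where an implicit step (BDF1, i.e. backward Euler) remains well behaved. A toy sketch, not the LSODE-based solver of the paper; the rate constant and step size are arbitrary illustrative values:

```python
import math

k = 1000.0   # fast relaxation rate (the source of stiffness)
h = 0.01     # step size, well above the explicit stability limit 2/k
steps = 200  # integrate y' = -k*(y - cos(t)) up to t = 2

y_exp = 1.0
t = 0.0
for _ in range(steps):
    y_exp = y_exp + h * (-k * (y_exp - math.cos(t)))      # forward Euler
    t += h

y_imp = 1.0
t = 0.0
for _ in range(steps):
    t += h
    # backward Euler (BDF1): solve y_new = y + h*(-k*(y_new - cos(t)))
    y_imp = (y_imp + h * k * math.cos(t)) / (1.0 + h * k)

# backward Euler tracks the slow solution y ~ cos(t); forward Euler explodes
```

The same phenomenon, in coupled matrix form, is why the multiphase systems above are integrated implicitly with sparse linear solvers.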
Resonant controllers for smart structures
In this paper we propose a special type of colocated feedback controller for smart structures. The controller is a parallel combination of high-Q resonant circuits. Each of the resonant circuits is tuned to a pole (or the resonant frequency) of the smart structure. It is proven that the parallel combination of resonant controllers is stable with an infinite gain margin. With the resonant controllers, a single actuator-sensor pair can damp multiple resonant modes. Experimental results are presented to show the robustness of the proposed controller in damping multimode resonances
[ "smart structures", "smart structures", "feedback controller", "high-Q resonant circuits", "resonant frequency", "actuator-sensor", "damping", "multiple resonant modes", "multimode resonances", "smart structure", "laminate beam" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U" ]
Public business libraries: the next chapter
Traces the history of the provision of business information by Leeds Public Libraries, UK, from the opening of the Public Commercial and Technical Library in 1918 to the revolutionary impact of the Internet in the 1990s. Describes how the Library came to terms with the need to integrate the Internet into its mainstream business information services, with particular reference to its limitations and to the provision of company information, market research, British Standards information, press cuttings and articles from specialized trade and scientific journals, and patents information. Focuses on some of the reasons why the public business library is still needed as a service to businesses, even after the introduction of the Internet and considers the Library's changing role and the need to impress on all concerned, especially government, the continuing value of these services. Looks to the partnerships formed by the Library over the years and the ways in which these are expected to assist in realizing future opportunities, in particular, the fact that all public libraries in England gained free Internet access at the end of 2001. Offers some useful ideas about how the Library could develop, noting that SINTO, a Sheffield based information network formed in 1938 and originally a partnership between the public library, the two Sheffield universities and various leading steel companies of the time, is being examined as a model for future services in Leeds. Concludes that the way forward can be defined in terms of five actions: redefinition of priorities; marketing; budgets; resources; and the use of information technology (IT)
[ "public business libraries", "history", "Leeds Public Libraries", "Public Commercial and Technical Library", "Internet", "business information services", "company information", "market research", "marketing", "British Standards information", "press cuttings", "patents information", "government", "SINTO", "information network", "Sheffield universities", "steel companies", "budgets", "resources", "trade journal articles", "scientific journal articles", "priority redefinition", "IT use" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "R" ]
Process pioneers [agile business]
By managing IT infrastructures along so-called 'top down' lines, organisations can streamline their business processes, eliminate redundant tasks and increase automation
[ "agile business", "managing IT infrastructures", "business processes", "increase automation" ]
[ "P", "P", "P", "P" ]
Text-independent speaker verification using utterance level scoring and covariance modeling
This paper describes a computationally simple method to perform text independent speaker verification using second order statistics. The suggested method, called utterance level scoring (ULS), allows one to obtain a normalized score using a single pass through the frames of the tested utterance. The utterance sample covariance is first calculated and then compared to the speaker covariance using a distortion measure. Subsequently, a distortion measure between the utterance covariance and the sample covariance of data taken from different speakers is used to normalize the score. Experimental results from the 2000 NIST speaker recognition evaluation are presented for ULS, used with different distortion measures, and for a Gaussian mixture model (GMM) system. The results indicate that ULS is a viable alternative to GMM whenever computational complexity and verification accuracy need to be traded off
[ "text-independent speaker verification", "utterance level scoring", "covariance modeling", "computationally simple method", "second order statistics", "normalized score", "sample covariance", "speaker covariance", "distortion measure", "distortion measure", "NIST speaker recognition evaluation", "Gaussian mixture model", "GMM", "computational complexity", "verification accuracy", "distortion measures" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
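The covariance-comparison idea above can be made concrete with one classical second-order distortion measure from the speaker-recognition literature, the arithmetic-harmonic sphericity; the paper does not specify its measures here, so this is only an illustrative choice. A 2-D pure-Python sketch with made-up feature frames:

```python
import math

def sample_cov(frames):
    """Sample covariance of a list of (x, y) feature frames."""
    n = len(frames)
    mx = sum(f[0] for f in frames) / n
    my = sum(f[1] for f in frames) / n
    cxx = sum((f[0] - mx) ** 2 for f in frames) / n
    cyy = sum((f[1] - my) ** 2 for f in frames) / n
    cxy = sum((f[0] - mx) * (f[1] - my) for f in frames) / n
    return [[cxx, cxy], [cxy, cyy]]

def inv2(c):
    """Inverse of a 2x2 matrix."""
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    return [[c[1][1] / det, -c[0][1] / det],
            [-c[1][0] / det, c[0][0] / det]]

def trace_prod(a, b):
    """trace(a @ b) for 2x2 matrices."""
    return sum(a[i][j] * b[j][i] for i in range(2) for j in range(2))

def ahs(c1, c2):
    """Arithmetic-harmonic sphericity: >= 0 for SPD inputs, 0 iff c1 is
    proportional to c2 (it compares eigenvalue means of c1*inv(c2))."""
    d = 2.0
    return math.log(trace_prod(c1, inv2(c2)) * trace_prod(c2, inv2(c1)) / d ** 2)

frames = [(1.0, 0.2), (0.4, -0.5), (-0.8, 0.9),
          (1.5, -0.3), (-0.2, 0.7), (0.1, -0.9)]
c1 = sample_cov(frames)              # "utterance" covariance
c2 = [[1.0, 0.0], [0.0, 1.0]]        # a reference "speaker" covariance
same = ahs(c1, c1)                   # identical covariances score 0
diff = ahs(c1, c2)                   # mismatched covariances score > 0
```

The single pass over frames mentioned in the abstract corresponds to accumulating the sums inside `sample_cov`.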
Deterministic calculations of photon spectra for clinical accelerator targets
A method is proposed to compute photon energy spectra produced in clinical electron accelerator targets, based on the deterministic solution of the Boltzmann equation for coupled electron-photon transport in one-dimensional (1-D) slab geometry. It is shown that the deterministic method gives results similar to Monte Carlo calculations over the angular range of interest for therapy applications. Relative energy spectra computed by deterministic and 3-D Monte Carlo methods, respectively, are compared for several realistic target materials and different electron beams, and are found to give similar photon energy distributions and mean energies. The deterministic calculations typically require 1-2 min of execution time on a Sun workstation, compared to 2-36 h for the Monte Carlo runs
[ "deterministic calculations", "photon energy spectra", "clinical electron accelerator targets", "Boltzmann equation", "coupled electron-photon transport", "angular range of interest", "therapy applications", "relative energy spectra", "3-D Monte Carlo methods", "one-dimensional slab geometry", "linear accelerator", "therapy planning", "integrodifferential equation", "pencil beam source representations" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "M", "M", "M" ]
Is open source more or less secure?
Networks dominate today's computing landscape and commercial technical protection is lagging behind attack technology. As a result, protection programme success depends more on prudent management decisions than on the selection of technical safeguards. The paper takes a management view of protection and seeks to reconcile the need for security with the limitations of technology
[ "commercial technical protection", "attack technology", "management", "open source software security", "computer networks", "data security" ]
[ "P", "P", "P", "M", "R", "M" ]
Simulation of cardiovascular physiology: the diastolic function(s) of the heart
The cardiovascular system was simulated by using an equivalent electronic circuit. Four sets of simulations were performed. The basic variables investigated were cardiac output and stroke volume. They were studied as functions (i) of right ventricular capacitance and negative intrathoracic pressure; (ii) of left ventricular relaxation and of heart rate; and (iii) of left ventricle failure. It seems that a satisfactory simulation of systolic and diastolic functions of the heart is possible. Presented simulations improve our understanding of the role of the capacitance of both ventricles and of the diastolic relaxation in cardiovascular physiology
[ "simulation", "cardiovascular physiology", "diastolic function", "heart", "equivalent electronic circuit", "cardiac output", "stroke volume", "right ventricular capacitance", "negative intrathoracic pressure", "left ventricular relaxation", "heart rate", "left ventricle failure", "diastolic relaxation", "systolic functions" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
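The equivalent-circuit approach in this abstract can be sketched with the simplest such model, the two-element Windkessel, in which a capacitance C (vessel/ventricular compliance) and a resistance R carry a pulsatile inflow. This is far cruder than the four-set, multi-chamber simulation of the paper; R, C and the inflow waveform below are arbitrary illustrative values:

```python
import math

def windkessel(R, C, q, t_end=20.0, dt=0.001):
    """Forward-Euler integration of the 2-element Windkessel:
    C * dP/dt = q(t) - P/R  (inflow q charges C, R drains it)."""
    p = 0.0
    history = []
    for i in range(int(t_end / dt)):
        t = i * dt
        p += dt * (q(t) - p / R) / C
        history.append(p)
    return history

def q(t):
    # pulsatile inflow with period 1 and mean 1 (arbitrary units)
    return 1.0 + 0.5 * math.sin(2.0 * math.pi * t)

R, C = 1.0, 1.0
p = windkessel(R, C, q)
# after the transient (time constant R*C), mean pressure -> R * mean inflow = 1
mean_p = sum(p[-1000:]) / 1000.0
```

Increasing C smooths the pressure ripple without changing its mean, which is the capacitive (diastolic) role the simulations above investigate.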
A comparison of high-power converter topologies for the implementation of FACTS controllers
This paper compares four power converter topologies for the implementation of flexible AC transmission system (FACTS) controllers: three multilevel topologies (multipoint clamped (MPC), chain, and nested cell) and the well-established multipulse topology. In keeping with the need to implement very-high-power inverters, switching frequency is restricted to line frequency. The study addresses device count, DC filter ratings, restrictions on voltage control, active power transfer through the DC link, and balancing of DC-link voltages. Emphasis is placed on capacitor sizing because of its impact on the cost and size of the FACTS controller. A method for dimensioning the DC capacitor filter is presented. It is found that the chain converter is attractive for the implementation of a static compensator or a static synchronous series compensator. The MPC converter is attractive for the implementation of a unified power flow controller or an interline power flow controller, but a special arrangement is required to overcome the limitations on voltage control
[ "FACTS controllers", "multilevel topologies", "multipulse topology", "inverters", "switching frequency", "device count", "DC filter ratings", "static compensator", "static synchronous series compensator", "unified power flow controller", "high-power converter topologies comparison", "multipoint clamped topology", "STATCOM", "UPFC" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "U", "U" ]
Work sequencing in a manufacturing cell with limited labour constraints
This study focuses on the analysis of group scheduling heuristics in a dual-constrained, automated manufacturing cell, where labour utilization is limited to setups, tear-downs and loads/unloads. This scenario is realistic in today's automated manufacturing cells. The results indicate that policies for allocating labour to tasks have very little impact in such an environment. Furthermore, the performance of efficiency oriented, exhaustive, group scheduling heuristics deteriorated while the performance of the more complex, non-exhaustive heuristics improved. Thus, it is recommended that production managers use the simplest labour scheduling policy and instead focus their efforts on activities such as job scheduling and production planning in such environments
[ "work sequencing", "manufacturing cell", "limited labour constraints", "group scheduling heuristics", "automated manufacturing cells", "job scheduling", "production planning", "dual-constrained automated manufacturing cell", "labour allocation policies", "efficiency oriented exhaustive group scheduling heuristics", "nonexhaustive heuristics" ]
[ "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "M" ]
Stability in the numerical solution of the heat equation with nonlocal boundary conditions
This paper deals with numerical methods for the solution of the heat equation with integral boundary conditions. Finite differences are used for the discretization in space. The matrices specifying the resulting semidiscrete problem are proved to satisfy a sectorial resolvent condition, uniformly with respect to the discretization parameter. Using this resolvent condition, unconditional stability is proved for the fully discrete numerical process generated by applying A(theta)-stable one-step methods to the semidiscrete problem. This stability result is established in the maximum norm; it improves some previous results in the literature in that it is not subject to various unnatural restrictions which were imposed on the boundary conditions and on the one-step methods
[ "stability", "numerical solution", "heat equation", "nonlocal boundary conditions", "integral boundary conditions", "finite differences", "matrices", "semidiscrete problem", "sectorial resolvent condition", "fully discrete numerical process", "one-step methods", "maximum norm", "space discretization" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
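The simplest one-step methods covered by stability results of this kind are the theta-methods. The sketch below applies one to the semidiscrete 1-D heat equation with plain zero Dirichlet conditions (the nonlocal integral boundary conditions of the paper are not reproduced), showing that with theta = 1 (backward Euler) the maximum norm decays even for a time step far beyond the explicit limit:

```python
def solve_tridiag(sub, diag, sup, rhs):
    """Thomas algorithm for a tridiagonal linear system."""
    n = len(rhs)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = sup[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def theta_step(u, lam, theta):
    """One theta-method step for u_t = u_xx on interior grid points with zero
    Dirichlet BCs; lam = dt/dx^2, theta = 1 is backward Euler."""
    n = len(u)
    rhs = []
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        # (I - (1-theta)*lam*A) u with A = tridiag(-1, 2, -1)
        rhs.append(u[i] - (1.0 - theta) * lam * (2.0 * u[i] - left - right))
    # solve (I + theta*lam*A) u_new = rhs
    sub = [-theta * lam] * n
    sup = [-theta * lam] * n
    diag = [1.0 + 2.0 * theta * lam] * n
    return solve_tridiag(sub, diag, sup, rhs)

# lam = 10 is far beyond the explicit stability limit lam <= 1/2
u = [1.0] * 9
for _ in range(20):
    u = theta_step(u, 10.0, 1.0)
unorm = max(abs(v) for v in u)
```

With integral boundary conditions the matrix loses this simple tridiagonal form, which is what makes the paper's resolvent-based analysis necessary.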
Santera targets independents in major strategy overhaul [telecom]
With big carriers slashing capital expense budgets, Santera Systems is broadening the reach of its next-generation switching platform to include independent telcos. This week, the vendor will announce that it has signed a deal with Kerman, Calif.-based Kerman Telephone Co. Furthermore, the company is angling for inclusion in the Rural Utilities Service's approved equipment list, hoping to sell its Class 5 replacement boxes to the smallest carriers. The move is almost a complete reversal for the Plano, Texas-based vendor, which previously focused solely on large carriers, including the RBOCs
[ "Santera Systems", "switching", "Kerman Telephone", "Rural Utilities Service" ]
[ "P", "P", "P", "P" ]
Establishing the discipline of physics-based CMP modeling
For the past decade, a physically based comprehensive process model for chemical mechanical polishing has eluded the semiconductor industry. However, a long-term collaborative effort has now resulted in a workable version of that approach. The highly fundamental model is based on advanced finite element analysis and is beginning to show promise in CMP process development
[ "CMP", "chemical mechanical polishing", "finite element analysis", "CMP process development", "physically based process model" ]
[ "P", "P", "P", "P", "R" ]
Eliminating recency with self-review: the case of auditors' 'going concern' judgments
This paper examines the use of self-review to debias recency. Recency is found in the 'going concern' judgments of staff auditors, but is successfully eliminated by the auditor's use of a simple self-review technique that would be extremely easy to implement in audit practice. Auditors who self-review are also less inclined to make audit report choices that are inconsistent with their going concern judgments. These results are important because the judgments of staff auditors often determine the type and extent of documentation in audit workpapers and serve as preliminary inputs for senior auditors' judgments and choices. If staff auditors' judgments are affected by recency, the impact of this bias may be impounded in the ultimate judgments and choices of senior auditors. Since biased judgments can expose auditors to significant costs involving extended audit procedures, legal liability and diminished reputation, simple debiasing techniques that reduce this exposure are valuable. The paper also explores some future research needs and other important issues concerning judgment debiasing in applied professional settings
[ "self-review", "staff auditors", "audit report choices", "documentation", "audit workpapers", "senior auditors", "extended audit procedures", "legal liability", "diminished reputation", "judgment debiasing", "applied professional settings", "auditor going concern judgments", "recency debiasing", "accountability", "probability judgments" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "U", "M" ]
A scalable intelligent takeoff controller for a simulated running jointed leg
Running with jointed legs poses a difficult control problem in robotics. Neural controllers are attractive because they allow the robot to adapt to changing environmental conditions. However, scalability is an issue with many neural controllers. The paper describes the development of a scalable neurofuzzy controller for the takeoff phase of the running stride. Scalability is achieved by selecting a controller whose size does not grow with the dimensionality of the problem. Empirical results show that with proper design the takeoff controller scales from a leg with a single movable link to one with three movable links without a corresponding growth in size and without a loss of accuracy
[ "scalable intelligent takeoff controller", "scalability", "simulated running jointed leg", "neural controllers", "changing environmental conditions", "scalable neurofuzzy controller", "takeoff phase", "running stride", "intelligent robotic control" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
Perceptual audio coding using adaptive pre- and post-filters and lossless compression
This paper proposes a versatile perceptual audio coding method that achieves high compression ratios and is capable of low encoding/decoding delay. It accommodates a variety of source signals (including both music and speech) with different sampling rates. It is based on separating irrelevance and redundancy reductions into independent functional units. This contrasts traditional audio coding where both are integrated within the same subband decomposition. The separation allows for the independent optimization of the irrelevance and redundancy reduction units. For both reductions, we rely on adaptive filtering and predictive coding as much as possible to minimize the delay. A psycho-acoustically controlled adaptive linear filter is used for the irrelevance reduction, and the redundancy reduction is carried out by a predictive lossless coding scheme, which is termed weighted cascaded least mean squared (WCLMS) method. Experiments are carried out on a database of moderate size which contains mono-signals of different sampling rates and varying nature (music, speech, or mixed). They show that the proposed WCLMS lossless coder outperforms other competing lossless coders in terms of compression ratios and delay, as applied to the pre-filtered signal. Moreover, a subjective listening test of the combined pre-filter/lossless coder and a state-of-the-art perceptual audio coder (PAC) shows that the new method achieves a comparable compression ratio and audio quality with a lower delay
[ "perceptual audio coding", "lossless compression", "high compression ratio", "low encoding/decoding delay", "source signals", "music", "sampling rates", "redundancy reduction", "adaptive filtering", "predictive coding", "psycho-acoustically controlled adaptive linear filter", "irrelevance reduction", "predictive lossless coding", "weighted cascaded least mean squared", "WCLMS lossless coder", "subjective listening test", "pre-filter/lossless coder", "audio quality", "adaptive pre-filters", "adaptive post-filters" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
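The redundancy-reduction stage above relies on adaptive linear prediction; the WCLMS details are beyond a short sketch, but a plain normalized-LMS predictor already shows how prediction shrinks the residual a lossless coder must transmit. This is only an illustrative stand-in, not the paper's coder, and the test signal and parameters are arbitrary:

```python
import math

def nlms_residuals(signal, order=4, mu=0.5):
    """Normalized-LMS one-step-ahead prediction; returns prediction errors."""
    w = [0.0] * order
    errors = []
    for n in range(order, len(signal)):
        past = signal[n - order:n][::-1]          # most recent sample first
        pred = sum(wi * xi for wi, xi in zip(w, past))
        e = signal[n] - pred
        norm = sum(xi * xi for xi in past) + 1e-9  # regularized input power
        w = [wi + (mu / norm) * e * xi for wi, xi in zip(w, past)]
        errors.append(e)
    return errors

x = [math.sin(2 * math.pi * n / 20) for n in range(2000)]
e = nlms_residuals(x)
residual_energy = sum(v * v for v in e)
signal_energy = sum(v * v for v in x[4:])
# for a predictable signal, the residual carries far less energy than the input
```

Entropy-coding the small residuals instead of the raw samples is what yields the lossless compression gain.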
A unified view for vector rotational CORDIC algorithms and architectures based on angle quantization approach
Vector rotation is the key operation employed extensively in many digital signal processing applications. In this paper, we introduce a new design concept called Angle Quantization (AQ). It can be used as a design index for vector rotational operation, where the rotational angle is known in advance. Based on the AQ process, we establish a unified design framework for cost-effective low-latency rotational algorithms and architectures. Several existing works, such as the conventional COordinate Rotation DIgital Computer (CORDIC), AR-CORDIC, MVR-CORDIC, and EEAS-based CORDIC, can be fitted into the design framework, forming a Vector Rotational CORDIC Family. Moreover, we present four searching algorithms to solve the optimization problem encountered in the proposed vector rotational CORDIC family. The corresponding scaling operations of the CORDIC family are also discussed. Based on the new design framework, we can realize high-speed/low-complexity rotational VLSI circuits without degrading the precision performance in fixed-point implementations
[ "vector rotational CORDIC algorithms", "angle quantization", "digital signal processing applications", "design index", "vector rotational operation", "unified design framework", "low-latency rotational algorithms", "searching algorithms", "optimization problem", "scaling operations", "low-complexity rotational VLSI circuits", "fixed-point implementations", "DSP applications", "greedy searching algorithm", "low-latency rotational architectures", "high-speed rotational VLSI circuits", "trellis-based searching algorithm" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "M", "R", "R", "M" ]
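For reference, the conventional CORDIC that anchors the family above rotates a vector through a sum of elementary angles arctan(2^-i), using only shifts and adds, with a constant scale factor compensated once at the end. A floating-point sketch (real implementations are fixed-point, as the abstract notes):

```python
import math

def cordic_rotate(x, y, angle, n=32):
    """Circular CORDIC in rotation mode; |angle| must be within the
    convergence range, about 1.743 rad (the sum of all elementary angles)."""
    z = angle
    for i in range(n):
        d = 1.0 if z >= 0.0 else -1.0          # steer residual angle to zero
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * math.atan(2.0 ** -i)
    # constant CORDIC gain prod(sqrt(1 + 2^-2i)), compensated at the end
    gain = 1.0
    for i in range(n):
        gain *= math.sqrt(1.0 + 2.0 ** (-2 * i))
    return x / gain, y / gain

cx, cy = cordic_rotate(1.0, 0.0, math.pi / 6)  # expect (cos 30deg, sin 30deg)
```

The AQ-based variants in the paper reduce the number of such elementary rotations when the target angle is known in advance.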
Homogenization in L/sup infinity /
Homogenization of deterministic control problems with L/sup infinity / running cost is studied by viscosity solutions techniques. It is proved that the value function of an L/sup infinity / problem in a medium with a periodic micro-structure converges uniformly on the compact sets to the value function of the homogenized problem as the period shrinks to 0. Our main convergence result extends that of Ishii (Stochastic Analysis, control, optimization and applications, pp. 305-324, Birkhauser Boston, Boston, MA, 1999.) to the case of a discontinuous Hamiltonian. The cell problem is solved, but, as nonuniqueness occurs, the effective Hamiltonian must be selected in a careful way. The paper also provides a representation formula for the effective Hamiltonian and gives illustrations to calculus of variations, averaging and one-dimensional problems
[ "homogenization", "deterministic control", "L/sup infinity / running cost", "value function", "convergence", "cell problem", "calculus of variations", "averaging", "optimal control" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
Induced-shear piezoelectric actuators for rotor blade trailing edge flaps
Much of the current rotorcraft research is focused on improving performance by reducing unwanted helicopter noise and vibration. One of the most promising active rotorcraft vibration control systems is an active trailing edge flap. In this paper, an induced-shear piezoelectric tube actuator is used in conjunction with a simple lever-cusp hinge amplification device to generate a useful combination of trailing edge flap deflections and hinge moments. A finite-element model of the actuator tube and trailing edge flap (including aerodynamic and inertial loading) was used to guide the design of the actuator-flap system. A full-scale induced shear tube actuator flap system was fabricated and bench top testing was conducted to validate the analysis. Hinge moments corresponding to various rotor speeds were applied to the actuator using mechanical springs. The testing demonstrated that for an applied electric field of 3 kV cm/sup -1/ the tube actuator deflected a representative full-scale 12 inch flap +or-2.8 degrees at 0 rpm and +or-1.4 degrees for a hinge moment simulating a 400 rpm condition. The per cent error between the predicted and experimental full-scale flap deflections ranged from 4% (low rpm) to 12.5% (large rpm). Increasing the electric field to 4 kV cm/sup -1/ results in +or-2.5 degrees flap deflection at a rotation speed of 400 rpm, according to the design analysis. A trade study was conducted to compare the performance of the piezoelectric tube actuator to the state of the art in trailing edge flap actuators and indicated that the induced-shear tube actuator shows promise as a trailing edge flap actuator
[ "rotorcraft", "helicopter noise", "vibration control", "active trailing edge flap", "piezoelectric tube actuator", "lever-cusp hinge amplification device", "finite-element model", "inertial loading", "design", "shear tube actuator flap", "bench top testing", "12 inch flap", "12 inch", "induced-shear tube actuator", "helicopter vibration", "aerodynamic loading" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
Techniques for compiling and implementing all NAS parallel benchmarks in HPF
The NAS parallel benchmarks (NPB) are a well-known benchmark set for high-performance machines. Much effort has been made to implement them in High-Performance Fortran (HPF). In previous attempts, however, the HPF versions did not include the complete set of benchmarks, and the performance was not always good. In this study, we implement all eight benchmarks of the NPB in HPF, and parallelize them using an HPF compiler that we have developed. This report describes the implementation techniques and compiler features necessary to achieve good performance. We evaluate the HPF version on the Hitachi SR2201, a distributed-memory parallel machine. With 16 processors, the execution time of the HPF version is within a factor of 1.5 of the hand-parallelized version of the NPB 2.3 beta
[ "compiler", "NAS parallel benchmarks", "high-performance machines", "HPF compiler", "distributed-memory parallel supercomputers" ]
[ "P", "P", "P", "P", "M" ]
Dynamic modification of object Petri nets. An application to modelling protocols with fork-join structures
In this paper we discuss possibilities of modelling protocols by objects in object-based high-level Petri nets. Some advantages of dynamically modifying the structure of token objects are discussed and the need for further investigations into mathematically rigorous foundations of object net formalisms incorporating facilities for such operations on its token nets is emphasised
[ "dynamic modification", "object Petri nets", "protocols", "fork-join structures", "token objects", "mathematically rigorous foundations", "object net formalisms" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
Source/channel coding of still images using lapped transforms and block classification
A novel scheme for joint source/channel coding of still images is proposed. By using efficient lapped transforms, channel-optimised robust quantisers and classification methods it is shown that significant improvements over traditional source/channel coding of images can be obtained while keeping the complexity low
[ "still images", "lapped transforms", "block classification", "channel-optimised robust quantisers", "joint source-channel coding", "image coding", "low complexity" ]
[ "P", "P", "P", "P", "M", "R", "R" ]
Influence of the process design on the control strategy: application in electropneumatic field
This article proposes an example of an electropneumatic system where the architecture of the process is modified with respect to both the specifications for position and velocity tracking and a criterion concerning the energy consumption. Experimental results are compared and analyzed using an industrial bench test. For this purpose, a complete model of the system is presented, and two kinds of nonlinear control laws are developed, a monovariable and a multivariable type, based on flatness theory
[ "electropneumatic systems", "tracking", "energy consumption", "nonlinear control", "flatness theory", "positioning systems", "position control", "monovariable control", "multivariable control", "velocity control" ]
[ "P", "P", "P", "P", "P", "R", "R", "R", "R", "R" ]
An analytic center cutting plane method for semidefinite feasibility problems
Semidefinite feasibility problems arise in many areas of operations research. The abstract form of these problems can be described as finding a point in a nonempty bounded convex body Gamma in the cone of symmetric positive semidefinite matrices. Assume that Gamma is defined by an oracle, which for any given m * m symmetric positive semidefinite matrix Y0 either confirms that Y0 belongs to Gamma or returns a cut, i.e., a symmetric matrix A such that Gamma is contained in the half-space {Y : A . Y <or= A . Y0}. We study an analytic center cutting plane algorithm for this problem. At each iteration, the algorithm computes an approximate analytic center of a working set defined by the cutting plane system generated in the previous iterations. If this approximate analytic center is a solution, then the algorithm terminates; otherwise the new cutting plane returned by the oracle is added into the system. As the number of iterations increases, the working set shrinks and the algorithm eventually finds a solution to the problem. All iterates generated by the algorithm are positive definite matrices. The algorithm has a worst-case complexity of O*(m/sup 3// epsilon /sup 2/) on the total number of cuts to be used, where epsilon is the maximum radius of a ball contained in Gamma
[ "analytic center cutting plane method", "semidefinite feasibility problems", "operations research", "nonempty bounded convex body", "symmetric positive semidefinite matrices", "oracle", "iteration", "approximate analytic center", "working set", "worst-case complexity", "maximum ball radius" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
A summary of methods applied to tool condition monitoring in drilling
Presents a summary of the monitoring methods, signal analysis and diagnostic techniques for tool wear and failure monitoring in drilling that have been tested and reported in the literature. The paper covers only indirect monitoring methods such as force, vibration and current measurements. Signal analysis techniques cover all the methods that have been used with indirect measurements including e.g. statistical parameters and Fast Fourier and Wavelet Transform. Only a limited number of automatic diagnostic tools have been developed for diagnosis of the condition of the tool in drilling. All of these rather diverse approaches that have been available are covered in this study. Only in a few of the papers have attempts been made to compare the chosen approach with other methods. Many of the papers only present one approach and unfortunately quite often the test material of the study is limited especially in what comes to the cutting process parameter variation and also workpiece material
[ "tool condition monitoring", "drilling", "monitoring methods", "signal analysis", "diagnostic techniques", "tool wear", "failure monitoring", "indirect monitoring methods", "current measurements", "statistical parameters", "wavelet transform", "automatic diagnostic tools", "force measurements", "vibration measurements", "fast Fourier transform" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
Sensing and control of double-sided arc welding process
The welding industry is driven to improve productivity without sacrificing quality. For thick material welding, the current practice is to use backing or multiple passes. The laser welding process, capable of achieving deep narrow penetration, can significantly improve welding productivity for such applications by reducing the number of passes. However, its competitiveness in comparison with traditional arc welding is weakened by its high cost, strict fit-up requirement, and difficulty in welding large structures. In this work, a different method, referred to as double-sided arc welding (DSAW) is developed to improve the arc concentration for arc welding. A sensing and control system is developed to achieve deep narrow penetration under variations in welding conditions. Experiments verified that the pulsed keyhole DSAW system developed is capable of achieving deep narrow penetration on a 1/2 inch thick square butt joint in a single pass
[ "double-sided arc welding", "thick material welding", "laser welding process", "control system", "process control", "energy density", "controlled pulse keyhole" ]
[ "P", "P", "P", "P", "R", "U", "R" ]
SBC gets more serious on regulatory compliance
With one eye on the past and the other on its future, SBC Communications last week created a unit it hopes will bring a cohesiveness and efficiency to its regulatory compliance efforts that previously had been lacking. The carrier also hopes the new regulatory compliance unit will help it accomplish its short-term goal of landing FCC approval. to provide long-distance service throughout its region, and its longer-term, goal of reducing the regulatory burdens under which it and currently operate
[ "regulatory compliance", "SBC Communications", "telecom carrier" ]
[ "P", "P", "M" ]
Computer program for calculating the p-value in testing process capability index C/sub pmk/
Many process capability indices, including C/sub p/, C/sub pk/, and C/sub pm/, have been proposed to provide numerical measures on the process potential and performance. Combining the advantages of these indices, Pearn et al. (1992) introduced a new capability index called C/sub pmk/, which has been shown to be a useful capability index for processes with two-sided specification limits. In this paper, the authors implement the theory of a testing hypothesis using the natural estimator of C/sub pmk/, and provide an efficient Maple computer program to calculate the p-values. They also provide tables of the critical values for some commonly used capability requirements. Based on the test, they develop a simple step-by-step procedure for in-plant applications. The practitioners can use the proposed procedure to determine whether their process meets the preset capability requirement, and make reliable decisions
[ "computer program", "testing process capability index", "process potential", "testing hypothesis", "natural estimator", "Maple", "in-plant applications", "preset capability requirement", "reliable decisions", "process performance", "p-value calculation" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
New kit on the block [IT upgrades]
As time passes, new hardware and software replace the old. The hows are straightforward: IT resellers and consultants can help with upgrade practicalities. Will Dalrymple examines the business issues and costs involved in IT upgrades
[ "IT upgrades", "IT resellers", "consultants", "business issues", "costs", "Microsoft" ]
[ "P", "P", "P", "P", "P", "U" ]
Evaluating alternative manufacturing control strategies using a benchmark system
This paper describes an investigation of the effects of dynamic job routing and job sequencing decisions on the performance of a distributed control system and its adaptability against disturbances. This experimental work was carried out to compare the performance of alternative control strategies in various manufacturing environments and to investigate the relationship between the 'control' and 'controlled' systems. The experimental test-bed presented in this paper consists of an agent-based control system (implemented in C++) and a discrete-event simulation model. Using this test-bed, various control strategies were tested on a benchmark manufacturing system by varying production volumes (to model the production system with looser/tighter schedules) and disturbance frequencies. It was found that hybrid strategies that combine reactive agent mechanisms (and allocation strategies such as the contract net) with appropriate job sequencing heuristics provide the best performance, particularly when job congestion increases on a shop-floor
[ "alternative manufacturing control strategies", "benchmark system", "dynamic job routing", "job sequencing decisions", "distributed control system", "experimental test-bed", "agent-based control system", "discrete-event simulation model", "benchmark manufacturing system", "production volumes", "disturbance frequencies", "hybrid strategies", "reactive agent mechanisms", "allocation strategies", "contract net", "job congestion", "disturbance adaptability" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
International swinging: making Swing components locale-sensitive
Although Java and its GUI library Swing provide software developers with a highly customizable framework for creating truly "international" applications, the Swing library is not sensitive to locale switches: it cannot automatically change an application's appearance to conform to the conventions of a specific locale at run time. Several types of applications benefit from the ability to easily switch the language at run time. Training applications and other programs that run on computers in public spaces (such as libraries, airports, or government offices) may need to support multiple languages. Other applications (like travel dictionaries or translation programs) are inherently multilingual and are specifically designed to support users of dissimilar tongues. Such applications would greatly benefit if the user-interface language could be customized at run time. The article shows you how to customize Swing to support locale switching at run time. The author has created a new look-and-feel called the MLMetalLookandFeel (where ML stands for multilingual). This new look-and-feel extends the standard Metal look-and-feel but is locale-sensitive at run time
[ "Java", "GUI library", "Swing library", "locale switching", "travel dictionaries", "translation programs", "user-interface language", "MLMetalLookandFeel" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
Web talk is cheap
Web technology provides a wealth of opportunities for reaching potential customers. So how do you make it work for your business?
[ "website", "web chat", "collaborative browsing", "customer service representative", "call centre" ]
[ "U", "M", "U", "M", "U" ]
A framework for image deblurring using wavelet packet bases
We show that the average over translations of an operator diagonal in a wavelet packet basis is a convolution. We also show that an operator diagonal in a wavelet packet basis can be decomposed into several operators of the same kind, each of them being better conditioned. We investigate the possibility of using such a convolution to approximate a given convolution (in practice an image blur). Then we use these approximations to deblur images. First, we show that this framework permits us to redefine existing deblurring methods. Then, we show that it permits us to define a new variational method which combines the wavelet packet and the total variation approaches. We argue and show by experiments that this permits us to avoid the drawbacks of both approaches which are, respectively, ringing and staircasing
[ "image deblurring", "wavelet packet bases", "operator diagonal", "convolution", "total variation approach", "ringing", "staircasing", "deconvolution" ]
[ "P", "P", "P", "P", "P", "P", "P", "U" ]
Unlocking the clubhouse: the Carnegie Mellon experience
In the fall of 1995, just seven of 95 students entering the undergraduate program in computer science at Carnegie Mellon University were women. In 2000, 54 of 130, or 42%, were women. What happened? This article presents a brief history of the transformation at Carnegie Mellon's School of Computer Science, and the research project that lay behind it
[ "students", "undergraduate program", "Carnegie Mellon University", "women", "history", "research project", "computer science education", "gender issues" ]
[ "P", "P", "P", "P", "P", "P", "M", "U" ]
Efficient cellular automata based versatile multiplier for GF(2/sup m/)
In this paper, a low-complexity programmable cellular automata (PCA) based versatile modular multiplier in GF(2/sup m/) is presented. The proposed versatile multiplier increases flexibility by using the same multiplier in different security environments, and it reduces the user's cost. Moreover, the multiplier can be easily extended to high order of m for more security, and low-cost serial implementation is feasible in restricted computing environments, such as smart cards and wireless devices
[ "cellular automata based versatile multiplier", "low-complexity programmable cellular automata", "security environments", "restricted computing environments", "smart cards", "wireless devices" ]
[ "P", "P", "P", "P", "P", "P" ]
An attack-finding algorithm for security protocols
This paper proposes an automatic attack construction algorithm in order to find potential attacks on security protocols. It is based on a dynamic strand space model, which enhances the original strand space model by introducing active nodes on strands so as to characterize the dynamic procedure of protocol execution. With exact causal dependency relations between messages considered in the model, this algorithm can avoid state space explosion caused by asynchronous composition. In order to get a finite state space, a new method called strand-added on demand is exploited, which extends a bundle in an incremental manner without requiring explicit configuration of protocol execution parameters. A finer granularity model of term structure is also introduced, in which subterms are divided into check subterms and data subterms. Moreover, data subterms can be further classified based on the compatible data subterm relation to obtain automatically the finite set of valid acceptable terms for an honest principal. In this algorithm, terms core is designed to represent the intruder's knowledge compactly, and forward search technology is used to simulate attack patterns easily. Using this algorithm, a new attack on the Dolve-Yao protocol can be found, which is even more harmful because the secret is revealed before the session terminates
[ "attack-finding algorithm", "security protocols", "dynamic strand space model", "strand space model", "state space explosion", "asynchronous composition", "strand-added on demand", "check subterms", "data subterms", "Dolve-Yao protocol" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
Incorporating multi-leaf collimator leaf sequencing into iterative IMRT optimization
Intensity modulated radiation therapy (IMRT) treatment planning typically considers beam optimization and beam delivery as separate tasks. Following optimization, a multi-leaf collimator (MLC) or other beam delivery device is used to generate fluence patterns for patient treatment delivery. Due to limitations and characteristics of the MLC, the deliverable intensity distributions often differ from those produced by the optimizer, leading to differences between the delivered and the optimized doses. Objective function parameters are then adjusted empirically, and the plan is reoptimized to achieve a desired deliverable dose distribution. The resulting plan, though usually acceptable, may not be the best achievable. A method has been developed to incorporate the MLC restrictions into the optimization process. Our in-house IMRT system has been modified to include the calculation of the deliverable intensity into the optimizer. In this process, prior to dose calculation, the MLC leaf sequencer is used to convert intensities to dynamic MLC sequences, from which the deliverable intensities are then determined. All other optimization steps remain the same. To evaluate the effectiveness of deliverable-based optimization, 17 patient cases have been studied. Compared with standard optimization plus conversion to deliverable beams, deliverable-based optimization results show improved isodose coverage and a reduced dose to critical structures. Deliverable-based optimization results are close to the original nondeliverable optimization results, suggesting that IMRT can overcome the MLC limitations by adjusting individual beamlets. The use of deliverable-based optimization may reduce the need for empirical adjustment of objective function parameters and reoptimization of a plan to achieve desired results
[ "intensity modulated radiation therapy", "treatment planning", "beam optimization", "beam delivery", "fluence patterns", "objective function parameters", "deliverable dose distribution", "empirical adjustment", "iterative optimization", "multileaf collimator leaf sequencing", "tumor dose", "optimized intensity", "gradient-based search algorithm", "beamlet ray intensities", "Newton method", "dose-volume objective values" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "M", "R", "U", "M", "M", "M" ]
Vehicle travel time models for AGV systems under various dispatching rules
The design and evaluation of AGV-based material handling systems are highly complex because of the randomness and the large number of variables involved. Vehicle travel time is a fundamental parameter for solving various flexible manufacturing system (FMS) design problems. This article presents stochastic vehicle travel time models for AGV-based material handling systems with emphasis on the empty travel times of vehicles. Various vehicle dispatching rules examined here include the nearest vehicle selection rule and longest idle vehicle selection rule. A simulation experiment is used to evaluate and demonstrate the presented models
[ "vehicle travel time", "AGV", "material handling systems", "flexible manufacturing system", "FMS", "vehicle dispatching rules", "nearest vehicle selection rule", "longest idle vehicle selection rule", "automatic guided vehicle" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "M" ]
The treatment of fear of flying: a controlled study of imaginal and virtual reality graded exposure therapy
The goal of this study was to determine if virtual reality graded exposure therapy (VRGET) was equally efficacious, more efficacious, or less efficacious, than imaginal exposure therapy in the treatment of fear of flying. Thirty participants (Age=39.8+or-9.7) with confirmed DSM-IV diagnosis of specific phobia fear of flying were randomly assigned to one of three groups: VRGET with no physiological feedback (VRGETno), VRGET with physiological feedback (VRGETpm), or systematic desensitization with imaginal exposure therapy (IET). Eight sessions were conducted once a week. During each session, physiology was measured to give an objective measurement of improvement over the course of exposure therapy. In addition, self-report questionnaires, subjective ratings of anxiety (SUDs), and behavioral observations (included here as flying behavior before beginning treatment and at a three-month posttreatment followup) were included. In the analysis of results, the Chi-square test of behavioral observations based on a three-month posttreatment followup revealed a statistically significant difference in flying behavior between the groups [ chi /sup 2/(4)=19.41, p<0.001]. Only one participant (10%) who received IET, eight of the ten participants (80%) who received VRGETno, and ten out of the ten participants (100%) who received VRGETpm reported an ability to fly without medication or alcohol at three-month followup. Although this study included small sample sizes for the three groups, the results showed VRGET was more effective than IET in the treatment of flying. It also suggests that physiological feedback may add to the efficacy of VR treatment
[ "virtual reality graded exposure therapy", "imaginal exposure therapy", "phobia", "physiological feedback", "physiology", "questionnaires", "subjective ratings of anxiety", "behavioral observations", "Chi-square test", "flying fear", "patient treatment" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M" ]

Inspec Benchmark Dataset for Keyphrase Generation


Inspec is a dataset for benchmarking keyphrase extraction and generation models. It is composed of 2,000 abstracts of scientific papers collected from the Inspec database. Keyphrases were annotated by professional indexers in an uncontrolled setting (that is, not restricted to thesaurus entries). Details about the Inspec dataset can be found in the original paper (Hulth, 2003).

Reference (indexer-assigned) keyphrases are also categorized under the PRMU (Present-Reordered-Mixed-Unseen) scheme as proposed in (Boudin and Gallina, 2021).
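As a rough illustration of the PRMU scheme, the following sketch classifies a keyphrase against the source text. This is a hypothetical helper, not the reference implementation from Boudin and Gallina (2021); it assumes both sides have already been tokenized and stemmed.

```python
# Minimal PRMU categorisation sketch (hypothetical helper, not the
# official implementation). Inputs are lists of stemmed tokens.

def prmu_category(keyphrase_tokens, source_tokens):
    """Classify a keyphrase as Present (P), Reordered (R), Mixed (M), or Unseen (U)."""
    n = len(keyphrase_tokens)
    # Present: the exact token sequence occurs contiguously in the source.
    for i in range(len(source_tokens) - n + 1):
        if source_tokens[i:i + n] == keyphrase_tokens:
            return "P"
    present = [t for t in keyphrase_tokens if t in source_tokens]
    if len(present) == n:
        return "R"  # all words occur, but not as this contiguous sequence
    if present:
        return "M"  # some words occur in the source, some do not
    return "U"      # no word occurs in the source

doc = "the cat sat on the mat".split()
assert prmu_category("the mat".split(), doc) == "P"
assert prmu_category("mat cat".split(), doc) == "R"
assert prmu_category("red mat".split(), doc) == "M"
assert prmu_category("blue dog".split(), doc) == "U"
```

This mirrors how the `prmu` field in the preview rows above aligns one category letter with each reference keyphrase.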

Text pre-processing (tokenization) is carried out using spaCy (the en_core_web_sm model) with a special rule to avoid splitting hyphenated words (e.g. graph-based is kept as one token). Stemming (the Porter stemmer implementation provided in nltk) is applied before reference keyphrases are matched against the source text. Details about the process can be found in
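To see why stemming is applied before matching, here is a toy suffix stripper. It is deliberately NOT the Porter stemmer (the dataset uses nltk's PorterStemmer); it only demonstrates that inflectional variants of a keyphrase can still match the source text after stemming.

```python
# Toy suffix stripper -- an illustrative stand-in for the Porter stemmer,
# NOT the algorithm actually used in the dataset's preprocessing.
def toy_stem(word):
    for suffix in ("ations", "ation", "ion", "ing", "ed", "es", "s", "e"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

# Inflected reference keyphrase vs. source text: surface forms differ,
# but the stems line up, so the keyphrase counts as "Present".
src = [toy_stem(w) for w in "evaluating keyphrase extraction models".split()]
ref = [toy_stem(w) for w in "keyphrases extracted".split()]
assert all(t in src for t in ref)  # matches despite inflection
```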

Content and statistics

The dataset is divided into the following three splits:

Split        # documents   Avg. # words   Avg. # keyphrases   % Present   % Reordered   % Mixed   % Unseen
Train        1,000         141.7          9.79                78.00       9.85          6.22      5.93
Validation   500           132.2          9.15                77.96       9.82          6.75      5.47
Test         500           134.8          9.83                78.70       9.92          6.48      4.91

The following data fields are available:

  • id: unique identifier of the document.
  • title: title of the document.
  • abstract: abstract of the document.
  • keyphrases: list of reference keyphrases.
  • prmu: list of Present-Reordered-Mixed-Unseen categories for reference keyphrases.
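The sketch below shows how one record with these fields might look and be consumed; the values are made up for illustration, not a real row. (If the dataset is hosted on the Hugging Face Hub, `datasets.load_dataset` with its repository id would yield dictionaries shaped like this.)

```python
# Illustrative record with the dataset's five fields (values are fake).
record = {
    "id": "inspec-0001",  # hypothetical identifier
    "title": "A sample title",
    "abstract": "A sample abstract about keyphrase extraction.",
    "keyphrases": ["keyphrase extraction", "sample data"],
    "prmu": ["P", "U"],
}

# prmu is aligned with keyphrases: one category per reference keyphrase.
assert len(record["keyphrases"]) == len(record["prmu"])

# Example use: keep only keyphrases that appear verbatim in the source.
present = [kp for kp, c in zip(record["keyphrases"], record["prmu"]) if c == "P"]
```

Filtering on the `prmu` field like this is a common way to evaluate extraction models (which can only produce Present keyphrases) separately from generation models.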

