Dataset schema (per-record fields, with observed value lengths):
  id          string (1-4 characters)
  title       string (13-200 characters)
  abstract    string (67-2.93k characters)
  keyphrases  sequence of strings
  prmu        sequence of P/R/M/U codes, one per keyphrase
1342
Defending against flooding-based distributed denial-of-service attacks: a tutorial
Flooding-based distributed denial-of-service (DDoS) attacks present a very serious threat to the stability of the Internet. In a typical DDoS attack, a large number of compromised hosts are amassed to send useless packets that jam a victim, its Internet connection, or both. In the last two years, DDoS attack methods and tools have become more sophisticated, more effective, and more difficult to trace back to the real attackers, while on the defense side current technologies are still unable to withstand large-scale attacks. The purpose of this article is therefore twofold. The first is to describe various DDoS attack methods and to present a systematic review and evaluation of the existing defense mechanisms. The second is to discuss a longer-term solution, dubbed the Internet-firewall approach, that attempts to intercept attack packets in the Internet core, well before they reach the victim
[ "flooding-based distributed denial-of-service attacks", "tutorial", "DDoS attack methods", "large-scale attacks", "Internet stability", "DDoS attack tools", "Internet firewall", "attack packets interception", "reflector attacks", "distributed attack detection" ]
[ "P", "P", "P", "P", "R", "R", "M", "R", "M", "M" ]
717
A network simplex algorithm with O(n) consecutive degenerate pivots
We suggest a pivot rule for the primal simplex algorithm for the minimum cost flow problem, known as the network simplex algorithm. Due to degeneracy, cycling may occur in the network simplex algorithm. Cycling can be prevented by maintaining strongly feasible bases, as proposed by Cunningham (1976); however, if we do not impose any restrictions on the entering variables, the algorithm can still perform an exponentially long sequence of degenerate pivots. This phenomenon is known as stalling. Researchers have suggested several pivot rules with the following bounds on the number of consecutive degenerate pivots: m, n^2, and k(k + 1)/2, where n is the number of nodes in the network, m is the number of arcs in the network, and k is the number of degenerate arcs in the basis. (Observe that k <= n.) In this paper, we describe an anti-stalling pivot rule that ensures that the network simplex algorithm performs at most k consecutive degenerate pivots. This rule uses a negative cost augmenting cycle to identify a sequence of entering variables
[ "network simplex algorithm", "degenerate pivots", "minimum cost flow problem", "degeneracy", "cycling", "stalling", "anti-stalling pivot rule", "negative cost augmenting cycle" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
752
Presenting-a better mousetrap [Leeza outboard video signal processor]
Scaling interlaced video to match high-resolution plasma, LCD, and DLP displays is a tough job, but Key Digital's Leeza is up to the task. And it's digitally bilingual, too. There's no question that outboard video signal processors like Leeza help overcome the inherent limitations of fixed-pixel displays. Being able to match a native display rate with heavily processed video makes the viewing experience much more enjoyable. But it seemed that 70% of the improvement in image quality came from using a digital interface to the DVD player, as most noise and picture artifacts are introduced in the analog video encoding process
[ "Leeza", "outboard video signal processors", "DLP displays", "fixed-pixel displays", "heavily processed video", "LCD displays", "plasma displays" ]
[ "P", "P", "P", "P", "P", "R", "R" ]
879
Well behaved women rarely make history!
The author considers women in the history of computer science. Prior to the ENIAC, women were extremely important to the computing business as "computers". Just as women had taken over the tasks as secretaries in the late 1800s with the advent of the typewriter, and in the early 1900s staffing telephone exchanges, so computing relied on women as the "workhorses" of the business
[ "women", "history", "computer science", "ENIAC", "business", "gender issues" ]
[ "P", "P", "P", "P", "P", "U" ]
1006
Robust model-order reduction of complex biological processes
This paper addresses robust model-order reduction of a high dimensional nonlinear partial differential equation (PDE) model of a complex biological process. Based on a nonlinear, distributed parameter model of the same process which was validated against experimental data of an existing, pilot-scale biological nutrient removal (BNR) activated sludge plant, we developed a state-space model with 154 state variables. A general algorithm for robustly reducing the nonlinear PDE model is presented and, based on an investigation of five state-of-the-art model-order reduction techniques, we are able to reduce the original model to a model with only 30 states without incurring pronounced modelling errors. The singular perturbation approximation balanced truncating technique is found to give the lowest modelling errors in low frequency ranges and hence is deemed most suitable for controller design and other real-time applications
[ "robust model-order reduction", "complex biological processes", "state-space model", "modelling errors", "singular perturbation approximation balanced truncating technique", "controller design", "high dimensional nonlinear partial differential equation model", "nonlinear distributed parameter model", "pilot-scale BNR activated sludge plant", "Hankel singular values", "biological nutrient removal activated sludge processes" ]
[ "P", "P", "P", "P", "P", "P", "R", "R", "R", "M", "R" ]
1043
Fractional motion control: application to an XY cutting table
In path tracking design, the dynamics of the actuators must be taken into account in order to reduce the overshoots that appear for small displacements. A new approach to path tracking using fractional differentiation is proposed, with an application to an XY cutting table. It permits the generation of an optimal movement reference-input leading to a minimum path completion time, taking into account the maximum velocity, acceleration, and torque as well as the bandwidth of the closed-loop system. Fractional differentiation is used here through a Davidson-Cole filter. A methodology aiming at improving the accuracy, especially on checkpoints, is presented. The reference-input obtained is compared with a spline function. Both are applied to an XY cutting table model and the actuator outputs compared
[ "fractional motion control", "XY cutting table", "path tracking design", "actuators", "fractional differentiation", "optimization", "minimum path completion time", "closed-loop system", "Davidson-Cole filter", "spline function" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
884
A hybrid-neural network and population learning algorithm approach to solving reliability optimization problem
Proposes a hybrid approach integrating a dedicated artificial neural network and a population learning algorithm, applied to maximising system reliability under cost and technical feasibility constraints. The paper includes a formulation of the system reliability optimisation (SRO) problem and a description of the dedicated neural network trained by applying the population learning algorithm. A solution to an example SRO problem is shown, and results of the computational experiment are presented and discussed
[ "population learning algorithm", "reliability optimization problem", "hybrid approach", "dedicated artificial neural network", "system reliability", "technical feasibility constraints", "cost constraints" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
1187
Ethernet networks: getting down to business
While it seems pretty clear that Ethernet has won the battle for the mindshare as the network of choice for the factory floor, there's still a war to be won in implementation as cutting-edge manufacturers begin to adopt the technology on a widespread basis
[ "Ethernet", "factory floor", "cutting-edge manufacturers", "supervisory level" ]
[ "P", "P", "P", "U" ]
905
Ultra-high speed positioning control of a gravure engraving unit using a discrete-time two-degree-of-freedom H∞ control
The piezoelectric actuator has a high-speed response in comparison with the electromagnetic actuator. However, it is not easy to achieve both high-speed and high-precision response by feedforward control alone, because the piezoelectric element has nonlinear properties such as the hysteresis effect. Thus, feedback control is required to achieve good performance. We develop a control design method that achieves both high-speed and high-precision response for piezoelectric actuators using the discrete-time H∞ control method and the two-degree-of-freedom control scheme. The effectiveness of our proposed method has been shown by simulation and experimental results. The most important contribution of our study is that our method can be directly applied to commercial machines
[ "ultra-high speed positioning control", "gravure engraving unit", "discrete-time two-degree-of-freedom H/sub infinity / control", "piezoelectric actuator", "high-precision response", "feedforward", "nonlinear properties", "hysteresis", "feedback control", "control design method", "digital control system" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M" ]
146
Design patterns for high availability
It is possible to achieve five-nines reliability with everyday commercial-quality hardware and software. The key is the way in which these components are combined. The design of high availability systems is based on a combination of redundant hardware components and software to manage fault detection and correction without human intervention. The author quickly reviews some definitions tied to high availability and fault management, and then goes on to discuss some hardware and software design patterns for fault tolerant systems
[ "high availability systems", "redundant hardware components", "fault detection", "software design patterns", "fault tolerant systems", "hardware reliability", "software reliability", "fault correction", "checkpointing", "software redundancy" ]
[ "P", "P", "P", "P", "P", "R", "R", "R", "U", "R" ]
597
Quick media response averts PR disaster
Sometimes it's not what you do, but how you do it. After hackers broke the blocking code on the home version of its popular Cyber Patrol Internet filtering software and posted it on the Internet, marketers at Microsystems Software pulled out a playbook of standard crisis management and PR techniques. The Cyber Patrol PR team, including outside PR counsel and the company's outside law firm, used those tools aggressively to turn the tide of public and media opinion away from the hackers, who initially were hailed as folk heroes, and in favor of the company's interests, saving the product's and the company's reputations and inherent value. And the entire team managed to move at Internet speed: the crisis was essentially over in about three weeks
[ "media response", "Cyber Patrol Internet filtering software", "Microsystems Software", "crisis management", "public relations" ]
[ "P", "P", "P", "P", "M" ]
940
Tools for the analysis of dose optimization. I. Effect-volume histogram
With the advent of dose optimization algorithms, predominantly for intensity-modulated radiotherapy (IMRT), computer software has progressed beyond the point of being merely a tool at the hands of an expert and has become an active, independent mediator of the dosimetric conflicts between treatment goals and risks. To understand and control the internal decision finding as well as to provide means to influence it, a tool for the analysis of the dose distribution is presented which reveals the decision-making process performed by the algorithm. The internal trade-offs between partial volumes receiving high or low doses are driven by functions which attribute a weight to each volume element. The statistics of the distribution of these weights are cast into an effect-volume histogram (EVH) in analogy to dose-volume histograms. The analysis of the EVH reveals which traits of the optimum dose distribution result from the defined objectives, and which are a random consequence of under- or misspecification of treatment goals. The EVH can further assist in the process of finding suitable objectives and balancing conflicting objectives. If biologically inspired objectives are used, the EVH shows the distribution of local dose effect relative to the prescribed level
[ "effect-volume histogram", "dose optimization algorithms", "intensity-modulated radiotherapy", "computer software", "dosimetric conflicts", "treatment goals", "decision-making process", "partial volumes", "low doses", "treatment risks", "high doses", "volume element weights", "treatment planning", "objective function", "insufficient target coverage", "exponential law", "cell survival", "one-sided quadratic penalties", "quadratic overdose penalty" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "M", "R", "U", "U", "U", "U", "U" ]
14
Application of hybrid models for prediction and optimization of enzyme fermentation process. A comparative study
The paper presents a comparison of biotechnological process prediction and optimization results obtained by using hybrid mathematical models of different structure for modeling the same bioprocess. The hybrid models under investigation consist of the product mass balance equation into which different means (an artificial neural network, a fuzzy-neural network, and a cell age distribution based calculation scheme) are incorporated for modeling the specific biosynthesis rate of the desired product. Experimental data from alpha-amylase laboratory and industrial fermentation processes are used for model parameter identification and the process prediction tests
[ "hybrid models", "optimization", "enzyme fermentation", "mathematical models", "bioprocess", "product mass balance equation", "fuzzy-neural network", "cell age distribution", "biosynthesis rate", "identification", "industrial processes" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
1286
Self-describing Turing machines
After a brief historical account of the question of self-describeness and self-reproduction, and after discussing the definition of suitable encodings for self-describeness, we give the construction of several self-describing Turing machines, namely self-describing machines with, respectively, 350, 267, 224 and 206 instructions
[ "self-describing Turing machines", "self-describeness", "self-reproduction", "encodings" ]
[ "P", "P", "P", "P" ]
696
Design and implementation of a new sliding-mode observer for speed-sensorless control of induction machine
In this letter, a new sliding-mode sensorless control algorithm is proposed for the field-oriented induction machine drive. In the proposed algorithm, the terms containing flux, speed, and rotor time constant, which are common to both the current and flux equations of the induction machine's current model, are estimated by a sliding function. The flux and speed estimation accuracy is guaranteed when the error between the actual current and the observed current converges to zero. Hence, the fourth-order system is reduced to two second-order systems, and the speed estimation becomes very simple and robust to parameter uncertainties. The new approach is verified by simulation and experimental results
[ "sliding-mode observer", "speed-sensorless control", "speed", "induction machine", "flux", "rotor time constant", "flux equations", "current model", "sliding function", "speed estimation accuracy", "parameter uncertainties", "induction motor drive", "sensorless control", "current equations", "fourth-order system reduction" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "M", "R", "M" ]
1022
Bad pixel identification by means of principal components analysis
Bad pixels are defined as those pixels showing a temporal evolution of the signal different from the rest of the pixels of a given array. Principal component analysis helps us to understand the definition of a statistical distance associated with each pixel, and using this distance it is possible to identify those pixels labeled as bad pixels. The spatiality of a pixel is also calculated. An assumption about the normality of the distribution of the distances of the pixels is revised. Although the influence on the robustness of the identification algorithm is negligible, the definition of a parameter related to this nonnormality helps to identify those principal components and eigenimages responsible for the departure from a multinormal distribution. The method for identifying the bad pixels is successfully applied to a set of frames obtained from a visible CCD camera and a focal plane array (FPA) IR camera
[ "bad pixel identification", "principal components analysis", "temporal evolution", "statistical distance", "robustness", "identification algorithm", "eigenimages", "multinormal distribution", "focal plane array", "IR camera", "CCD visible camera" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
1067
Quantum-information processing by nuclear magnetic resonance: Experimental implementation of half-adder and subtractor operations using an oriented spin-7/2 system
The advantages of using quantum systems for performing many computational tasks have already been established. Several quantum algorithms have been developed which exploit inherent properties of quantum systems, such as superposition of states and entanglement, for efficiently performing certain tasks. Experimental implementations have been achieved on many quantum systems, of which nuclear magnetic resonance has shown the largest progress in terms of number of qubits. This paper describes the use of a spin-7/2 as a three-qubit system and experimentally implements the half-adder and subtractor operations. The required qubits are realized by partially orienting ^133Cs nuclei in a liquid-crystalline medium, yielding a quadrupolar split well-resolved septet. Another feature of this paper is the proposal that the labeling of the quantum states of a system can be suitably chosen to increase the efficiency of a computational task
[ "quantum-information processing", "nuclear magnetic resonance", "subtractor operations", "oriented spin-7/2 system", "quantum systems", "computational tasks", "computational tasks", "quantum algorithms", "entanglement", "qubits", "three-qubit system", "/sup 133/Cs nuclei", "/sup 133/Cs", "liquid-crystalline medium", "quadrupolar split well-resolved septet", "quantum states", "half-adder operations", "state superposition", "computational task" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "P" ]
818
Clausal resolution in a logic of rational agency
A resolution based proof system for a Temporal Logic of Possible Belief is presented. This logic is the combination of the branching-time temporal logic CTL (representing change over time) with the modal logic KD45 (representing belief). Such combinations of temporal or dynamic logics and modal logics are useful for specifying complex properties of multi-agent systems. Proof methods are important for developing verification techniques for these complex multi-modal logics. Soundness, completeness and termination of the proof method are shown and simple examples illustrating its use are given
[ "resolution based proof system", "temporal logic", "belief", "branching-time temporal logic", "CTL", "modal logic", "KD45", "dynamic logics", "multi-agent systems", "multi-modal logics", "rational agents" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M" ]
1323
Editorial system vendors focus on Adobe and the future
Looking over the newspaper-system market, we note that the Mac is getting new respect. Adobe InDesign has established itself as a solid alternative to Quark XPress for pagination. Positioning themselves for the long run, developers are gradually shifting to new software architectures
[ "newspaper-system market", "Adobe InDesign", "pagination", "Macintosh", "publishing" ]
[ "P", "P", "P", "U", "U" ]
1366
A fuzzy-soft learning vector quantization for control chart pattern recognition
This paper presents a supervised competitive learning network approach, called fuzzy-soft learning vector quantization, for control chart pattern recognition. Unnatural patterns in control charts mean that there are unnatural causes for variations in statistical process control (SPC); control chart pattern recognition therefore becomes more important in SPC. In order to detect effectively the patterns of the six main types of control charts, Pham and Oztemel (1994) described a class of pattern recognizers for control charts based on learning vector quantization (LVQ), such as LVQ, LVQ2, and LVQ-X. In this paper, we propose a new supervised LVQ for control charts based on a fuzzy-soft competitive learning network. The proposed fuzzy-soft LVQ (FS-LVQ) uses a fuzzy relaxation technique and simultaneously updates all neurons. It increases the correct recognition accuracy and also decreases the learning time. Comparisons between LVQ, LVQ-X, and FS-LVQ are made
[ "fuzzy-soft learning vector quantization", "control chart pattern recognition", "supervised competitive learning network approach", "unnatural patterns", "statistical process control", "SPC", "supervised LVQ", "fuzzy relaxation technique", "correct recognition accuracy", "learning time", "simultaneous neuron update", "numerical results", "manufacturing process" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "U", "M" ]
733
VoIP makeover transforms ugly duckling network
Surrey County Council's Swan project is Europe's biggest implementation of voice over IP. Six WANs and countless LANs are being consolidated into a single network covering 6,000 users at 200 sites. The contract was signed in October 2001 for £13m over five years, and rollout will be completed in May 2003
[ "Surrey County Council", "Swan", "WAN", "voice over IP", "LAN" ]
[ "P", "P", "P", "P", "P" ]
776
Information access for all: meeting the needs of deaf and hard of hearing people
Discusses the nature of deafness and hearing impairments, with particular reference to the impact which the onset of hearing loss presents at various ages. The author goes on to present practical tips for interacting with deaf and hard of hearing clients in various communication contexts, including sightreading, TTY communications, and ASL interpreters. An annotated list of suggested readings is appended
[ "information access", "deafness", "hearing impairments", "hard of hearing clients", "communication contexts", "sightreading", "TTY communications", "ASL interpreters", "deaf clients" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
860
'Virtual Family': an approach to introducing Java programming
This paper introduces and discusses Virtual Family (VF): a gender-neutral game-based software that introduces Java programming. VF provides a completely functioning game that students extend and enhance via programming. We discuss the background and context within which Virtual Family was developed and other available multimedia resources for teaching programming. The paper then goes on to describe Virtual Family's concept and design. Finally, feedback received from Virtual Family teaching workshops is related, as well as preliminary results from using VF in high-school teaching units. Virtual Family is under development in a research lab at the University of British Columbia and is an initiative of Supporting Women in Information Technology (SWIFT). SWIFT is a five-year research action and implementation project to increase the participation of women in information technology
[ "Virtual Family", "gender-neutral game-based software", "multimedia resources", "teaching workshops", "high-school teaching units", "Supporting Women in Information Technology", "Java programming teaching" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
825
Genetic algorithm-neural network estimation of Cobb angle from torso asymmetry in scoliosis
Scoliosis severity, measured by the Cobb angle, was estimated by artificial neural network from indices of torso surface asymmetry using a genetic algorithm to select the optimal set of input torso indices. Estimates of the Cobb angle were accurate within 5 degrees in two-thirds, and within 10 degrees in six-sevenths, of a test set of 115 scans of 48 scoliosis patients, showing promise for future longitudinal studies to detect scoliosis progression without use of X-rays
[ "genetic algorithm", "Cobb angle", "artificial neural network", "torso surface asymmetry", "input torso indices", "scoliosis patients", "scoliosis progression" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
1433
The role and future of subject classification: the exploitation of resources
It is imperative that the library information systems (LIS) profession and LIS educators appreciate fully the contribution that classification makes to the discipline and that it is no longer seen as the domain of the academic, isolated theorist, but becomes an integral part of our understanding of the contribution that the LIS community can make to society as a whole - as well as to particular areas such as legal information
[ "subject classification", "library information systems", "LIS", "legal information", "information resources" ]
[ "P", "P", "P", "P", "R" ]
91
IT challenge: cross selling [finance]
Like most financial institutions, FleetBoston, Fidelity and Berkshire Group of Companies are being charged with developing a strong technology platform that will allow them to cross sell their products and services. They discuss their solutions, advice and technology choices
[ "cross selling", "financial institutions", "FleetBoston", "Fidelity", "Berkshire Group" ]
[ "P", "P", "P", "P", "P" ]
557
Noise and the PSTH response to current transients: II. Integrate-and-fire model with slow recovery and application to motoneuron data
For pt. I see ibid., vol. 11, no. 2, p. 135-151 (2001). A generalized version of the integrate-and-fire model is presented that qualitatively reproduces firing rates and membrane trajectories of motoneurons. The description is based on the spike-response model and includes three different time constants: the passive membrane time constant, a recovery time of the input conductance after each spike, and a time constant of the spike afterpotential. The effect of stochastic background input on the peristimulus time histogram (PSTH) response to spike input is calculated analytically. Model results are compared with the experimental data of Poliakov et al. (1996). The linearized theory shows that the PSTH response to an input spike is proportional to a filtered version of the postsynaptic potential generated by the input spike. The shape of the filter depends on the background activity. The full nonlinear theory is in close agreement with simulated PSTH data
[ "PSTH", "integrate-and-fire model", "motoneuron", "firing rates", "membrane trajectories", "spike-response model", "passive membrane time constant", "recovery time", "spike afterpotential" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
980
Convergence of Runge-Kutta methods for nonlinear parabolic equations
We study time discretizations of fully nonlinear parabolic differential equations. Our analysis uses the fact that the linearization along the exact solution is a uniformly sectorial operator. We derive smooth and nonsmooth-data error estimates for the backward Euler method, and we prove convergence for strongly A(v)-stable Runge-Kutta methods. For the latter, the order of convergence for smooth solutions is essentially determined by the stage order of the method. Numerical examples illustrating the convergence estimates are presented
[ "linearization", "time discretizations", "nonlinear parabolic differential equations", "uniformly sectorial operator", "nonsmooth-data error estimates", "backward Euler method", "Runge-Kutta method convergence", "data error estimates" ]
[ "P", "P", "P", "P", "P", "P", "R", "M" ]
1147
Angular disparity in ETACT scintimammography
Emission tuned aperture computed tomography (ETACT) has previously been shown to have the potential for the detection of small tumors (<1 cm) in scintimammography. However, the optimal approach to the application of ETACT in the clinic has yet to be determined. Therefore, we sought to determine the effect of the angular disparity between the ETACT projections on image quality through the use of a computer simulation. A small, spherical tumor of variable size (5, 7.5, or 10 mm) was placed at the center of a hemispherical breast (15 cm diameter). The tumor to nontumor ratio was either 5:1 or 10:1. The detector was modeled as a gamma camera fitted with a 4-mm-diam pinhole collimator. The pinhole-to-detector and pinhole-to-tumor distances were 25 and 15 cm, respectively. A ray tracing technique was used to generate three sets of projections (10 degrees, 15 degrees, and 20 degrees angular disparity). These data were blurred to a resolution consistent with the 4 mm pinhole. The TACT reconstruction method was used to reconstruct these three image sets. The tumor contrast and the axial spatial resolution were measured. Smaller angular disparity led to an improvement in image contrast, but at the cost of degraded axial spatial resolution. The improvement in contrast is due to a slight improvement in the in-plane spatial resolution. Since improved contrast should lead to better tumor detectability, smaller angular disparity should be used. However, the difference in contrast between 10 degrees and 15 degrees was very slight, and therefore a reasonable clinical choice for angular disparity is 15 degrees
[ "angular disparity", "small tumors", "image quality", "computer simulation", "spherical tumor", "hemispherical breast", "gamma camera", "pinhole collimator", "pinhole-to-tumor distances", "ray tracing technique", "image sets", "axial spatial resolution", "in-plane spatial resolution", "clinical choice", "emission tuned aperture computed tomography scintimammography", "pinhole-to-detector distances", "tuned aperture computed tomography reconstruction method" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
1102
Design and implementation of a reusable and extensible HL7 encoding/decoding framework
The Health Level Seven (HL7), an international standard for electronic data exchange in all health care environments, enables disparate computer applications to exchange key sets of clinical and administrative information. Above all, it defines the standard HL7 message formats prescribed by the standard encoding rules. In this paper, we propose a flexible, reusable, and extensible HL7 encoding and decoding framework using a message object model (MOM) and message definition repository (MDR). The MOM provides an abstract HL7 message form represented by a group of objects and their relationships. It reflects logical relationships among the standard HL7 message elements such as segments, fields, and components, while enforcing the key structural constraints imposed by the standard. Since the MOM completely eliminates the dependency of the HL7 encoder and decoder on platform-specific data formats, it makes it possible to build the encoder and decoder as reusable standalone software components, enabling the interconnection of arbitrary heterogeneous hospital information systems (HIS) with little effort. Moreover, the MDR, an external database of key definitions for HL7 messages, helps make the encoder and decoder as resilient as possible to future modifications of the standard HL7 message formats. It is also used by the encoder and decoder to perform a well-formedness check for their respective inputs (i.e., HL7 message objects expressed in the MOM and encoded HL7 message strings). Although we implemented a prototype version of the encoder and decoder using JAVA, they can be easily packaged and delivered as standalone components using the standard component frameworks
[ "Health Level Seven", "international standard", "electronic data exchange", "health care environments", "administrative information", "HL7 message formats", "HIS", "message object model", "MOM", "message definition repository", "MDR", "logical relationships", "structural constraints", "standalone software components", "heterogeneous hospital information systems", "external database", "key definitions", "JAVA", "reusable framework", "extensible encoding/decoding framework", "abstract message form", "ActiveX", "JAVABEAN", "CORBA", "clinical information" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "U", "U", "U", "R" ]
938
Fast accurate MEG source localization using a multilayer perceptron trained with real brain noise
Iterative gradient methods such as Levenberg-Marquardt (LM) are in widespread use for source localization from electroencephalographic (EEG) and magnetoencephalographic (MEG) signals. Unfortunately, LM depends sensitively on the initial guess, necessitating repeated runs. This, combined with LM's high per-step cost, makes its computational burden quite high. To reduce this burden, we trained a multilayer perceptron (MLP) as a realtime localizer. We used an analytical model of quasistatic electromagnetic propagation through a spherical head to map randomly chosen dipoles to sensor activities according to the sensor geometry of a 4D Neuroimaging Neuromag-122 MEG system, and trained an MLP to invert this mapping in the absence of noise or in the presence of various sorts of noise such as white Gaussian noise, correlated noise, or real brain noise. An MLP structure was chosen to trade off computation and accuracy. This MLP was trained four times, once with each type of noise. We measured the effects of initial guesses on LM performance, which motivated a hybrid MLP-start-LM method in which the trained MLP initializes LM. We also compared the localization performance of LM, MLPs, and hybrid MLP-start-LMs for realistic brain signals. Trained MLPs are much faster than the other methods, while the hybrid MLP-start-LMs are faster and more accurate than fixed-4-start-LM. In particular, the hybrid MLP-start-LM initialized by an MLP trained with the real brain noise dataset is 60 times faster and comparable in accuracy to random-20-start-LM, and this hybrid system (localization error: 0.28 cm, computation time: 36 ms) shows almost as good performance as optimal-1-start-LM (localization error: 0.23 cm, computation time: 22 ms), which initializes LM with the correct dipole location. MLPs trained with noise perform better than the MLP trained without noise, and the MLP trained with real brain noise is almost as good an initial guesser for LM as the correct dipole location
[ "MEG source localization", "multilayer perceptron", "real brain noise", "iterative gradient methods", "analytical model", "quasistatic electromagnetic propagation", "spherical head", "white Gaussian noise", "correlated noise", "fast accurate localization", "real-time localizer", "computation accuracy", "Levenberg-Marquardt method", "forward model" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "R", "R", "M" ]
656
The cataloger's workstation revisited: utilizing Cataloger's Desktop
A few years into the development of Cataloger's Desktop, an electronic cataloging tool aggregator available through the Library of Congress, this is an opportune time to assess its impact on cataloging operations. A search for online cataloging tools on the Internet indicates a proliferation of cataloging tool aggregators, which provide access to online documentation related to cataloging practices and procedures. Cataloger's Desktop stands out as a leader among these aggregators. Results of a survey assessing 159 academic ARL and large public libraries' reasons for use or non-use of Cataloger's Desktop highlight the necessity of developing strategies for its successful implementation, including training staff, providing documentation, and managing technical issues
[ "cataloger's workstation", "Cataloger's Desktop", "electronic cataloging tool", "cataloging tool aggregators", "online cataloging tools", "Internet", "online documentation", "documentation", "academic ARL", "large public libraries", "managing technical issues", "staff training" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
613
Comparison between discrete STFT and wavelets for the analysis of power quality events
This paper deals with the comparison of signal processing tools for power quality analysis. Two signal processing techniques are considered: wavelet filters and the discrete short-time Fourier transform (STFT). Examples of the two most frequent disturbances met in the power system are then chosen: an adjustable speed drive with a six-pulse converter, designed using EMTP/ATP, and normal energizing of utility capacitors. The analysis is tested on a system consisting of 13 buses that is representative of a medium-sized industrial plant. Finally, each kind of electrical disturbance is analyzed with examples representing each tool. A qualitative comparison of results shows the advantages and drawbacks of each signal processing technique applied to power quality analysis
[ "discrete STFT", "wavelets", "power quality events", "signal processing tools", "signal processing techniques", "wavelet filters", "discrete short-time Fourier transforms", "short-time Fourier transforms", "adjustable speed drive", "six-pulse converter", "EMTP/ATP", "utility capacitors", "medium-sized industrial plant", "electrical disturbance" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1246
Why information departments are becoming academic
This article outlines the increasing convergence between academia and business over the last decade or so, and the mutual benefits that this closer association has brought. It also looks at the growing importance of the information profession, suggesting that this is leading to a greater need for specialist skills, as reflected by the rise in academic courses in this area. However, it argues that increasing specialization must not lead to insularity; if information professionals are truly concerned with gaining a competitive advantage, they must not close their minds to the potential benefits of working with external, non-specialist partners. The benefits that business has reaped from academia, it is contended, suggest that this may also be a fruitful avenue for information departments to explore
[ "information departments", "academia", "business", "information profession", "specialist skills", "academic courses", "information science", "universities" ]
[ "P", "P", "P", "P", "P", "P", "M", "U" ]
1203
Technology decisions 2002
The paper looks at the critical hardware, software, and services choices manufacturers are making as they begin to emerge from the recession and position themselves for the future
[ "services choices", "manufacturing industries", "information technology", "management of change", "customer relationship management", "enterprise resource planning" ]
[ "P", "M", "M", "U", "U", "U" ]
1058
Bigger is better: the influence of physical size on aesthetic preference judgments
The hypothesis that the physical size of an object can influence aesthetic preferences was investigated. In a series of four experiments, participants were presented with pairs of abstract stimuli and asked to indicate which member of each pair they preferred. A preference for larger stimuli was found on the majority of trials using various types of stimuli, stimuli of various sizes, and with both adult and 3-year-old participants. This preference pattern was disrupted only when participants had both stimuli that provided a readily accessible alternative source of preference-evoking information and sufficient attentional resources to make their preference judgments
[ "aesthetic preference judgments", "abstract stimuli", "preference pattern", "preference-evoking information", "attentional resources", "physical size influence", "decision making", "preference formation", "judgment cues", "adult participants", "child participants" ]
[ "P", "P", "P", "P", "P", "R", "M", "M", "M", "R", "M" ]
1359
On fuzzy and probabilistic control charts
In this article, different procedures for constructing control charts for linguistic data, based on fuzzy and probability theory, are discussed. Three sets of membership functions, with different degrees of fuzziness, are proposed for the fuzzy approaches. A comparison between the fuzzy and probability approaches, based on the Average Run Length and samples under control, is conducted for real data. Contrary to the conclusions of Raz and Wang (1990), the choice of degree of fuzziness affected the sensitivity of the control charts
[ "probabilistic control charts", "linguistic data", "membership functions", "average run length", "sensitivity", "fuzzy control charts", "control chart construction", "fuzziness degree", "fuzzy subsets", "porcelain products" ]
[ "P", "P", "P", "P", "P", "R", "R", "R", "M", "U" ]
749
Numerical modeling of the flow in stenosed coronary artery. The relationship between main hemodynamic parameters
The severity of coronary arterial stenosis is usually measured by either simple geometrical parameters, such as percent diameter stenosis, or hemodynamically based parameters, such as the fractional flow reserve (FFR) or coronary flow reserve (CFR). The present study aimed to establish a relationship between actual hemodynamic conditions and the parameters that define stenosis severity in the clinical setting. We used a computational model of the blood flow in a vessel with a blunt stenosis and an autoregulated vascular bed to simulate a stenosed blood vessel. A key point in creating realistic simulations is to properly model arterial autoregulation. A constant flow regulation mechanism resulted in CFR and FFR values that were within the physiological range, while a constant wall-shear stress model yielded unrealistic values. The simulation tools developed in the present study may be useful in the clinical assessment of single and multiple stenoses by means of minimally invasive methods
[ "numerical modeling", "hemodynamic parameters", "coronary arterial stenosis", "stenosis severity", "clinical setting", "computational model", "blood flow", "blunt stenosis", "autoregulated vascular bed", "simulation", "stenosed blood vessel", "arterial autoregulation", "constant flow regulation mechanism", "physiological range", "minimally invasive methods", "constant wall shear stress model" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M" ]
862
Two efficient algorithms for the generalized maximum balanced flow problem
Minoux (1976) considered the maximum balanced flow problem, i.e. the problem of finding a maximum flow in a two-terminal network N = (V, A) with source s and sink t satisfying the constraint that any arc-flow of N is bounded by a fixed proportion of the total flow value from s to t, where V is the vertex set and A is the arc set. As a generalization, we focus on the problem of maximizing the total flow value of a generalized flow in N with gains gamma(a) > 0 (a in A), where any arc-flow is bounded by a fixed proportion of the total flow value and gamma(a)f(a) units arrive at the vertex w for each arc-flow f(a) (a = (v, w) in A) entering the vertex v. Our main results are two polynomial algorithms for this problem. The first algorithm runs in O(mM(n, m, B') log B) time, where B is the maximum absolute value among the integral values used by an instance of the problem, and M(n, m, B') denotes the complexity of solving a generalized maximum flow problem in a network with n vertices, m arcs, and a rational instance expressed with integers between 1 and B'. The second algorithm, using a parameterized technique, runs in O({M(n, m, B')}^2) time
[ "generalized maximum balanced flow problem", "two-terminal network", "polynomial algorithms", "parameterized technique" ]
[ "P", "P", "P", "P" ]
827
Williams nears end of Chapter 11 [telecom]
Leucadia National Corp. comes through with a $330 million boost for Williams Communications, which should keep the carrier afloat through the remainder of its bankruptcy
[ "Williams Communications", "bankruptcy", "Leucadia National Corp" ]
[ "P", "P", "M" ]
1431
Cataloguing to help law library users
The author takes a broader view of the catalogue than is usual; we can include within it items that have locations other than the office/library itself. This may well start with Internet resources, but can perfectly appropriately continue with standard works not held in the immediate collection but available in some other accessible collection, such as the local reference library. The essential feature is to include entries for the kind of material sought by users, with the addition of a location mark indicating where they can find it
[ "cataloguing", "law library users", "Internet resources", "reference library", "location mark" ]
[ "P", "P", "P", "P", "P" ]
654
A question of perspective: assigning Library of Congress subject headings to classical literature and ancient history
This article explains the concept of world view and shows how the world view of cataloguers influences the development and assignment of subject headings to works about other cultures and civilizations, using works from classical literature and ancient history as examples. Cataloguers are encouraged to evaluate the headings they assign to works in classical literature and ancient history in terms of the world views of Ancient Greece and Rome so that headings reflect the contents of the works they describe and give fuller expression to the diversity of thoughts and themes that characterize these ancient civilizations
[ "classical literature", "ancient history", "world view", "cultures", "civilizations", "Ancient Greece", "Library of Congress subject heading assignment", "Ancient Rome" ]
[ "P", "P", "P", "P", "P", "P", "R", "R" ]
611
Intelligent optimal sieving method for FACTS device control in multi-machine systems
A multi-target oriented optimal control strategy for FACTS devices installed in multi-machine power systems is presented in this paper, named the intelligent optimal sieving control (IOSC) method. This new method divides the FACTS device output region into several parts and selects one typical value from each part, which is called an output candidate. Then, an intelligent optimal sieve is constructed, which predicts the impact of each output candidate on the power system and sieves out an optimal output from all of the candidates. Artificial neural network technologies and fuzzy methods are applied to build the intelligent sieve. Finally, the real control signal for the FACTS devices is calculated from the selected optimal output through the inverse system method. Simulations have been performed on a three-machine power system, and the results show that the proposed IOSC controller can effectively attenuate system oscillations and enhance power system transient stability
[ "intelligent optimal sieving method", "intelligent optimal sieve", "FACTS", "FACTS device control", "multi-machine systems", "multi-target oriented optimal control strategy", "artificial neural network technologies", "fuzzy methods", "control signal", "selected optimal output", "inverse system method", "three-machine power system", "intelligent control", "system oscillations attenuation", "power system transient stability enhancement" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
1244
Applied ethics in business information units
The primary thesis of this paper is that business information professionals commonly overlook ethical dilemmas in the workplace. Although the thesis remains unproven, the author highlights, by way of real and hypothetical case studies, a number of situations in which ethical tensions can be identified, and suggests that information professionals need to be more aware of the moral context of their actions. Resolving ethical dilemmas should be one of the aims of competent information professionals and their managers, although it is recognized that dilemmas often cannot easily be resolved. A background to the main theories of applied ethics forms the framework for later discussion
[ "applied ethics", "business information units", "business information professionals", "ethical dilemmas", "moral context" ]
[ "P", "P", "P", "P", "P" ]
1201
Moving into the mainstream [product lifecycle management]
Product lifecycle management (PLM) is widely recognised by most manufacturing companies, as manufacturers begin to identify and implement targeted projects intended to deliver return-on investment in a timely fashion. Vendors are also releasing second-generation PLM products that are packaged, out-of-the-box solutions
[ "product lifecycle management", "manufacturing companies", "product data management", "product development", "enterprise resource planning" ]
[ "P", "P", "M", "M", "U" ]
555
Computing transient gating charge movement of voltage-dependent ion channels
The opening of voltage-gated sodium, potassium, and calcium ion channels has a steep relationship with voltage. In response to changes in the transmembrane voltage, structural movements of an ion channel that precede channel opening generate a capacitative gating current. The net gating charge displacement due to membrane depolarization is an index of the voltage sensitivity of the ion channel activation process. Understanding the molecular basis of voltage-dependent gating of ion channels requires the measurement and computation of the gating charge, Q. We derive a simple and accurate semianalytic approach to computing the voltage dependence of transient gating charge movement (Q-V relationship) of discrete Markov state models of ion channels using matrix methods. This approach allows rapid computation of Q-V curves for finite and infinite length step depolarizations and is consistent with experimentally measured transient gating charge. This computational approach was applied to Shaker potassium channel gating, including the impact of inactivating particles on potassium channel gating currents
[ "transient gating charge movement", "charge movement", "ion channels", "transmembrane voltage", "gating current", "Markov state model", "inactivation", "action potentials", "immobilization" ]
[ "P", "P", "P", "P", "P", "P", "P", "U", "U" ]
982
Abundance of mosaic patterns for CNN with spatially variant templates
This work investigates the complexity of one-dimensional cellular neural network mosaic patterns with spatially variant templates on finite and infinite lattices. Various boundary conditions are considered for finite lattices, and the exact number of mosaic patterns is computed precisely. The entropy of mosaic patterns with periodic templates can also be calculated for infinite lattices. Furthermore, we show the abundance of mosaic patterns with respect to template periods, which differs greatly from cases with spatially invariant templates
[ "mosaic patterns", "CNN", "spatially variant templates", "one-dimensional cellular neural network", "infinite lattices", "finite lattices", "boundary conditions", "spatial entropy", "transition matrix" ]
[ "P", "P", "P", "P", "P", "P", "P", "R", "U" ]
1145
Mammogram synthesis using a 3D simulation. II. Evaluation of synthetic mammogram texture
We have evaluated a method for synthesizing mammograms by comparing the texture of clinical and synthetic mammograms. The synthesis algorithm is based upon simulations of breast tissue and the mammographic imaging process. Mammogram texture was synthesized by projections of simulated adipose tissue compartments. It was hypothesized that the synthetic and clinical texture have similar properties, assuming that the mammogram texture reflects the 3D tissue distribution. The size of the projected compartments was computed by mathematical morphology. The texture energy and fractal dimension were also computed and analyzed in terms of the distribution of texture features within four different tissue regions in clinical and synthetic mammograms. Comparison of the cumulative distributions of the mean features computed from 95 mammograms showed that the synthetic images simulate the mean features of the texture of clinical mammograms. Correlation of clinical and synthetic texture feature histograms, averaged over all images, showed that the synthetic images can simulate the range of features seen over a large group of mammograms. The best agreement with clinical texture was achieved for simulated compartments with radii of 4-13.3 mm in predominantly adipose tissue regions, and radii of 2.7-5.33 and 1.3-2.7 mm in retroareolar and dense fibroglandular tissue regions, respectively
[ "mammogram synthesis", "3D simulation", "synthetic mammogram texture", "adipose tissue compartments", "3D tissue distribution", "mathematical morphology", "fractal dimension", "cumulative distributions", "synthetic images", "dense fibroglandular tissue regions", "breast tissue simulation", "retroareolar tissue regions", "X-ray image acquisition", "computationally compressed phantom" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "M", "M" ]
1100
Evaluation of existing and new feature recognition algorithms. 2. Experimental results
For pt.1 see ibid., p.839-851. This is the second of two papers investigating the performance of general-purpose feature detection techniques. The first paper describes the development of a methodology to synthesize possible general feature detection face sets. Six algorithms resulting from the synthesis have been designed and implemented on a SUN Workstation in C++ using ACIS as the geometric modelling system. In this paper, extensive tests and comparative analysis are conducted on the feature detection algorithms, using carefully selected components from the public domain, mostly from the National Design Repository. The results show that the new and enhanced algorithms identify face sets that previously published algorithms cannot detect. The tests also show that each algorithm can detect, among other types, a certain type of feature that is unique to it. Hence, most of the algorithms discussed in this paper would have to be combined to obtain complete coverage
[ "feature recognition algorithms", "general-purpose feature detection techniques", "face sets", "National Design Repository", "convex hull", "concavity" ]
[ "P", "P", "P", "P", "U", "U" ]
93
Help-desk support is key to wireless success [finance]
A well thought out help desk can make or break an institution's mobile play. Schwab, Ameritrade and RBC are taking their support function seriously
[ "wireless", "finance", "help desk", "Schwab", "Ameritrade", "RBC" ]
[ "P", "P", "P", "P", "P", "P" ]
568
Modeling cutting temperatures for turning inserts with various tool geometries and materials
Temperatures are of interest in machining because cutting tools often fail by thermal softening or temperature-activated wear. Many models for cutting temperatures have been developed, but these models consider only simple tool geometries such as a rectangular slab with a sharp corner. This report describes a finite element study of tool temperatures in cutting that accounts for tool nose radius and included angle effects. A temperature correction factor model that can be used in the design and selection of inserts is developed to account for these effects. A parametric mesh generator is used to generate the finite element models of tool and inserts of varying geometries. The steady-state temperature response is calculated using NASTRAN solver. Several finite element analysis (FEA) runs are performed to quantify the effects of inserts included angle, nose radius, and materials for the insert and the tool holder on the cutting temperature at the insert rake face. The FEA results are then utilized to develop a temperature correction factor model that accounts for these effects. The temperature correction factor model is integrated with an analytical temperature model for rectangular inserts to predict cutting temperatures for contour turning with inserts of various shapes and nose radii. Finally, experimental measurements of cutting temperature using the tool-work thermocouple technique are performed and compared with the predictions of the new temperature model. The comparisons show good agreement
[ "turning inserts", "tool geometries", "machining", "tool nose radius", "temperature correction factor", "parametric mesh generator", "finite element models", "cutting temperature model", "insert shape effects" ]
[ "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
1178
Network-centric systems
The author describes a graduate-level course that addresses cutting-edge issues in network-centric systems while following a more traditional graduate seminar format
[ "network-centric systems", "graduate level course" ]
[ "P", "M" ]
1284
A linear time special case for MC games
MC games are infinite duration two-player games played on graphs. Deciding the winner in MC games is equivalent to the modal mu-calculus model checking. In this article we provide a linear time algorithm for a class of MC games. We show that, if all cycles in each strongly connected component of the game graph have at least one common vertex, the winner can be found in linear time. Our results hold also for parity games, which are equivalent to MC games
[ "linear time special case", "MC games", "two-player games", "modal mu-calculus model checking", "linear time algorithm" ]
[ "P", "P", "P", "P", "P" ]
694
A novel genetic algorithm for the design of a signed power-of-two coefficient quadrature mirror filter lattice filter bank
A novel genetic algorithm (GA) for the design of a canonical signed power-of-two (SPT) coefficient lattice structure quadrature mirror filter bank is presented. Genetic operations may render the SPT representation of a value noncanonical. A new encoding scheme is introduced to encode the SPT values. In this new scheme, the canonical property of the SPT values is preserved under genetic operations. Additionally, two new features that drastically improve the performance of our GA are introduced. (1) An additional level of natural selection is introduced to simulate the effect of natural selection when sperm cells compete to fertilize an ovule; this dramatically improves the offspring survival rate. A conventional GA is analogous to intracytoplasmic sperm injection and has an extremely low offspring survival rate, resulting in very slow convergence. (2) The probability of mutation for each codon of a chromosome is weighted by the reciprocal of its effect. Because of these new features, the performance of our new GA outperforms conventional GAs
[ "genetic algorithm", "quadrature mirror filter", "lattice filter bank", "encoding scheme", "natural selection", "offspring survival rate", "signed power-of-two coefficient lattice structure", "QMF", "chromosome codon", "signal processing", "perfect reconstruction" ]
[ "P", "P", "P", "P", "P", "P", "R", "U", "R", "U", "U" ]
1279
Place/Transition Petri net evolutions: recording ways, analysis and synthesis
Four semantic domains for Place/Transition Petri nets and their relationships are considered. They are monoids of respectively: firing sequences, processes, traces and dependence graphs. For each of them the analysis and synthesis problem is stated and solved. The monoid of processes is defined in a non-standard way. Nets under consideration involve weights of arrows and capacities (finite or infinite) of places. However, the analysis and synthesis tasks require nets to be pure, i.e. each of their transitions must have the pre-set and post-set disjoint
[ "place/transition Petri net evolutions", "semantic domains", "monoids", "firing sequences", "dependence graphs", "post-set disjoint", "pre-set disjoint" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
1185
Trading exchanges: online marketplaces evolve
Looks at how trading exchanges are evolving rapidly to help manufacturers keep up with customer demand
[ "trading exchanges", "online marketplaces", "manufacturers", "customer demand", "enterprise platforms", "supply chain management", "enterprise resource planning", "core software platform", "private exchanges", "integration technology", "middleware", "XML standards", "content management capabilities" ]
[ "P", "P", "P", "P", "U", "U", "U", "U", "M", "U", "U", "U", "U" ]
907
Development of an integrated and open-architecture precision motion control system
In this paper, the development of an integrated and open-architecture precision motion control system is presented. The control system is generally applicable, but it is developed with a particular focus on direct drive servo systems based on linear motors. The overall control system is comprehensive, comprising various selected control and instrumentation components, integrated within a configuration of hardware architecture centred around a dSPACE DS1004 DSP processor board. These components include a precision composite controller (comprising feedforward and feedback control), a disturbance observer, an adaptive notch filter, and a geometrical error compensator. The hardware architecture, software development platform, user interface, and all constituent control components are described
[ "open-architecture", "precision", "motion control", "direct drive servo systems", "linear motors", "composite controller", "feedforward", "feedback", "adaptive notch filter", "geometrical error compensation", "dSPACE DS1004 processor" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
144
Development of a 3.5 inch magneto-optical disk with a capacity of 2.3 GB
The recording capacity of GIGAMO media was enlarged from 1.3 GB to 2.3 GB for 3.5 inch magneto-optical (MO) disks while maintaining downward compatibility. For the new GIGAMO technology, a land and groove recording method was applied in addition to magnetically induced super resolution (MSR) media. Furthermore, a novel address format suitable for the land and groove recording method was adopted. The specifications of the new GIGAMO media were examined to satisfy requirements for practical use with respect to margins. Durability of more than 10^6 rewritings and a sufficient lifetime were confirmed
[ "3.5 inch", "magneto-optical disk", "2.3 GB", "recording capacity", "GIGAMO media", "magnetically induced super resolution", "MSR", "address format", "lifetime", "MO disks", "land-groove recording method", "rewriting durability", "crosstalk", "SiN-GdFeCo-GdFe-TbFeCo-SiN-Al" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "R", "U", "U" ]
595
Six common enterprise programming mistakes
Instead of giving you tips to use in your programming (at least directly), I want to look at some common mistakes made in enterprise programming. Instead of focusing on what to do, I want to look at what you should not do. Most programmers take books like mine and add in the good things, but they leave their mistakes in the very same programs! So I touch on several common errors I see in enterprise programming, and then briefly mention how to avoid those mistakes
[ "enterprise programming mistakes", "common errors", "data store", "database", "XML", "Enterprise JavaBeans", "vendor-specific programming" ]
[ "P", "P", "U", "U", "U", "M", "M" ]
942
Micro-optical realization of arrays of selectively addressable dipole traps: a scalable configuration for quantum computation with atomic qubits
We experimentally demonstrate novel structures for the realization of registers of atomic qubits: We trap neutral atoms in one- and two-dimensional arrays of far-detuned dipole traps obtained by focusing a red-detuned laser beam with a microfabricated array of microlenses. We are able to selectively address individual trap sites due to their large lateral separation of 125 µm. We initialize and read out different internal states for the individual sites. We also create two interleaved sets of trap arrays with adjustable separation, as required for many proposed implementations of quantum gate operations
[ "scalable configuration", "quantum computation", "atomic qubits", "registers", "neutral atoms", "far-detuned dipole traps", "red-detuned laser beam", "microfabricated array", "microlenses", "internal states", "quantum gate operations" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1364
An adaptive sphere-fitting method for sequential tolerance control
The machining of complex parts typically involves a logical and chronological sequence of n operations on m machine tools. Because manufacturing datums cannot always match design constraints, some of the design specifications imposed on the part are usually satisfied by distinct subsets of the n operations prescribed in the process plan. Conventional tolerance control specifies a fixed set point for each operation and a permissible variation about this set point to ensure compliance with the specifications, whereas sequential tolerance control (STC) uses real-time measurement information at the completion of one stage to reposition the set point for subsequent operations. However, it has been shown that earlier sphere-fitting methods for STC can lead to inferior solutions when the process distributions are skewed. This paper introduces an extension of STC that uses an adaptive sphere-fitting method that significantly improves the yield in the presence of skewed distributions as well as significantly reducing the computational effort required by earlier probabilistic search methods
[ "adaptive sphere-fitting method", "sequential tolerance control", "machine tools", "design constraints", "compliance", "real-time measurement information", "skewed distributions", "computational effort", "yield improvement" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
731
Aggregate bandwidth estimation in stored video distribution systems
Multimedia applications like video on demand, distance learning, Internet video broadcast, etc. will play a fundamental role in future broadband networks. A common aspect of such applications is the transmission of video streams that require a sustained relatively high bandwidth with stringent requirements of quality of service. In this paper various original algorithms for evaluating, in a video distribution system, a statistical estimation of aggregate bandwidth needed by a given number of smoothed video streams are proposed and discussed. The variable bit rate traffic generated by each video stream is characterized by its marginal distribution and by conditional probabilities between rates of temporary closed streams. The developed iterative algorithms evaluate an upper and lower bound of needed bandwidth for guaranteeing a given loss probability. The obtained results are compared with simulations and with other results, based on similar assumptions, already presented in the literature. Some considerations on the developed algorithms are made, in order to evaluate the effectiveness of the proposed methods
[ "aggregate bandwidth estimation", "stored video distribution systems", "multimedia applications", "video on demand", "distance learning", "Internet video broadcast", "broadband networks", "quality of service", "statistical estimation", "variable bit rate traffic", "marginal distribution", "conditional probabilities", "temporary closed streams", "iterative algorithms", "lower bound", "loss probability", "simulations", "video streams transmission", "upper bound", "VoD", "video coding", "QoS" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "U", "M", "U" ]
774
Keeping Web accessibility in mind: I&R services for all
After presenting three compelling reasons for making Web sites accessible to persons with a broad range of disabilities (it's the morally right thing to do, it's the smart thing to do from an economic perspective, and it's required by law), the author discusses design issues that impact persons with particular types of disabilities. She presents practical advice for assessing and addressing accessibility problems. An extensive list of resources for further information is appended, as is a list of sites which simulate the impact of specific accessibility problems on persons with disabilities
[ "Web site accessibility", "disabilities", "information and referral services" ]
[ "P", "P", "M" ]
1449
Raising the standard of management education for electronic commerce professionals
The teaching of electronic commerce in universities has become a growth industry in itself. The rapid expansion of electronic commerce programmes raises the question of what actually is being taught. The association of electronic commerce as primarily a technical or information technology (IT) phenomenon has not been sufficient to constrain it to IT and information systems departments. Business schools have been keen entrants into the electronic commerce coursework race and they are developing electronic commerce programmes in an environment where there is no agreed definition of the term. This paper draws on the work of Kenneth Boulding who argued that the dynamics of change in society are largely a product of changing skills and the way these skills are arranged into roles at the organizational level. It is argued that an overly technical interpretation of electronic commerce narrows the skills being acquired as part of formal education. Universities, under pressure from the market and technological change, are changing their roles resulting in a further narrowing of the breadth of issues that is seen as legitimate to be included as electronic commerce. The outcome is that aspiring electronic commerce professionals are not being exposed to a wide enough agenda of ideas and concepts that will assist them to make better business decisions
[ "electronic commerce professionals", "universities", "IT", "information technology", "information systems", "business schools", "Kenneth Boulding", "organizational level", "formal education", "management education standards improvement" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "M" ]
1098
Instability phenomena in the gas-metal arc welding self-regulation process
Arc instability is a very important determinant of weld quality. The instability behaviour of the gas-metal arc welding (GMAW) process is characterized by strong oscillations in arc length and current. In the paper, a model of the GMAW process is developed using an exact arc voltage characteristic. This model is used to study stability of the self-regulation process and to develop a simulation program that helps to understand the transient or dynamic nature of the GMAW process and relationships among current, electrode extension and contact tube-work distance. The process is shown to exhibit instabilities at both long electrode extension and normal extension. Results obtained from simulation runs of the model were also experimentally confirmed by the present author, as reported in this study. In order to explain the concept of the instability phenomena, the metal transfer mode and the arc voltage-current characteristic were examined. Based on this examination, the conclusion of this study is that their combined effects lead to the oscillations in arc current and length
[ "instability phenomena", "gas-metal arc welding", "self-regulation process", "arc instability", "weld quality", "GMAW process", "exact arc voltage characteristic", "metal transfer mode" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
1020
Supersampling multiframe blind deconvolution resolution enhancement of adaptive optics compensated imagery of low earth orbit satellites
We describe a postprocessing methodology for reconstructing undersampled image sequences with randomly varying blur that can provide image enhancement beyond the sampling resolution of the sensor. This method is demonstrated on simulated imagery and on adaptive-optics-(AO)-compensated imagery taken by the Starfire Optical Range 3.5-m telescope that has been artificially undersampled. Also shown are the results of multiframe blind deconvolution of some of the highest quality optical imagery of low earth orbit satellites collected with a ground-based telescope to date. The algorithm used is a generalization of multiframe blind deconvolution techniques that include a representation of spatial sampling by the focal plane array elements based on a forward stochastic model. This generalization enables the random shifts and shape of the AO-compensated point spread function (PSF) to be used to partially eliminate the aliasing effects associated with sub-Nyquist sampling of the image by the focal plane array. The method could be used to reduce resolution loss that occurs when imaging in wide-field-of-view (FOV) modes
[ "supersampling multiframe blind deconvolution resolution enhancement", "multiframe blind deconvolution", "adaptive optics compensated imagery", "low earth orbit satellites", "postprocessing methodology", "randomly varying blur", "image enhancement", "simulated imagery", "ground-based telescope", "spatial sampling", "focal plane array elements", "forward stochastic model", "random shifts", "AO-compensated point spread function", "aliasing effects", "sub-Nyquist sampling", "resolution loss", "undersampled image sequence reconstruction", "sensor sampling resolution", "Starfire Optical Range telescope", "wide-field-of-view modes", "3.5 m" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "R", "U" ]
1065
Quantum universal variable-length source coding
We construct an optimal quantum universal variable-length code that achieves the admissible minimum rate, i.e., our code can be used for any probability distribution of quantum states. Its probability of exceeding the admissible minimum rate exponentially goes to 0. Our code is optimal in the sense of its exponent. In addition, its average error asymptotically tends to 0
[ "quantum universal variable-length source coding", "optimal quantum universal variable-length code", "admissible minimum rate", "probability distribution", "quantum states", "exponent", "average error", "quantum information theory", "quantum cryptography", "optimal code" ]
[ "P", "P", "P", "P", "P", "P", "P", "M", "M", "R" ]
8
New investors get steal of a deal [Global Crossing]
Hutchison Telecommunications and Singapore Technologies take control of Global Crossing for a lot less money than they originally offered. The deal leaves the bankrupt carrier intact, but doesn't put it in the clear just yet
[ "Global Crossing", "Hutchison Telecommunications", "Singapore Technologies", "bankrupt" ]
[ "P", "P", "P", "P" ]
923
Design and manufacture of a lightweight piezo-composite curved actuator
In this paper we are concerned with the design, manufacture and performance test of a lightweight piezo-composite curved actuator (called LIPCA) using a top carbon fiber composite layer with near-zero coefficient of thermal expansion (CTE), a middle PZT ceramic wafer, and a bottom glass/epoxy layer with a high CTE. The main point of the design for LIPCA is to replace the heavy metal layers of THUNDER TM by lightweight fiber reinforced plastic layers without losing the capabilities for generating high force and large displacement. It is possible to save up to about 40% of the weight if we replace the metallic backing material by the light fiber composite layer. We can also have design flexibility by selecting the fiber direction and the size of prepreg layers. In addition to the lightweight advantage and design flexibility, the proposed device can be manufactured without adhesive layers when we use an epoxy resin prepreg system. Glass/epoxy prepregs, a ceramic wafer with electrode surfaces, and a carbon prepreg were simply stacked and cured at an elevated temperature (177 degrees C) after following an autoclave bagging process. We found that the manufactured composite laminate device had a sufficient curvature after being detached from a flat mould. An analysis method using the classical lamination theory is presented to predict the curvature of LIPCA after curing at an elevated temperature. The predicted curvatures are in quite good agreement with the experimental values. In order to investigate the merits of LIPCA, performance tests of both LIPCA and THUNDER TM have been conducted under the same boundary conditions. From the experimental actuation tests, it was observed that the developed actuator could generate larger actuation displacement than THUNDER TM
[ "lightweight piezo-composite curved actuator", "performance test", "performance test", "LIPCA", "carbon fiber composite layer", "near-zero coefficient of thermal expansion", "PZT ceramic wafer", "glass/epoxy layer", "THUNDER", "fiber reinforced plastic layers", "177 degrees C", "predicted curvatures", "boundary conditions", "performance tests", "177 degC" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M" ]
966
Controlling in between the Lorenz and the Chen systems
This letter investigates a new chaotic system and its role as a joint function between two complex chaotic systems, the Lorenz and the Chen systems, using a simple variable constant controller. With the gradual tuning of the controller, the controlled system evolves from the canonical Lorenz attractor to the Chen attractor through the new transition chaotic attractor. This evolving procedure reveals the forming mechanisms of all similar and closely related chaotic systems, and demonstrates that a simple control technique can be very useful in generating and analyzing some complex chaotic dynamical phenomena
[ "Chen system", "tuning", "Lorenz attractor", "Chen attractors", "transition chaotic attractor", "Lorenz system" ]
[ "P", "P", "P", "P", "P", "R" ]
125
A fast implementation of correlation of long data sequences for coherent receivers
Coherent reception depends upon matching of phase between the transmitted and received signal. Fast convolution techniques based on fast Fourier transform (FFT) are widely used for extracting time delay information from such matching. The latency in processing a large data window of the received signal is a serious overhead for mission critical real time applications. The implementation of a parallel algorithm for correlation of long data sequences in multiprocessor environment is demonstrated here. The algorithm does processing while acquiring the received signal and reduces the computation overhead considerably because of inherent parallelism
[ "correlation", "long data sequences", "coherent receivers", "received signal", "fast Fourier transform", "time delay information", "latency", "mission critical real time applications", "parallel algorithm", "multiprocessor environment", "computation", "transmitted signal" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
77
Modeling frequently accessed wireless data with weak consistency
To reduce the response times of wireless data access in a mobile network, caches are utilized in wireless handheld devices. If the original data entry has been updated, the cached data in the handheld device becomes stale. Thus, a mechanism is required to predict when the cached copy will expire. This paper studies a weakly consistent data access mechanism that computes the time-to-live (TTL) interval to predict the expiration time. We propose an analytic model to investigate this TTL-based algorithm for frequently accessed data. The analytic model is validated against simulation experiments. Our study quantitatively indicates how the TTL-based algorithm reduces the wireless communication cost by increasing the probability of stale accesses. Depending on the requirements of the application, appropriate parameter values can be selected based on the guidelines provided
[ "weak consistency", "wireless data access", "mobile network", "caches", "wireless handheld devices", "data entry", "analytic model", "simulation experiments", "wireless communication cost", "frequently accessed wireless data modeling", "response time reduction", "time-to-live interval", "expiration time prediction", "stale access probability" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "R", "R", "R" ]
608
How closely can a personal computer clock track the UTC timescale via the Internet?
Nowadays many software packages allow you to keep the clock of your personal computer synchronized to time servers spread over the internet. We present how a didactic laboratory can evaluate, in a statistical sense, the minimum synch error of this process (the other extreme, the maximum, is guaranteed by the code itself). The measurement set-up utilizes the global positioning system satellite constellation in 'common view' between two similar timing stations: one acts as a time server for the other, so the final timing difference at the second station represents the total synch error through the internet. Data recorded over batches of 10000 samples show a typical RMS value of 35 ms. This measurement configuration allows students to obtain a much better understanding of the synch task and pushes them, at all times, to look for an experimental verification of data results, even when they come from the most sophisticated 'black boxes' now readily available off the shelf
[ "personal computer clock", "UTC timescale", "internet", "software packages", "time servers", "didactic laboratory", "statistical sense", "synch error", "global positioning system satellite constellation", "final timing difference", "black boxes" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1218
Knowledge acquisition for expert systems in accounting and financial problem domains
Since the mid-1980s, expert systems have been developed for a variety of problems in accounting and finance. The most commonly cited problems in developing these systems are the unavailability of the experts and knowledge engineers and difficulties with the rule extraction process. Within the field of artificial intelligence, this has been called the 'knowledge acquisition' (KA) problem and has been identified as a major bottleneck in the expert system development process. Recent empirical research reveals that certain KA techniques are significantly more efficient than others in helping to extract certain types of knowledge within specific problem domains. This paper presents a mapping between these empirical studies and a generic taxonomy of expert system problem domains. To accomplish this, we first examine the range of problem domains and suggest a mapping of accounting and finance tasks to a generic problem domain taxonomy. We then identify and describe the most prominent KA techniques employed in developing expert systems in accounting and finance. After examining and summarizing the existing empirical KA work, we conclude by showing how the empirical KA research in the various problem domains can be used to provide guidance to developers of expert systems in the fields of accounting and finance
[ "knowledge acquisition", "expert systems", "accounting", "finance", "rule extraction process", "artificial intelligence", "problem domain taxonomy" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
1119
A component-based software configuration management model and its supporting system
Software configuration management (SCM) is an important key technology in software development. Component-based software development (CBSD) is an emerging paradigm in software development. However, to apply CBSD effectively in real world practice, supporting SCM in CBSD needs to be further investigated. In this paper, the objects that need to be managed in CBSD are analyzed and a component-based SCM model is presented. In this model, components, as the integral logical constituents in a system, are managed as the basic configuration items in SCM, and the relationships between/among components are defined and maintained. Based on this model, a configuration management system is implemented
[ "component-based software configuration management model", "software development", "integral logical constituents", "software reuse", "version control" ]
[ "P", "P", "P", "M", "U" ]
1004
Games machines play
Individual rationality, or doing what is best for oneself, is a standard model used to explain and predict human behavior, and von Neumann-Morgenstern game theory is the classical mathematical formalization of this theory in multiple-agent settings. Individual rationality, however, is an inadequate model for the synthesis of artificial social systems where cooperation is essential, since it does not permit the accommodation of group interests other than as aggregations of individual interests. Satisficing game theory is based upon a well-defined notion of being good enough, and does accommodate group as well as individual interests through the use of conditional preference relationships, whereby a decision maker is able to adjust its preferences as a function of the preferences, and not just the options, of others. This new theory is offered as an alternative paradigm to construct artificial societies that are capable of complex behavior that goes beyond exclusive self interest
[ "individual rationality", "human behavior", "game theory", "multiple-agent", "artificial social systems", "cooperation", "conditional preference relationships", "artificial societies", "self interest", "decision theory", "group rationality" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
1041
Fractional differentiation in passive vibration control
From a single-degree-of-freedom model used to illustrate the concept of vibration isolation, a method to transform the design for a suspension into a design for a robust controller is presented. Fractional differentiation is used to model the viscoelastic behaviour of the suspension. The use of fractional differentiation not only permits optimisation of just four suspension parameters, showing the 'compactness' of the fractional derivative operator, but also leads to robustness of the suspension's performance to uncertainty of the sprung mass. As an example, an engine suspension is studied
[ "fractional differentiation", "passive vibration control", "vibration isolation", "suspension", "robust controller", "viscoelastic behaviour", "sprung mass", "engine suspension" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
886
A fractional-flow model of serial manufacturing systems with rework and its reachability and controllability properties
A dynamic fractional-flow model of a serial manufacturing system incorporating rework is considered. Using some results on reachability and controllability of positive linear systems the ability of serial manufacturing systems with rework to "move in space", that is their reachability and controllability properties, are studied. These properties are important not only for optimising the performance of the manufacturing system, possibly off-line, but also to improve its functioning by using feedback control online
[ "serial manufacturing systems", "rework", "reachability", "controllability", "dynamic fractional-flow model", "positive linear systems", "feedback control", "performance optimisation" ]
[ "P", "P", "P", "P", "P", "P", "P", "R" ]
1428
Syndicators turn to the enterprise
Syndicators have started reshaping their offerings, products, and services for a marketplace looking for enterprise-wide content syndication technology and services. Syndication companies are turning themselves into infrastructure companies. Many syndication companies are now focusing their efforts on enterprise clients instead of the risky dot coms
[ "enterprise-wide content syndication technology", "infrastructure companies", "enterprise clients", "business model", "aggregator", "business Web sites", "customer base" ]
[ "P", "P", "P", "U", "U", "U", "U" ]
1305
Learning nonregular languages: a comparison of simple recurrent networks and LSTM
Rodriguez (2001) examined the learning ability of simple recurrent nets (SRNs) (Elman, 1990) on simple context-sensitive and context-free languages. In response to Rodriguez's (2001) article, we compare the performance of simple recurrent nets and long short-term memory recurrent nets on context-free and context-sensitive languages
[ "LSTM", "context-free languages", "performance", "short-term memory recurrent nets", "context-sensitive languages", "nonregular language learning", "recurrent neural networks" ]
[ "P", "P", "P", "P", "P", "R", "M" ]
1340
Orthogonal decompositions of complete digraphs
A family G of isomorphic copies of a given digraph G is said to be an orthogonal decomposition of the complete digraph D_n by G, if every arc of D_n belongs to exactly one member of G and the union of any two different elements from G contains precisely one pair of reverse arcs. Given a digraph H, an H-family mH is the vertex-disjoint union of m copies of H. In this paper, we consider orthogonal decompositions by H-families. Our objective is to prove the existence of such an orthogonal decomposition whenever certain necessary conditions hold and m is sufficiently large
[ "orthogonal decompositions", "complete digraphs", "isomorphic copies", "vertex-disjoint union", "necessary conditions" ]
[ "P", "P", "P", "P", "P" ]
715
The quadratic 0-1 knapsack problem with series-parallel support
We consider various special cases of the quadratic 0-1 knapsack problem (QKP) for which the underlying graph structure is fairly simple. For the variant with edge series-parallel graphs, we give a dynamic programming algorithm with pseudo-polynomial time complexity, and a fully polynomial time approximation scheme. In strong contrast to this, the variant with vertex series-parallel graphs is shown to be strongly NP-complete
[ "quadratic 0-1 knapsack problem", "series-parallel support", "underlying graph structure", "dynamic programming algorithm", "pseudo-polynomial time complexity", "fully polynomial time approximation scheme", "NP-complete problem" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
750
Automated cerebrum segmentation from three-dimensional sagittal brain MR images
We present a fully automated cerebrum segmentation algorithm for full three-dimensional sagittal brain MR images. First, cerebrum segmentation from a midsagittal brain MR image is performed utilizing landmarks, anatomical information, and a connectivity-based threshold segmentation algorithm as previously reported. Recognizing that the cerebrum in laterally adjacent slices tends to have similar size and shape, we use the cerebrum segmentation result from the midsagittal brain MR image as a mask to guide cerebrum segmentation in adjacent lateral slices in an iterative fashion. This masking operation yields a masked image (preliminary cerebrum segmentation) for the next lateral slice, which may truncate brain region(s). Truncated regions are restored by first finding end points of their boundaries, by comparing the mask image and masked image boundaries, and then applying a connectivity-based algorithm. The resulting final extracted cerebrum image for this slice is then used as a mask for the next lateral slice. The algorithm yielded satisfactory fully automated cerebrum segmentations in three-dimensional sagittal brain MR images, and had performance superior to conventional edge detection algorithms for segmentation of cerebrum from 3D sagittal brain MR images
[ "fully automated cerebrum segmentation algorithm", "midsagittal brain MR image", "landmarks", "anatomical information", "connectivity-based threshold segmentation algorithm", "laterally adjacent slices", "masking operation", "masked image boundaries", "connectivity-based algorithm", "full 3D sagittal brain MR images", "brain region truncation", "boundary end points" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
846
Female computer science doctorates: what does the survey of earned doctorates reveal?
Based on the National Center for Education Statistics (2000), in the 1997-1998 academic year 26.7% of earned bachelor's degrees, 29.0% of earned master's degrees and 16.3% of earned doctoral degrees in computer science were awarded to women. As these percentages suggest, women are underrepresented at all academic levels in computer science (Camp, 1997). The most severe shortage occurs at the top level: the doctorate in computer science. We know very little about the women who persist to the top level of academic achievement in computer science. This paper examines a subset of data collected through the Survey of Earned Doctorates (SED). The specific focus of this paper is to identify trends that have emerged from the SED with respect to females completing doctorates in computer science between the academic years 1990-1991 and 1999-2000. Although computer science doctorates include doctorates in information science, prior research (Camp, 1997) suggests that the percentage of women completing doctorates in information science as compared to computer science is low. The specific research questions are: 1. How does the percentage of women who complete doctorates in computer science compare to those that complete doctorates in other fields? 2. How does the length of time in school and the sources of funding differ for females as compared to males who complete doctorates in computer science? 3. Where do women go after completing doctorates in computer science and what positions do they acquire? How do these experiences differ from their male peers?
[ "female computer science doctorates", "Survey of Earned Doctorates", "information science" ]
[ "P", "P", "P" ]
803
The mutual effects of grid and wind turbine voltage stability control
This note considers the results of wind turbine modelling and power system stability investigations. Voltage stability of the power grid with grid-connected wind turbines will be improved by using blade angle control for a temporary reduction of the wind turbine power during and shortly after a short circuit fault in the grid
[ "wind turbine voltage stability control", "wind turbine modelling", "power system stability", "power grid", "grid-connected wind turbines", "blade angle control", "short circuit fault", "grid voltage stability control", "wind turbine power reduction", "offshore wind turbines" ]
[ "P", "P", "P", "P", "P", "P", "P", "R", "R", "M" ]
1415
The disconnect continues [digital content providers]
The relationships between the people who buy digital content and those who sell it are probably more acrimonious than ever before, says Dick Curtis, a director and lead analyst for the research firm Outsell Inc., where he covers econtent contract and negotiation strategies. Several buyers agree with his observation. They cite aggressive sales tactics, an unwillingness to deliver content in formats buyers need, a reluctance to provide licensing terms that take into account the structure of today's corporations, and inadequate service and support as a few of the factors underlying the acrimony. Still, many buyers remain optimistic that compromises can be reached on some of these issues. But first, they say, sellers must truly understand the econtent needs of today's enterprises
[ "digital content", "econtent contract", "sales tactics", "econtent negotiation", "econtent buyers", "news databases", "Web site" ]
[ "P", "P", "P", "R", "R", "U", "U" ]
1081
Stability of W-methods with applications to operator splitting and to geometric theory
We analyze the stability properties of W-methods applied to the parabolic initial value problem u' + Au = Bu. We work in an abstract Banach space setting, assuming that A is the generator of an analytic semigroup and that B is relatively bounded with respect to A. Since W-methods treat the term with A implicitly, whereas the term involving B is discretized in an explicit way, they can be regarded as splitting methods. As an application of our stability results, convergence for nonsmooth initial data is shown. Moreover, the layout of a geometric theory for discretizations of semilinear parabolic problems u' + Au = f (u) is presented
[ "operator splitting", "geometric theory", "parabolic initial value problem", "abstract Banach space", "analytic semigroup", "nonsmooth initial data", "W-methods stability", "linearly implicit Runge-Kutta methods" ]
[ "P", "P", "P", "P", "P", "P", "R", "M" ]
1450
Networking in the palm of your hand [PDA buyer's guide]
As PDAs move beyond the personal space and into the enterprise, you need to get a firm grip on the options available for your users. What operating system do you choose? What features do you and your company need? How will these devices fit into the existing corporate infrastructure? What about developer support?
[ "PDAs", "buyer's guide", "operating system", "corporate infrastructure", "developer support" ]
[ "P", "P", "P", "P", "P" ]
1338
The chromatic spectrum of mixed hypergraphs
A mixed hypergraph is a triple H = (X, C, D), where X is the vertex set, and each of C, D is a list of subsets of X. A strict k-coloring of H is a surjection c : X -> {1,..., k} such that each member of C has two vertices assigned a common value and each member of D has two vertices assigned distinct values. The feasible set of H is {k: H has a strict k-coloring}. Among other results, we prove that a finite set of positive integers is the feasible set of some mixed hypergraph if and only if it omits the number 1 or is an interval starting with 1. For the set {s, t} with 2 <= s <= t - 2, the smallest realization has 2t - s vertices. When every member of C union D is a single interval in an underlying linear order on the vertices, the feasible set is also a single interval of integers
[ "chromatic spectrum", "mixed hypergraphs", "mixed hypergraphs", "vertex set", "strict k-coloring", "positive integers", "mixed hypergraph" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
728
Questioning the RFP process [telecom]
In the current climate, the most serious concern about the purchasing habits of telecom carriers is obviously the lack of spending. Even against a backdrop of economic constraints and financial struggles, however, genuine concerns about the purchasing process itself are being raised by some of those closest to it
[ "telecom carriers", "purchasing process", "sales cycle", "request for information", "request for proposal" ]
[ "P", "P", "U", "U", "U" ]
790
Data assimilation of local model error forecasts in a deterministic model
Among the most popular data assimilation techniques in use today are those of the Kalman filter type, which provide an improved estimate of the state of a system up to the current time level, based on actual measurements. From a forecasting viewpoint, this corresponds to an updating of the initial conditions. The standard forecasting procedure is to then run the model uncorrected into the future, driven by predicted boundary and forcing conditions. The problem with this methodology is that the updated initial conditions quickly 'wash out', thus, after a certain forecast horizon the model predictions are no better than from an initially uncorrected model. This study demonstrates that through the assimilation of error forecasts (in the present case made using so-called local models) entire model domains can be corrected for extended forecast horizons (i.e. long after updated initial conditions have become washed out), thus demonstrating significant improvements over the conventional methodology. Some alternate uses of local models are also explored for the re-distribution of error forecasts over the entire model domain, which are then compared with more conventional Kalman filter type schemes
[ "data assimilation", "local model error forecasts", "deterministic model", "Kalman filter", "forcing conditions", "forecast horizon", "error prediction", "hydrodynamic modelling" ]
[ "P", "P", "P", "P", "P", "P", "R", "M" ]
1380
A feature-preserving volumetric technique to merge surface triangulations
Several extensions and improvements to surface merging procedures based on the extraction of isosurfaces from a distance map defined on an adaptive background grid are presented. The main objective is to extend the application of these algorithms to surfaces with sharp edges and corners. In order to deal with objects of different length scales, the initial background grids are created using a Delaunay triangulation method and local voxelizations. A point enrichment technique that introduces points into the background grid along detected surface features such as ridges is used to ensure that these features are preserved in the final merged surface. The surface merging methodology is extended to include other Boolean operations between surface triangulations. The iso-surface extraction algorithms are modified to obtain the correct iso-surface for multi-component objects. The procedures are demonstrated with various examples, ranging from simple geometrical entities to complex engineering applications. The present algorithms allow realistic modelling of a large number of complex engineering geometries using overlapping components defined discretely, i.e. via surface triangulations. This capability is very useful for grid generation starting from data originated in measurements or images
[ "feature-preserving volumetric technique", "merge surface triangulations", "surface triangulations", "surface merging procedures", "adaptive background grid", "sharp edges", "Delaunay triangulation method", "local voxelizations", "ridges", "Boolean operations", "iso-surfaces extraction", "multi-component objects", "simple geometrical entities", "complex engineering applications", "overlapping components", "images", "mesh generation", "unstructured grids", "discrete data", "surface intersection", "geometric modelling", "sharp comers", "point enrichment technique background grid", "arterial surfaces", "haemoglobin molecule" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "M", "R", "M", "R", "R", "R", "M", "U" ]
1039
Design of an adaptive vibration absorber to reduce electrical transformer structural vibration
This paper considers the design of a vibration absorber to reduce structural vibration at multiple frequencies, with an enlarged bandwidth control at these target frequencies. While the basic absorber is a passive device a control system has been added to facilitate tuning, effectively giving the combination of a passive and active device, which leads to far greater stability and robustness. Experimental results demonstrating the effectiveness of the absorber are also described
[ "adaptive vibration absorber", "electrical transformer", "structural vibration", "bandwidth control" ]
[ "P", "P", "P", "P" ]
571
Control of transient thermal response during sequential open-die forging: a trajectory optimization approach
A trajectory optimization approach is applied to the design of a sequence of open-die forging operations in order to control the transient thermal response of a large titanium alloy billet. The amount of time the billet is soaked in furnace prior to each successive forging operation is optimized to minimize the total process time while simultaneously satisfying constraints on the maximum and minimum values of the billet temperature distribution to avoid microstructural defects during forging. The results indicate that a "differential" heating profile is the most effective at meeting these design goals
[ "open-die forging", "trajectory optimization", "titanium alloy billet", "temperature distribution", "microstructural defects", "heating profile", "transient thermal response control" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
1161
Model theory for hereditarily finite superstructures
We study model-theoretic properties of hereditarily finite superstructures over models of not more than countable signatures. We answer in the negative the question of whether theories of hereditarily finite superstructures which have a unique (up to isomorphism) hereditarily finite superstructure can be described via definable functions. Yet theories for such superstructures admit a description in terms of iterated families TF and SF. These are constructed using a definable union taken over countable ordinals in the subsets which are unions of finitely many complete subsets and of finite subsets, respectively. Simultaneously, we describe theories that share a unique (up to isomorphism) countable hereditarily finite superstructure
[ "model theory", "model-theoretic properties", "countable signatures", "iterated families", "definable union", "finitely many complete subsets", "countable hereditarily finite superstructure" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
1124
Data extraction from the Web based on pre-defined schema
With the development of the Internet, the World Wide Web has become an invaluable information source for most organizations. However, most documents available from the Web are in HTML form, which was originally designed for document formatting with little consideration of its contents. Effectively extracting data from such documents remains a nontrivial task. In this paper, we present a schema-guided approach to extracting data from HTML pages. Under the approach, the user defines a schema specifying what to be extracted and provides sample mappings between the schema and the HTML page. The system will induce the mapping rules and generate a wrapper that takes the HTML page as input and produces the required data in the form of XML conforming to the user-defined schema. A prototype system implementing the approach has been developed. The preliminary experiments indicate that the proposed semi-automatic approach is not only easy to use but also able to produce a wrapper that extracts required data from inputted pages with high accuracy
[ "data extraction", "schema", "Internet", "information source", "HTML", "wrapper generation", "data integration", "distributed database", "queries" ]
[ "P", "P", "P", "P", "P", "R", "M", "U", "U" ]
118
Sensorless control of induction motor drives
Controlled induction motor drives without mechanical speed sensors at the motor shaft have the attractions of low cost and high reliability. To replace the sensor the information on the rotor speed is extracted from measured stator voltages and currents at the motor terminals. Vector-controlled drives require estimating the magnitude and spatial orientation of the fundamental magnetic flux waves in the stator or in the rotor. Open-loop estimators or closed-loop observers are used for this purpose. They differ with respect to accuracy, robustness, and sensitivity against model parameter variations. Dynamic performance and steady-state speed accuracy in the low-speed range can be achieved by exploiting parasitic effects of the machine. The overview in this paper uses signal flow graphs of complex space vector quantities to provide an insightful description of the systems used in sensorless control of induction motors
[ "sensorless control", "induction motor drives", "reliability", "stator voltages", "vector-controlled drives", "magnitude", "spatial orientation", "fundamental magnetic flux waves", "open-loop estimators", "closed-loop observers", "robustness", "sensitivity", "model parameter variations", "steady-state speed accuracy", "parasitic effects", "signal flow graphs", "space vector quantities", "stator currents" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
635
Detection and estimation of abrupt changes in the variability of a process
Detection of change-points in normal means is a well-studied problem. The parallel problem of detecting changes in variance has had less attention. The form of the generalized likelihood ratio test statistic has long been known, but its null distribution resisted exact analysis. In this paper, we formulate the change-point problem for a sequence of chi-square random variables. We describe a procedure that is exact for the distribution of the likelihood ratio statistic for all even degrees of freedom, and gives upper and lower bounds for odd (and also for non-integer) degrees of freedom. Both the liberal and conservative bounds for chi_1^2 (one degree of freedom) are shown through simulation to be reasonably tight. The important problem of testing for change in the normal variance of individual observations corresponds to the chi_1^2 case. The non-null case is also covered, and confidence intervals for the true change point are derived. The methodology is illustrated with an application to quality control in a deep level gold mine. Other applications include ambulatory monitoring of medical data and econometrics
[ "generalized likelihood ratio test statistic", "distribution", "sequence", "chi-square random variables", "even degrees of freedom", "lower bounds", "conservative bounds", "simulation", "individual observations", "confidence intervals", "quality control", "deep level gold mine", "ambulatory monitoring", "medical data", "econometrics", "abrupt change detection", "abrupt change estimation", "process variability", "upper bounds", "odd degrees of freedom", "noninteger degrees of freedom", "liberal bounds", "non null case" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "R", "R", "M", "R", "M" ]
1260
A dataflow computer which accelerates execution of sequential programs by precedent firing instructions
In a dataflow machine, it is important to avoid performance degradation in sequential processing, and, from the viewpoint of hardware scale, to reduce the number of waiting operands. This paper demonstrates that processing performance is degraded by sequential processing in the switching process and proposes precedent firing control as a remedy. A simulation shows that precedent firing control reduces both the execution time and the total number of waiting operands. The hardware scale is then examined as an evaluation of precedent firing control
[ "dataflow computer", "sequential programs", "precedent firing instructions", "hardware scale", "waiting operands", "processing performance", "switching process", "precedent firing control", "execution time", "execution acceleration", "parallel processing", "computer architecture" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "M" ]
1225
BT voices its support for IP
BTexact's chief technology officer, Mick Reeve, gives his views on the future for voice over DSL services and virtual private networks, and defends the slow rollout of public access WLANs
[ "BTexact", "voice over DSL", "virtual private networks", "public access WLANs" ]
[ "P", "P", "P", "P" ]