Columns: id (string, 1-4 chars), title (string, 13-200 chars), abstract (string, 67-2.93k chars), keyphrases (sequence), prmu (sequence)
1217
A knowledge-based approach for managing urban infrastructures
This paper presents a knowledge-based approach dedicated to the efficient management, regulation, and interactive, dynamic monitoring of urban infrastructures. This approach identifies the data and related treatments common to several municipal activities and defines the requirements and functionalities of the computer tools developed to improve the delivery and coordination of municipal services to the population. The resulting cooperative system, called SIGIU, is composed of a set of integrated operating systems (SYDEX) and the global planning and coordination system (SYGEC). The objective is to integrate the set of SYDEX and the SYGEC into a single coherent system for all of SIGIU's users according to their tasks, their roles, and their responsibilities within the municipal administration. SIGIU is fed by different measurement and monitoring instruments installed on the system elements to be supervised. In this context, the information can be presented in different forms: video, pictures, data and alarms. One of SIGIU's objectives is the real-time management of urban infrastructures' control mechanisms. To carry out this process, the alarm control agent creates a mobile agent associated with the alarm, which is sent to a mobile station and warns an operator. Preliminary implementation results show that SIGIU effectively and efficiently supports the decision making process related to managing urban infrastructures
[ "knowledge-based approach", "regulation", "dynamic monitoring", "municipal activities", "cooperative system", "SIGIU", "integrated operating systems", "SYDEX", "coordination system", "SYGEC", "video", "real-time management", "alarm control agent", "mobile agent", "urban infrastructure management", "global planning system", "intelligent decision support system", "urban planning", "multi-agent systems" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "M", "R", "M" ]
642
Reconstruction of MR images from data acquired on an arbitrary k-space trajectory using the same-image weight
A sampling density compensation function denoted "same-image (SI) weight" is proposed to reconstruct MR images from the data acquired on an arbitrary k-space trajectory. An equation for the SI weight is established on the SI criterion and an iterative scheme is developed to find the weight. The SI weight is then used to reconstruct images from the data calculated on a random trajectory in a numerical phantom case and from the data acquired on interleaved spirals in an in vivo experiment, respectively. In addition, Pipe and Menon's weight (MRM 1999;41:179-186) is also used in the reconstructions to make a comparison. The images obtained with the SI weight were found to be slightly more accurate than those obtained with Pipe's weight
[ "arbitrary k-space trajectory", "same-image weight", "sampling density compensation", "random trajectory", "numerical phantom", "MRI image reconstruction", "spiral trajectory", "convolution function", "Nyquist sampling conditions", "iterative algorithm", "weighting function" ]
[ "P", "P", "P", "P", "P", "M", "R", "M", "M", "M", "R" ]
607
A building block approach to automated engineering
Shenandoah Valley Electric Cooperative (SVEC, Mt. Crawford, Virginia, US) recognized the need to automate engineering functions and create an interactive model of its distribution system in the early 1990s. It had used Milsoft's DA software for more than 10 years to make engineering studies, and had a Landis and Gyr SCADA system and a hybrid load management system for controlling water heater switches. With the development of GIS and facilities management (FM) applications, SVEC decided these should be the basis for an information system that would model its physical plant and interface with its accounting and billing systems. It could add applications such as outage management, staking, line design and metering to use this information and interface with these databases. However, given SVEC's size, it was not feasible to implement a sophisticated and expensive GIS/FM system. Over the past nine years, SVEC has had success with a building block approach, and its customers and employees are realizing the benefits of the automated applications. This building block approach, including the GIS, outage management system, MapViewer and a staking package, is discussed in this article, along with the lessons learned and future expansion
[ "building block approach", "GIS", "Shenandoah Valley Electric Cooperative", "interactive model", "distribution system", "billing systems", "outage management", "staking", "line design", "metering", "databases", "MapViewer", "engineering functions automation" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
78
Applying genetic algorithms to solve the fuzzy optimal profit problem
This study investigated the application of genetic algorithms in solving a fuzzy optimization problem that arises in business and economics. In this problem, a fuzzy price is determined using a linear or a quadratic fuzzy demand function as well as a linear cost function. The objective is to find the optimal fuzzy profit, which is derived from the fuzzy price and fuzzy cost. Traditional methods for solving this problem are (1) the extension principle, and (2) interval arithmetic with alpha-cuts. However, we argue that traditional methods for solving this problem are too restrictive to produce an optimal solution, and that an alternative approach is possibly needed. We use genetic algorithms to obtain an approximate solution for this fuzzy optimal profit problem without using membership functions. We not only give empirical examples to show the effectiveness of this approach, but also give theoretical proofs to validate the correctness of the algorithm. We conclude that genetic algorithms can produce good approximate solutions when applied to solve fuzzy optimization problems
[ "genetic algorithms", "fuzzy optimal profit problem", "fuzzy optimization problem", "business", "economics", "fuzzy price", "quadratic fuzzy demand function", "linear cost function", "approximate solution", "theoretical proofs", "linear fuzzy demand function", "algorithm correctness validation" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
1009
Robust output feedback model predictive control using off-line linear matrix inequalities
A fundamental question about model predictive control (MPC) is its robustness to model uncertainty. In this paper, we present a robust constrained output feedback MPC algorithm that can stabilize plants with both polytopic uncertainty and norm-bound uncertainty. The design procedure involves off-line design of a robust constrained state feedback MPC law and a state estimator using linear matrix inequalities (LMIs). Since we employ an off-line approach for the controller design which gives a sequence of explicit control laws, we are able to analyze the robust stabilizability of the combined control laws and estimator, and by adjusting the design parameters, guarantee robust stability of the closed-loop system in the presence of constraints. The algorithm is illustrated with two examples
[ "robust output feedback model predictive control", "off-line linear matrix inequalities", "robust constrained output feedback MPC algorithm", "polytopic uncertainty", "norm-bound uncertainty", "robust constrained state feedback MPC law", "state estimator", "closed-loop system", "model uncertainty robustness", "controller design procedure", "explicit control law sequence", "asymptotically stable invariant ellipsoid" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "U" ]
718
New water management system begins operation at US projects
The US Army Corps of Engineers has developed a new automated information system to support its water control management mission. The new system provides a variety of decision support tools, enabling water control managers to acquire, transform, verify, store, display, analyse, and disseminate data and information efficiently and around the clock
[ "water management system", "US projects", "US Army Corps of Engineers", "automated information system", "water control management mission", "water control managers", "decision support tools", "watershed modelling", "data dissemination", "data acquisition", "data storage", "data verification", "data display", "data analysis", "data visualization", "decision support system", "Corps Water Management System" ]
[ "P", "P", "P", "P", "P", "P", "P", "U", "R", "M", "M", "M", "R", "M", "M", "R", "R" ]
1308
SPTL/BIALL academic law library survey 2000/2001
The paper outlines the activities and funding of academic law libraries in the UK and Ireland in the academic year 2000/2001. The figures have been taken from the results of a postal questionnaire undertaken by information services staff at Cardiff University on behalf of BIALL
[ "SPTL/BIALL", "academic law libraries", "survey", "funding", "UK", "Ireland", "postal questionnaire", "information services", "Cardiff University" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1425
Kontiki. Shortcuts for content's trip to the edge
When electronic files get zapped from one location to another, you probably aren't thinking about the physical distance they must travel, or how that distance might affect the time it takes to get there. But if you work for CDN company Kontiki, this is just about all you think about. Championing a P2P-like "bandwidth harvesting" technology, Kontiki has figured out how not only to quickly distribute content to the "edge" but also to utilize a combination of centralized servers and a network of enduser machines to collect, or "harvest," underutilized bandwidth and make redundant file requests more efficient
[ "Kontiki", "electronic files", "centralized servers", "enduser machines", "underutilized bandwidth", "redundant file requests", "P2P-like bandwidth harvesting technology" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
1460
Detection of flaws in composites from scattered elastic-wave field using an improved mu GA and a local optimizer
An effective technique for flaw detection of composites is proposed. In this technique, the detection problem is formulated as an optimization problem minimizing the difference between the measured and calculated surface displacement response derived from scattered elastic-wave fields. A combined optimization technique using an improved mu GA and a local optimizer is developed to solve the optimization problem so as to obtain the flaw parameters defining flaw configurations. Guidelines for implementing the detection technique, including formulation of the objective function of the optimization problem using different error norms, improvement of mu GA convergence performance, switching from mu GA to local optimizer in the optimization process, and suppression of the effect of noise on detection results, are addressed in detail. Numerical examples are presented to demonstrate the effectiveness and efficiency of the proposed detection technique
[ "composites", "scattered elastic-wave field", "improved mu GA", "local optimizer", "flaw detection", "optimization problem", "surface displacement response", "flaw configurations", "objective function", "error norms", "convergence", "noise effect suppression" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
876
Perspectives on academic vs. industry environments for women in computer science
The authors were tenure track faculty members at the Colorado School of Mines and later moved into senior positions at software companies. Both are part of two-career couples as well, and both have two children. In this article, they discuss their impressions and share anecdotes regarding the differing experiences of women and families in these two environments
[ "industry environment", "women", "computer science", "faculty members", "software companies", "children", "academic environments", "career", "gender gap" ]
[ "P", "P", "P", "P", "P", "P", "R", "U", "U" ]
833
Packet promises past & present [IP switching]
With the death of the competitive carrier market and the significant slashing of RBOC capex budgets, softswitch vendors have been forced to retrench. Now instead of focusing primarily on limited Internet off-load applications, packet-based softswitches are set to gel around real user needs for services such as voice over IP and IP Centrex
[ "softswitch vendors", "voice over IP", "IP Centrex" ]
[ "P", "P", "P" ]
1250
The impact and implementation of XML on business-to-business commerce
This paper discusses the impact analysis of the Extensible Markup Language (XML). Each business partner within a supply chain will be allowed to generate its own data exchange format by adopting an XML meta-data management system on the local side. Following a brief introduction to the information technology for Business to Customer (B2C) and Business to Business (B2B) Electronic Commerce (EC), the impact of XML on the business world of tomorrow is discussed. A real case study of impact analysis on an information exchange platform, Microsoft's BizTalk platform, which is actually an XML schema builder, together with the implementation of an XML commerce application, provides an interesting insight for users' future implementations
[ "XML", "Extensible Markup Language", "Business to Customer", "Business to Business", "electronic commerce", "BizTalk", "XML schema builder", "Electronic Data Interchange", "Enterprise Resources Planning" ]
[ "P", "P", "P", "P", "P", "P", "P", "M", "U" ]
1215
A knowledge-based approach for business process reengineering, SHAMASH
We present an overview of SHAMASH, a process modelling tool for business process reengineering. The main features that differentiate it from most current related tools are its ability to define and use organisation standards, functional structure, and develop automatic model simulation and optimisation. SHAMASH is a knowledge-based system, and we include a discussion on how knowledge acquisition takes place. Furthermore, we introduce a high level description of the architecture, the conceptual model, and other important modules of the system
[ "knowledge-based approach", "business process reengineering", "SHAMASH", "process modelling tool", "organisation standards", "functional structure", "automatic model simulation", "optimisation", "knowledge-based system", "knowledge acquisition", "conceptual model" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
640
Scribe: a large-scale and decentralized application-level multicast infrastructure
This paper presents Scribe, a scalable application-level multicast infrastructure. Scribe supports large numbers of groups, with a potentially large number of members per group. Scribe is built on top of Pastry, a generic peer-to-peer object location and routing substrate overlayed on the Internet, and leverages Pastry's reliability, self-organization, and locality properties. Pastry is used to create and manage groups and to build efficient multicast trees for the dissemination of messages to each group. Scribe provides best-effort reliability guarantees, and we outline how an application can extend Scribe to provide stronger reliability. Simulation results, based on a realistic network topology model, show that Scribe scales across a wide range of groups and group sizes. Also, it balances the load on the nodes while achieving acceptable delay and link stress when compared with Internet protocol multicast
[ "Scribe", "decentralized application-level multicast infrastructure", "scalable application-level multicast infrastructure", "Pastry", "generic peer-to-peer object location", "Internet", "self-organization", "locality properties", "best-effort reliability guarantees", "simulation results", "network topology model", "group size", "delay", "link stress", "Internet protocol multicast", "generic routing substrate", "discrete event simulator", "network nodes" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "R" ]
605
HEW selects network management software
For more than 100 years, Hamburgische Electricitats-Werke AG (HEW) has provided a reliable electricity service to the city of Hamburg, Germany. Today, the company supplies electricity to some 1.7 million inhabitants via 285000 connections. During 1999, the year the energy market was started in Germany, HEW needed to operate and maintain a safe and reliable network cheaply. The development and implementation of a distribution management system (DMS) is key to the success of HEW. HEW's strategy was to obtain efficient new software for network management that also offered a good platform for future applications. Following a pilot and prequalification phase, HEW invited several companies to process the requirements catalog and to submit a detailed tender. The network information management system, Xpower, developed by Tekla Oyj, successfully passed HEW's test program and satisfied all the performance and system capacity requirements. The system met all HEW's conditions by presenting the reality of a network with the attributes of the operating resources. Xpower platform provides the ability to integrate future applications
[ "network management software", "Hamburgische Electricitats-Werke", "Hamburg", "Germany", "distribution management system", "Xpower", "Tekla Oyj" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
128
A new result on the global convergence of Hopfield neural networks
In this work, we discuss Hopfield neural networks, investigating their global stability. Some sufficient conditions for a class of Hopfield neural networks to be globally stable and globally exponentially stable are given
[ "Hopfield neural networks", "global stability", "sufficient conditions", "globally exponentially stable networks" ]
[ "P", "P", "P", "R" ]
1151
A method for geometrical verification of dynamic intensity modulated radiotherapy using a scanning electronic portal imaging device
In order to guarantee the safe delivery of dynamic intensity modulated radiotherapy (IMRT), verification of the leaf trajectories during the treatment is necessary. Our aim in this study is to develop a method for on-line verification of leaf trajectories using an electronic portal imaging device with scanning read-out, independent of the multileaf collimator. Examples of such scanning imagers are electronic portal imaging devices (EPIDs) based on liquid-filled ionization chambers and those based on amorphous silicon. Portal images were acquired continuously with a liquid-filled ionization chamber EPID during the delivery, together with the signal of treatment progress that is generated by the accelerator. For each portal image, the prescribed leaf and diaphragm positions were computed from the dynamic prescription and the progress information. Motion distortion effects of the leaves are corrected based on the treatment progress that is recorded for each image row. The aperture formed by the prescribed leaves and diaphragms is used as the reference field edge, while the actual field edge is found using a maximum-gradient edge detector. The errors in leaf and diaphragm position are found from the deviations between the reference field edge and the detected field edge. Earlier measurements of the dynamic EPID response show that the accuracy of the detected field edge is better than 1 mm. To ensure that the verification is independent of inaccuracies in the acquired progress signal, the signal was checked with diode measurements beforehand. The method was tested on three different dynamic prescriptions. Using the described method, we correctly reproduced the distorted field edges. Verifying a single portal image took 0.1 s on an 866 MHz personal computer. Two flaws in the control system of our experimental dynamic multileaf collimator were correctly revealed with our method. 
First, the errors in leaf position increase with leaf speed, indicating a delay of approximately 0.8 s in the control system. Second, the accuracy of the leaves and diaphragms depends on the direction of motion. In conclusion, the described verification method is suitable for detailed verification of leaf trajectories during dynamic IMRT
[ "dynamic intensity modulated radiotherapy", "safe delivery", "leaf trajectories", "on-line verification", "scanning read-out", "liquid-filled ionization chambers", "diaphragm positions", "motion distortion effects", "reference field edge", "distorted field edges", "control system", "dynamic multileaf collimator", "leaf positions", "geometrical verification method", "dose distributions", "treatment planning" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "U", "M" ]
1114
A new algebraic modelling approach to distributed problem-solving in MAS
This paper is devoted to a new algebraic modelling approach to distributed problem-solving in multi-agent systems (MAS), which features a unified framework for describing and treating social behaviors, social dynamics and social intelligence. A conceptual architecture of algebraic modelling is presented. The algebraic modelling of typical social behaviors, social situations and social dynamics is discussed in the context of distributed problem-solving in MAS. Comparison and simulation on distributed task allocations and resource assignments in MAS show the advantages of the algebraic approach over conventional methods
[ "algebraic modelling approach", "distributed problem-solving", "multi-agent systems", "unified framework", "social behaviors", "social dynamics", "social intelligence", "distributed task allocations", "resource assignments" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
541
Virtual-reality-based multidimensional therapy for the treatment of body image disturbances in binge eating disorders: a preliminary controlled study
The main goal of this paper is to preliminarily evaluate the efficacy of a virtual-reality (VR)-based multidimensional approach in the treatment of body image attitudes and related constructs. The female binge eating disorder (BED) patients (n=20), involved in a residential weight control treatment including a low-calorie diet (1200 cal/day) and physical training, were randomly assigned either to the multidimensional VR treatment or to psychonutritional groups based on the cognitive-behavior approach. Patients were administered a battery of outcome measures assessing eating disorders symptomatology, attitudes toward food, body dissatisfaction, level of anxiety, motivation for change, level of assertiveness, and general psychiatric symptoms. In the short term, the VR treatment was more effective than the traditional cognitive-behavioral psychonutritional groups in improving the overall psychological state of the patients. In particular, the therapy was more effective in improving body satisfaction, self-efficacy, and motivation for change. No significant differences were found in the reduction of the binge eating behavior. The possibility of inducing a significant change in body image and its associated behaviors using a VR-based short-term therapy can be useful for improving body satisfaction in traditional weight reduction programs. However, given that this research does not include a follow-up study, the obtained results are preliminary only
[ "multidimensional therapy", "body image disturbances", "binge eating disorders", "residential weight control treatment", "psychonutritional groups", "cognitive-behavior approach", "anxiety", "psychiatric symptoms", "virtual reality", "obesity", "patient therapy" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "U", "U", "R" ]
996
Flexible air-jet tooling for vibratory bowl feeder systems
Vibratory bowl feeders (VBFs) are machines that feed various small parts in large volume automatic assembly systems. Their shortcomings, like inflexibility and the propensity to jam, stem from the use of mechanical orienting devices. Air jet based orienting devices can be implemented to overcome these limitations. Applications of passive and active air jet based orienting devices that replace conventional devices for the VBF are discussed. Passive devices, which reject incorrectly oriented parts, are discussed first. Active air jet based orienting devices are then introduced to further improve the flexibility of VBFs. Since active devices reorient parts into a desired orientation, the part motion under their influence is analyzed. A number of tests demonstrate the feasibility and advantages of these new orienting devices
[ "vibratory bowl feeders", "automatic assembly systems", "orienting devices", "active air jet", "passive air jet", "parts feeding" ]
[ "P", "P", "P", "P", "R", "R" ]
87
Positional control of pneumatic manipulators for construction tasks
This paper describes solutions that can be applied to pneumatic manipulator problems in positioning, both for angle trajectories and for long linear trajectories, used in construction tasks. Optimal positioning of a pneumatic manipulator along angle trajectories with minimum control energy consumption is given. The implementation of the control system is presented. Control algorithms for a long linear trajectory manipulator based on two-phase and three-phase motion modes of the end-effector are investigated. Conventional and fuzzy logic controls of a pneumatic manipulator were applied and experimental testing was carried out. The obtained results allow widening the application range of pneumatic manipulators in construction, particularly in gantry type machines
[ "positioning", "positional control", "pneumatic manipulators", "construction tasks", "angle trajectories", "long linear trajectory manipulator", "three-phase motion modes", "fuzzy logic controls", "gantry type machines", "two-phase motion modes" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
1129
Computing stationary Nash equilibria of undiscounted single-controller stochastic games
Given a two-person, nonzero-sum stochastic game where the second player controls the transitions, we formulate a linear complementarity problem LCP(q, M) whose solution gives a Nash equilibrium pair of stationary strategies under the limiting average payoff criterion. The matrix M constructed is of the copositive class so that Lemke's algorithm will process it. We will also do the same for a special class of N-person stochastic games called polymatrix stochastic games
[ "stationary Nash equilibria", "undiscounted single-controller stochastic games", "linear complementarity problem", "stationary strategies", "limiting average payoff criterion", "N-person stochastic games", "polymatrix stochastic games", "two-person nonzero-sum stochastic game", "copositive class matrix", "Lemke algorithm" ]
[ "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
539
Perfusion quantification using Gaussian process deconvolution
The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated as a constraint in the method. The GPD method, which automatically estimates the noise level in each voxel, has the advantage that model parameters are optimized automatically. The GPD is compared to singular value decomposition (SVD) using a common threshold for the singular values, and to SVD using a threshold optimized according to the noise level in each voxel. The comparison is carried out using artificial data as well as data from healthy volunteers. It is shown that GPD is comparable to SVD with a variable optimized threshold when determining the maximum of the IRF, which is directly related to the perfusion. GPD provides a better estimate of the entire IRF. As the signal-to-noise ratio (SNR) increases or the time resolution of the measurements increases, GPD is shown to be superior to SVD. This is also found for large distribution volumes
[ "perfusion quantification", "Gaussian process deconvolution", "dynamic susceptibility contrast MRI", "residual impulse response function", "noise level", "singular value decomposition", "optimized model parameters", "capillary blood flow", "mean transit time", "optimized joint Gaussian distribution", "correlation length", "likelihood function" ]
[ "P", "P", "P", "P", "P", "P", "R", "U", "M", "M", "U", "M" ]
680
Information needs of the working journalists in Orissa: a study
Provides an insight into the various information needs of working journalists in Orissa. Analyses data received from 226 working journalists representing 40 newspaper organisations. Also depicts the specialisation of the working journalists, their frequency of information requirement, preferred modes of dissemination, information sources explored, modes of service opted for, and their unmet information needs. The study asserts that subjects primarily concerned with the professional work and image of the working journalists are rated as the most significant
[ "information needs", "working journalists", "newspaper organisations", "information requirement", "information sources", "professional work", "data analysis", "information dissemination" ]
[ "P", "P", "P", "P", "P", "P", "M", "R" ]
1290
Making the MIS integration process work
Focused, cross-functional teams that implement flexible and scalable information systems (IS) can deliver a smooth, lean manufacturing process. When integrating new technology into an existing facility, one should always consider three things: the hard infrastructure, the soft infrastructure, and information flow. Hard infrastructure includes client and server hardware and network infrastructure. Soft infrastructure includes operating systems, existing or legacy software, needed code customizations, and the human resources to run/support the system. Information flow includes how data in the new system interacts with legacy systems and what legacy data the new system will require, as well as who will want to receive/access the information that is held by the system
[ "scalable information systems", "lean manufacturing process", "information flow", "network infrastructure", "legacy software", "human resources", "management information systems", "client server hardware" ]
[ "P", "P", "P", "P", "P", "P", "M", "R" ]
1228
Outsourced backup saves time
To increase the efficiency of its data backup and to free staff to concentrate on core business, The Gadget Shop is relying on a secure, automated system hosted by a third party
[ "outsourced", "data backup", "The Gadget Shop", "e-business" ]
[ "P", "P", "P", "U" ]
638
Scalable secure group communication over IP multicast
We introduce and analyze a scalable rekeying scheme for implementing secure group communication over Internet protocol multicast. We show that our scheme incurs constant processing, message, and storage overhead for a rekey operation when a single member joins or leaves the group, and logarithmic overhead for bulk simultaneous changes to the group membership. These bounds hold even when group dynamics are not known a priori. Our rekeying algorithm requires a particular clustering of the members of the secure multicast group. We describe a protocol to achieve such clustering and show that it is feasible to efficiently cluster members over realistic Internet-like topologies. We evaluate the overhead of our rekeying scheme, and of previously published schemes, via simulation over an Internet topology map containing over 280 000 routers. Through analysis and detailed simulations, we show that this rekeying scheme performs better than previous schemes for a single change to group membership. Further, for bulk group changes, our algorithm outperforms all previously known schemes by several orders of magnitude in terms of actual bandwidth usage, processing costs, and storage requirements
[ "scalable secure group communication", "IP multicast", "Internet protocol multicast", "storage overhead", "overhead", "logarithmic overhead", "simulation", "group membership", "group dynamics", "rekeying algorithm", "secure multicast group", "Internet-like topologies", "Internet topology map", "bandwidth usage", "processing costs", "storage requirements", "cryptography", "access control server", "authentication", "network routers" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U", "U", "U", "M" ]
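The constant-per-member and logarithmic rekey bounds quoted in this abstract are commonly explained with a logical key hierarchy, in which a single membership change refreshes one key per tree level on the leaf-to-root path. A minimal sketch of that counting argument (the binary-tree layout and function name are illustrative assumptions, not the paper's clustering protocol):

```python
def rekey_cost_single_change(n_members: int, degree: int = 2) -> int:
    """Keys refreshed when one member joins or leaves a logical key
    hierarchy: one fresh key per level on the leaf-to-root path."""
    height = 0
    leaves = 1
    while leaves < n_members:  # height of smallest degree-ary tree covering the group
        leaves *= degree
        height += 1
    return height + 1  # path keys, including the group (root) key
```

For a 1024-member group a single change touches only 11 keys, versus re-keying every member individually in a flat scheme.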
913
Control of a coupled map lattice model for vortex shedding in the wake of a cylinder
The flow behind a vibrating flexible cable at low Reynolds numbers can exhibit complex wake structures such as lace-like patterns, vortex dislocations and frequency cells. These structures have been observed in experiments and numerical simulations, and are predicted by a previously developed low-order coupled map lattice (CML). The discrete (in time and space) CML models consist of a series of diffusively coupled circle map oscillators along the cable span. Motivated by a desire to modify the complex wake patterns behind flexible vibrating cables, we have studied the addition of control terms into the highly efficient CML models and explored the resulting dynamics. Proportional, adaptive proportional and discontinuous non-linear (DNL) control methods were used to derive the control laws. The first method employed occasional proportional feedback. The adaptive method used spatio-temporal feedback control. The DNL method used a discontinuous feedback linearization procedure, and the controller was designed for the resulting linearized system using eigenvalue assignment. These techniques were applied to a modeled vortex dislocation structure in the wake of a vibrating cable in uniform freestream flow. Parallel shedding patterns were achieved for a range of forcing frequency-forcing amplitude combinations studied to validate the control theory. The adaptive proportional and DNL methods were found to be more effective than the proportional control method due to the incorporation of a spatially varying feedback gain across the cylinder span. The DNL method was found to be the most efficient controller of the low-order CML model. The required control level across the cable span was correlated to the 1/1 lock-on behavior of the temporal circle map
[ "coupled map lattice", "vortex shedding", "wake", "cylinder", "vibrating flexible cable", "low Reynolds numbers", "vortex dislocation", "vortex dislocation", "coupled circle map oscillators", "proportional feedback", "spatio-temporal feedback control", "1/1 lock-on", "temporal circle map", "discontinuous nonlinear control", "vortex dislocations" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "P" ]
581
Successive expansion method of network planning applying symbolic analysis method
The conventional power system network successive expansion planning method is discussed in the context of the new paradigm of a competitive electric power, energy and service market. The paper then presents an application of a conceptually new computer program based on the symbolic analysis of load flows in power system networks. The network parameters and variables are defined as symbols. The symbolic analyzer, which models the power system DC load flows analytically, enables sensitivity analysis of the power system to parameter and variable variations (costs, transfers, injections), a valuable tool for expansion planning analysis. This capability is not available in the conventional approach, which relies on compensation methods, precalculated distribution factors, and so on. This novel application sheds some light on the traditional power system network expansion planning method, as well as on its possible application to network expansion planning in the new environment of a competitive electric power market
[ "symbolic analysis", "power system network successive expansion planning", "computer program", "load flows", "symbolic analyzer", "power system DC load flows", "sensitivity analysis", "compensation methods", "precalculated distribution factors", "power system network expansion planning method", "competitive electric power market", "competitive electric energy market", "competitive electric service market" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
956
Do you see what I see? [visual technology in law firms]
Think of how well-done computer presentations can aid in the learning experience. They are, however, less common in client meetings, settlement conferences and the courtroom. And you have to wonder why, when the same benefits of attention focus and visual learning apply in those legal communication settings. The software and hardware components are easy to use, and they're increasingly affordable to boot. The next time you need to convey a point to an audience (be it one person or many), think of how you might benefit from the visual impact available through presentation software like PowerPoint. Anyone will understand you more easily when assisted by visual input, and it may make all the difference in reaching visual-focused learners
[ "visual technology", "law firms", "computer presentations", "PowerPoint" ]
[ "P", "P", "P", "P" ]
115
Non-optimal universal quantum deleting machine
We verify the non-existence of some standard universal quantum deleting machine. Then a non-optimal universal quantum deleting machine is constructed and we emphasize the difficulty for improving its fidelity. In a way, our results complement the universal quantum cloning machine established by Buzek and Hillery (1996), and manifest some of their distinctions
[ "fidelity", "universal quantum cloning machine", "nonoptimal universal quantum deleting machine", "NUQDM" ]
[ "P", "P", "M", "U" ]
1191
On the monotonicity conservation in numerical solutions of the heat equation
In practice, it is important to choose numerical methods that mirror the characteristic properties of the described process beyond stability and convergence. The qualitative property investigated in this paper is the conservation of the spatial monotonicity of the initial heat distribution. We prove some statements about the monotonicity conservation and total monotonicity of one-step vector-iterations. Then, applying these results, we consider the numerical solutions of the one-dimensional heat equation. Our main theorem formulates the necessary and sufficient condition of uniform monotonicity conservation. The sharpness of the conditions is demonstrated by numerical examples
[ "monotonicity conservation", "numerical solutions", "heat equation", "characteristic properties", "qualitative property", "one-step vector-iterations", "necessary and sufficient condition" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
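The monotonicity-conservation property studied in this abstract can be illustrated with the simplest explicit scheme for u_t = u_xx: the update is a convex combination of neighbouring values only when the mesh ratio r = dt/dx^2 satisfies r <= 1/2, and spatially monotone profiles then stay monotone. A hedged sketch (the explicit Euler scheme and fixed-end boundary handling are illustrative choices, not the paper's one-step vector-iterations):

```python
def heat_step(u, r):
    """One explicit-Euler step of u_t = u_xx with mesh ratio r = dt/dx^2.
    The two Dirichlet end values are held fixed."""
    v = list(u)
    for i in range(1, len(u) - 1):
        v[i] = u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
    return v

def is_nondecreasing(u):
    """True when the profile is monotone nondecreasing in space."""
    return all(a <= b for a, b in zip(u, u[1:]))
```

Applying one step to the monotone profile [0, 0, 1, 1] keeps it monotone for r = 0.4 but produces an oscillation for r = 0.6, illustrating the sharpness the abstract mentions.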
725
Banks pin their back-office hopes on successors to screen scrapers
The big name in account aggregation has been Yodlee, based in Redwood Shores, CA. It pioneered the art of screen scraping, or pulling data off Web sites and aggregating it into a single statement. That data, however, is a snapshot and does not include a customer's investment history. Also, because Web sites update data at different times, scraping them can provide an inaccurate picture of a customer's financial situation, making it difficult for reps seeking to provide timely and accurate advice. The objective is to access both fresh and historical data across a client's financial spectrum, from investments to checking accounts and loans to insurance policies, a Complete Customer balance sheet. At least two technology vendors are progressing in that direction, each coming from different directions. One is Advent, based in San Francisco, another is Fincentric, out of Vancouver
[ "screen scraping", "account aggregation", "Yodlee", "Web sites", "investment", "checking", "loans", "insurance", "Advent", "Fincentric", "bankers" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U" ]
760
An improved fuzzy MCDM model based on ideal and anti-ideal concepts
Liang (1999) presented a fuzzy multiple criteria decision making (MCDM) method based on the concepts of ideal and anti-ideal points. Despite its merits, the Liang method has the following limitations: (i) the objective criteria are converted into dimensionless indices while the subjective criteria are not, which may undermine compatibility between these criteria; (ii) the formulas for converting objective criteria are not reliable; and (iii) an unreliable ranking method, i.e. maximizing set and minimizing set, is applied to rank the fuzzy numbers. This paper applies the Hsu and Chen method and suggests a fuzzy number ranking method to propose an improved fuzzy MCDM model based on ideal and anti-ideal concepts that overcomes the shortcomings of the Liang method. Numerical examples demonstrate the effectiveness and feasibility of the proposed ranking method and the improved model, respectively
[ "fuzzy MCDM model", "anti-ideal concepts", "dimensionless indices", "fuzzy number ranking", "ideal concepts", "multicriterion decision-making" ]
[ "P", "P", "P", "P", "R", "U" ]
1335
Arranging solid balls to represent a graph
By solid balls, we mean a set of balls in R^3 no two of which can penetrate each other. Every finite graph G can be represented by arranging solid balls in the following way: Put red balls in R^3, one for each vertex of G, and connect two red balls by a chain when they correspond to a pair of adjacent vertices of G, where a chain means a finite sequence of blue solid balls in which each pair of consecutive balls is tangent. (We may omit the chain if the two red balls are already tangent.) The ball number b(G) of G is the minimum number of balls (red and blue) necessary to represent G. If we put the balls and chains on a table so that all balls sit on the table, then the minimum number of balls for G is denoted by b_T(G). Among other things, we prove that b(K_6) = 8, b(K_7) = 13 and b_T(K_5) = 8, b_T(K_6) = 14. We also prove that c_1 n^3 < b(K_n) < c_2 n^3 log n and c_3 n^4/log n < b_T(K_n) < c_4 n^4
[ "solid balls", "finite graph", "adjacent vertices", "finite sequence", "graph representation" ]
[ "P", "P", "P", "P", "M" ]
1370
Integrated support based on task models for the design, evaluation, and documentation of interactive safety-critical systems: a case study in the air-traffic control domain
This paper presents an approach to using task models in both the design and the evaluation phases of interactive safety-critical applications. We explain how it is possible to use information contained in task models to support the design and development of effective user interfaces. Moreover, we show how task models can also support a systematic inspection-based usability assessment by examining possible deviations that can occur while users interact with the system, an important issue especially when coping with the peculiar requirements of safety-critical applications. Such evaluation provides useful technical documentation to help users achieve an in-depth understanding of the system and its design rationale. Lastly, a description of the application of our approach to a real case study in the air-traffic control domain will illustrate the main features of the proposed method. In particular, we discuss examples taken from an application for air-traffic controllers in an aerodrome supported by graphical user interfaces for data-link communications with pilots
[ "integrated support", "task models", "interactive safety-critical systems", "air-traffic control domain", "user interfaces", "inspection-based usability assessment", "technical documentation", "graphical user interfaces", "data-link communications" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1418
Documentum completes CM Trifecta
Daily, people participating in clinical trials for drug companies fill out forms describing how they feel physically and emotionally. For some trials, there are hundreds, possibly thousands, of participants. The drug companies must compile all the forms and submit them electronically to the FDA. That's where Documentum comes in. "We've streamlined the whole process of managing clinical trial content for companies, such as Johnson & Johnson, Bristol Myers Squibb, and Pfizer," notes Documentum's president and CEO Dave De Walt. "And by the way, the FDA also is one of our customers, as well as the EPA and the FAA." And there are about 1,300 other organizations in various industries worldwide that rely on Documentum's technologies, consulting, and training services. The company's products are designed to manage digital content and facilitate online transactions, partner and supplier relationships, and e-business interactions
[ "Documentum", "clinical trials", "drug companies", "FDA", "clinical trial content", "training services", "consulting services" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
1034
Vibration control of the rotating flexible-shaft/multi-flexible-disk system with the eddy-current damper
In this paper, the rotating flexible-Timoshenko-shaft/flexible-disk coupling system is formulated by applying the assumed-mode method into the kinetic and strain energies, and the virtual work done by the eddy-current damper. From Lagrange's equations, the resulting discretized equations of motion can be simplified as a bilinear system (BLS). Introducing the control laws, including the quadratic, nonlinear and optimal feedback control laws, into the BLS, it is found that the eddy-current damper can be used to suppress flexible and shear vibrations simultaneously, and the system is globally asymptotically stable. Numerical results are provided to validate the theoretical analysis
[ "rotating flexible-shaft/multi-flexible-disk system", "eddy-current damper", "rotating flexible-Timoshenko-shaft/flexible-disk coupling system", "assumed-mode method", "virtual work", "Lagrange's equations", "discretized equations of motion", "bilinear system", "optimal feedback control laws", "shear vibrations", "quadratic feedback control laws", "nonlinear feedback control laws", "flexible vibrations" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
1071
Dense coding in entangled states
We consider the dense coding of entangled qubits shared between two parties, Alice and Bob. The efficiency of classical information gain through quantum entangled qubits is also considered for the cases of pairwise entangled qubits and maximally entangled qubits. We conclude that using the pairwise entangled qubits can be more efficient when two parties communicate, whereas using the maximally entangled qubits can be more efficient when N parties communicate
[ "dense coding", "entangled states", "Alice", "Bob", "pairwise entangled qubits", "maximally entangled qubits", "classical information gain efficiency", "quantum information processing", "quantum communication" ]
[ "P", "P", "P", "P", "P", "P", "R", "M", "R" ]
937
Use of neural networks in the analysis of particle size distribution by laser diffraction: tests with different particle systems
The application of forward light scattering methods for estimating the particle size distribution (PSD) is usually limited by the occurrence of multiple scattering, which affects the angular distribution of light in highly concentrated suspensions, thus resulting in false calculations by the conventionally adopted algorithms. In this paper, a previously proposed neural network-based method is tested with different particle systems, in order to evaluate its applicability. In the first step of the study, experiments were carried out with solid-liquid suspensions having different characteristics of particle shape and size distribution, under varying solid concentrations. The experimental results, consisting of the angular distribution of light intensity, particle shape and suspension concentration, were used as input data in the fitting of neural network models (NN) that replaced the optical model to provide the PSD. The reference values of particle shape and PSD for the NN fitting were based on image analysis. Comparisons between the PSD values computed by the NN model and the reference values indicate that the method can be used in monitoring the PSD of particles with different shapes in highly concentrated suspensions, thus extending the range of application of forward laser diffraction to a number of systems with industrial interest
[ "particle size distribution", "laser diffraction", "forward light scattering", "multiple scattering", "angular distribution of light", "solid-liquid suspensions", "neural network modeling", "image analysis", "particle shape distribution", "pattern recognition", "powdered materials", "backpropagation algorithm", "Fraunhofer optical model", "fluidized catalytic cracking" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R", "U", "U", "M", "M", "U" ]
972
VoIP: leveraging existing cable architecture
As operators prepare to enter the voice-over-IP fray, they are searching for ways to leverage their existing two-way, interactive infrastructure. There are several approaches for supporting VoIP on top of the core IP transport network. The one garnering the most interest, especially in the United States, is based on the PacketCable 1.x architecture. This article discusses the PacketCable-based approach
[ "VoIP", "cable architecture", "voice-over-IP", "core IP transport network", "United States", "PacketCable 1.x architecture", "PacketCable-based approach", "two-way interactive infrastructure" ]
[ "P", "P", "P", "P", "P", "P", "P", "R" ]
131
On biorthogonal nonuniform filter banks and tree structures
This paper concerns biorthogonal nonuniform filter banks. It is shown that a tree structured filter bank is biorthogonal if it is equivalent to a tree structured filter bank whose matching constituent levels on the analysis and synthesis sides are themselves biorthogonal pairs. We then show that a stronger statement can be made about dyadic filter banks in general: That a dyadic filter bank is biorthogonal if both the analysis and synthesis banks can be decomposed into dyadic trees. We further show that these decompositions are stability and FIR preserving. These results, derived for filter banks having filters with rational transfer functions, thus extend some of the earlier comparable results for orthonormal filter banks
[ "biorthogonal nonuniform filter banks", "tree structured filter bank", "biorthogonal pairs", "dyadic filter banks", "dyadic trees", "FIR preserving", "rational transfer functions", "stability preserving" ]
[ "P", "P", "P", "P", "P", "P", "P", "R" ]
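The tree-structured banks discussed in the abstract above are built by cascading two-channel biorthogonal analysis/synthesis pairs. A minimal sketch of one such level (a Haar-type averaging/differencing pair; the 1/2 scaling is one convenient biorthogonal, non-orthonormal choice and is not taken from the paper):

```python
def haar_analysis(x):
    """One level of a two-channel (dyadic) analysis bank:
    lowpass = pairwise averages, highpass = pairwise half-differences."""
    lo = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    hi = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return lo, hi

def haar_synthesis(lo, hi):
    """Matching synthesis bank: perfect reconstruction of the input."""
    x = []
    for a, d in zip(lo, hi):
        x += [a + d, a - d]
    return x
```

Re-applying `haar_analysis` to the lowpass output yields the dyadic tree decomposition the abstract refers to; each level remains a biorthogonal pair, so the whole tree stays biorthogonal.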
1249
Aggregators versus disintermediators: battling it out in the information superhighstreet
Perhaps the future of large-scale content aggregators is now no longer in doubt but this was not the case 10 years ago, when many leading industry experts were much more pessimistic in their predictions. In the year that Dialog celebrates its thirtieth anniversary as the world's oldest and largest professional online information service, it is appropriate to look back at these changing perceptions, the reasons for these changes, and why the experts got it wrong. We also look at the present day; the value that large-scale content aggregators bring to the information supply chain; and we discuss why users would choose to use aggregators as opposed to going directly to the publishers
[ "disintermediators", "large-scale content aggregators", "online information service", "information supply chain" ]
[ "P", "P", "P", "P" ]
63
Geometric source separation: merging convolutive source separation with geometric beamforming
Convolutive blind source separation and adaptive beamforming share a similar goal: extracting a source of interest (or multiple sources) while reducing undesired interferences. A benefit of source separation is that it overcomes the conventional cross-talk or leakage problem of adaptive beamforming. Beamforming, on the other hand, exploits geometric information which is often readily available but not utilized in blind algorithms. We propose to join these benefits by combining cross-power minimization of second-order source separation with geometric linear constraints used in adaptive beamforming. We find that the geometric constraints resolve some of the ambiguities inherent in the independence criterion such as frequency permutations and degrees of freedom provided by additional sensors. We demonstrate the new method in performance comparisons for actual room recordings of two and three simultaneous acoustic sources
[ "geometric source separation", "geometric beamforming", "convolutive blind source separation", "adaptive beamforming", "cross-talk", "leakage problem", "blind algorithms", "cross-power minimization", "second-order source separation", "geometric linear constraints", "frequency permutations", "degrees of freedom", "sensors", "room recordings", "acoustic sources" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
659
Integration - no longer a barrier? [agile business]
Web services will be a critical technology for enabling the 'agile business'
[ "agile business", "Web services", "integration middleware", "Iona", "AMR Research" ]
[ "P", "P", "M", "U", "U" ]
1148
Benchmarking of the Dose Planning Method (DPM) Monte Carlo code using electron beams from a racetrack microtron
A comprehensive set of measurements and calculations has been conducted to investigate the accuracy of the Dose Planning Method (DPM) Monte Carlo code for dose calculations from 10 and 50 MeV scanned electron beams produced from a racetrack microtron. Central axis depth dose measurements and a series of profile scans at various depths were acquired in a water phantom using a Scanditronix type RK ion chamber. Source spatial distributions for the Monte Carlo calculations were reconstructed from in-air ion chamber measurements carried out across the two-dimensional beam profile at 100 cm downstream from the source. The in-air spatial distributions were found to have full width at half maximum of 4.7 and 1.3 cm, at 100 cm from the source, for the 10 and 50 MeV beams, respectively. Energy spectra for the 10 and 50 MeV beams were determined by simulating the components of the microtron treatment head using the code MCNP4B. DPM calculations are on average within ±2% agreement with measurement for all depth dose and profile comparisons conducted in this study. The accuracy of the DPM code illustrated in this work suggests that DPM may be used as a valuable tool for electron beam dose calculations
[ "benchmarking", "racetrack microtron", "50 MeV", "scanned electron beams", "central axis depth dose measurements", "profile scans", "water phantom", "ion chamber", "source spatial distributions", "two-dimensional beam profile", "in-air spatial distributions", "electron beam dose calculations", "dose planning method Monte Carlo code", "MCNP4B", "radiotherapy treatment planning", "electron transport", "scoring parameters", "10 MeV" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "U", "M", "M", "U", "R" ]
558
OS porting and application development for SoC
To deliver improved usability in high-end portable consumer products, the use of an appropriate consumer operating system (OS) is becoming far more widespread. Using a commercially supported OS also vastly increases the availability of supported applications. For the device developer, this trend adds major complexity to the problem of system implementation. Porting a complete operating system to a new hardware design adds significantly to the development burden, increasing both time-to-market and expense. Even for those familiar with the integration of a real-time OS, the porting, validation and support of a complex platform OS is a formidable task
[ "OS porting", "application development", "consumer operating system", "hardware design" ]
[ "P", "P", "P", "P" ]
892
Dementing disorders: volumetric measurement of cerebrospinal fluid to distinguish normal from pathologic finding - feasibility study
We have demonstrated that automated methods to describe the severity and distribution of cerebral atrophy are capable of providing diagnostic information in the classification of neurodegenerative diseases
[ "dementing disorders", "automated methods", "diagnostic information", "cerebrospinal fluid volumetric measurement", "magnetic resonance imaging technique", "medical diagnostic imaging", "healthy subjects", "normal-pathologic findings distinguishing", "neurodegenerative diseases classification", "cerebral atrophy distribution", "cerebral atrophy severity" ]
[ "P", "P", "P", "R", "U", "M", "U", "M", "R", "R", "R" ]
1010
Robust self-tuning PID controller for nonlinear systems
In this paper, we propose a robust self-tuning PID controller suitable for nonlinear systems. The control system employs a preload relay (P_Relay) in series with a PID controller. The P_Relay ensures a high gain to yield a robust performance. However, it also incurs a chattering phenomenon. In this paper, instead of viewing the chattering as an undesirable yet inevitable feature, we use it as a naturally occurring signal for tuning and re-tuning the PID controller as the operating regime digresses. No other explicit input signal is required. Once the PID controller is tuned for a particular operating point, the relay may be disabled and chattering ceases correspondingly. However, it is invoked when there is a change in setpoint to another operating regime. In this way, the approach is also applicable to time-varying systems as the PID tuning can be continuous, based on the latest set of chattering characteristics. Analysis is provided on the stability properties of the control scheme. Simulation results for the level control of fluid in a spherical tank using the scheme are also presented
[ "robust self-tuning PID controller", "nonlinear systems", "preload relay", "robust performance", "chattering phenomenon", "naturally occurring signal", "operating regime", "time-varying systems", "stability properties", "simulation results", "spherical tank", "controller tuning", "controller re-tuning", "relay disabling", "continuous tuning", "fluid level control" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "R", "R" ]
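The idea of using relay-induced chattering as a tuning signal echoes classic relay auto-tuning, where the limit cycle's amplitude and period estimate the ultimate gain Ku = 4d / (pi * a) and feed standard Ziegler-Nichols rules. A sketch of that textbook calculation (this is not the paper's P_Relay scheme itself; the constants are the usual ZN values):

```python
import math

def zn_pid_from_relay(d, a, Pu):
    """Ziegler-Nichols PID settings from a relay-induced limit cycle:
    d  = relay output amplitude,
    a  = measured oscillation amplitude,
    Pu = measured oscillation period (the ultimate period)."""
    Ku = 4.0 * d / (math.pi * a)  # describing-function estimate of ultimate gain
    Kp = 0.6 * Ku                 # classic ZN PID rule
    Ti = 0.5 * Pu
    Td = 0.125 * Pu
    return Kp, Ti, Td
```

Because the relay can be re-enabled whenever the setpoint moves to a new operating regime, the same calculation supports the continuous re-tuning the abstract describes.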
1055
A re-examination of probability matching and rational choice
In a typical probability learning task participants are presented with a repeated choice between two response alternatives, one of which has a higher payoff probability than the other. Rational choice theory requires that participants should eventually allocate all their responses to the high-payoff alternative, but previous research has found that people fail to maximize their payoffs. Instead, it is commonly observed that people match their response probabilities to the payoff probabilities. We report three experiments on this choice anomaly using a simple probability learning task in which participants were provided with (i) large financial incentives, (ii) meaningful and regular feedback, and (iii) extensive training. In each experiment large proportions of participants adopted the optimal response strategy and all three of the factors mentioned above contributed to this. The results are supportive of rational choice theory
[ "probability matching", "rationality", "probability learning task", "payoff probability", "rational choice theory", "response probabilities", "choice anomaly", "large financial incentives", "feedback", "extensive training", "optimal response strategy", "meaningful regular feedback" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
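The payoff gap between matching and maximizing in a two-alternative task is easy to quantify: a maximizer is correct with probability p, while a matcher (who chooses each alternative with its payoff probability) is correct with probability p^2 + (1-p)^2. A small sketch of that comparison (function names are illustrative):

```python
def expected_hit_rate_maximizer(p):
    """Always choose the high-payoff alternative."""
    return p

def expected_hit_rate_matcher(p):
    """Choose each alternative with its own payoff probability."""
    return p * p + (1 - p) * (1 - p)
```

At p = 0.7 the matcher's expected hit rate is 0.58 versus 0.70 for the maximizer; this shortfall is exactly what the incentives, feedback, and training in the experiments helped participants eliminate.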
701
High dynamic control of a three-level voltage-source-converter drive for a main strip mill
A high dynamic control system for the Alspa VDM 7000 medium-voltage drive was implemented, which provides fast torque response times of a few milliseconds despite the typically low switching frequency of gate-turn-off thyristors which is necessary to achieve high efficiency. The drive system consists of a three-level voltage-source converter with active front end and a synchronous motor. The drive has most recently been applied to a main strip mill. It provides a maximum of 8.3-MW mechanical power with a rated motor voltage of 3 kV. Besides motor torque as the main control objective, the control system has to comply with a number of additional objectives and constraints like DC-link voltage regulation and balancing, current and torque harmonics, motor flux, and excitation
[ "strip mill", "high dynamic control system", "medium-voltage drive", "switching frequency", "gate-turn-off thyristors", "efficiency", "three-level voltage-source converter", "synchronous motor", "mechanical power", "motor voltage", "control objective", "DC-link voltage regulation", "torque harmonics", "motor flux", "excitation", "DC-link voltage balancing", "current harmonics", "8.3 MW", "3 kV" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "U", "M" ]
744
A virtual victory [virtual networks]
Newly fashionable virtual network operators look all set to clean up in the corporate sector
[ "virtual network operators", "corporate sector" ]
[ "P", "P" ]
1311
Blended implementation of block implicit methods for ODEs
In this paper we further develop a new approach for naturally defining the nonlinear splittings needed for the implementation of block implicit methods for ODEs, which has been considered by Brugnano [J. Comput. Appl. Math. 116 (2000) 41] and by Brugnano and Trigiante [in: Recent Trends in Numerical Analysis, Nova Science, New York, 2000, pp. 81-105]. The basic idea is that of defining the numerical method as the combination (blending) of two suitable component methods. By carefully choosing such methods, it is shown that very efficient implementations can be obtained. Moreover, some of them, characterized by a diagonal splitting, are well suited for parallel computers. Numerical tests comparing the performances of the proposed implementation with existing ones are also presented, in order to make evident the potential of the approach
[ "blended implementation", "block implicit methods", "ODEs", "nonlinear splittings", "numerical method", "diagonal splitting", "parallel computers", "numerical tests" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
1354
Design and analysis of optimal material distribution policies in flexible manufacturing systems using a single AGV
Modern automated manufacturing processes employ automated guided vehicles (AGVs) for material handling, which serve several machine centres (MC) in a factory. Optimal scheduling of AGVs can significantly help to increase the efficiency of the manufacturing process by minimizing the idle time of MCs waiting for the raw materials. We analyse the requirements for an optimal schedule and then provide a mathematical framework for an efficient schedule of material delivery by an AGV. A mathematical model is developed and then a strategy for optimal material distribution of the available raw material to the MCs is derived. With this model, the optimal number of MCs to be utilized is also determined. Finally, the material delivery schedule employing multiple journeys to the MCs by the AGV is carried out. Through rigorous analysis and simulation experiments, we show that such a delivery strategy will optimize the overall performance
[ "optimal material distribution policies", "flexible manufacturing systems", "AGV", "automated guided vehicle", "material handling", "machine centres", "optimal scheduling", "material delivery", "waiting time", "manufacturing lead time", "idle time minimization" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "R" ]
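A back-of-envelope version of the capacity question in this abstract: a machine centre with a full buffer runs for buffer/consumption time units, and a round-robin AGV spends roughly a fixed delivery time per centre, which bounds how many centres one vehicle can keep from idling. A sketch under those simplifying assumptions (not the paper's mathematical model; all parameters are illustrative):

```python
def max_machines_served(buffer_units, consumption_rate, delivery_time):
    """Upper bound on machine centres a single AGV can keep from idling:
    a full buffer lasts buffer_units / consumption_rate time units, and one
    round-robin cycle costs delivery_time per machine centre visited."""
    endurance = buffer_units / consumption_rate
    return int(endurance // delivery_time)
```

With a 60-unit buffer consumed at 2 units per time unit and a 5-time-unit delivery cycle, one AGV can serve at most 6 centres before the first runs dry, matching the intuition that minimizing MC idle time drives the optimal fleet-free schedule.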
1095
Development of a real-time monitoring system
This paper describes a pattern recognition (PR) technique, which uses learning vector quantization (LVQ). This method is adapted for practical application to solve problems in the area of condition monitoring and fault diagnosis where a number of fault signatures are involved. In these situations, the aim is health monitoring, including identification of deterioration of the healthy condition and identification of causes of the failure in real-time. For this reason a fault database is developed which contains the collected information about various states of operation of the system in the form of pattern vectors. The task of the real-time monitoring system is to correlate patterns of unknown faults with the known fault signatures in the fault database. This will determine the cause of failure and the degree of deterioration of the system under test. The problem of fault diagnosis may involve a large number of patterns and a large sampling time, which affects the learning stage of neural networks. The study here also aims to find a fast learning model of neural networks for instances when a high number of patterns and numerous processing elements are involved, and begins by searching for an appropriate solution. The study is extended to the enforcement learning models and considers LVQ as a network that emerged from the competitive learning model through enforcement training. Finally, tests show an accuracy of 92.3 per cent in the fault diagnostic capability of the technique
[ "real-time monitoring system", "learning vector quantization", "LVQ", "condition monitoring", "fault diagnosis", "fault signatures", "health monitoring", "fault database", "pattern vectors", "large sampling time", "neural networks", "fast learning model", "competitive learning model", "enforcement training", "fault diagnostic capability", "pattern recognition technique", "deterioration identification", "real-time failure cause identification", "pattern correlation", "CNC machine centre", "coolant system" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "R", "U", "M" ]
1444
Adaptable dialog boxes for cross-platform programming
The author presents a framework for building dialog boxes that adapt to the look and feel of their platform. This method also helps with a few related problems: specifying cross-platform resources and handling dialog size changes due to localization. He uses a combination of XML, automatic layout, and run-time dialog creation to give you most of the benefits of platform-specific resources, without the associated pain. Source code with an implementation of the layout engine for Mac OS 9.1 ("Carbon"), Mac OS X, and Microsoft Windows can be downloaded from the CUJ website at <www.cuj.com/code>. You can use this code as is, or as a starting point for your own more complete implementation
[ "adaptable dialog boxes", "dialog boxes", "cross-platform programming", "cross-platform resources", "dialog size changes", "localization", "XML", "automatic layout", "run-time dialog creation", "platform-specific resources", "Mac OS 9.1", "Mac OS X", "Microsoft Windows" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
852
Building an effective computer science student organization: the Carnegie Mellon Women@SCS action plan
This paper aims to provide a practical guide for building a student organization and designing activities and events that can encourage and support a community of women in computer science. This guide is based on our experience in building Women@SCS, a community of women in the School of Computer Science (SCS) at Carnegie Mellon University. Rather than provide an abstract "to-do" or "must-do" list, we present a sampling of concrete activities and events in the hope that these might suggest possibilities for a like-minded student organization. However, since we have found it essential to have a core group of activist students at the helm, we provide a "to-do" list of features that we feel are essential for forming, supporting and sustaining creative and effective student leadership
[ "computer science student organization", "Women@SCS action plan", "women", "Carnegie Mellon University", "student leadership", "gender issues", "computer science education" ]
[ "P", "P", "P", "P", "P", "U", "M" ]
817
Summarization beyond sentence extraction: A probabilistic approach to sentence compression
When humans produce summaries of documents, they do not simply extract sentences and concatenate them. Rather, they create new sentences that are grammatical, that cohere with one another, and that capture the most salient pieces of information in the original document. Given that large collections of text/abstract pairs are available online, it is now possible to envision algorithms that are trained to mimic this process. In this paper, we focus on sentence compression, a simpler version of this larger challenge. We aim to achieve two goals simultaneously: our compressions should be grammatical, and they should retain the most important pieces of information. These two goals can conflict. We devise both a noisy-channel and a decision-tree approach to the problem, and we evaluate results against manual compressions and a simple baseline
[ "sentence compression", "grammatical", "noisy-channel", "decision-tree", "document summarization" ]
[ "P", "P", "P", "P", "R" ]
779
Domesticating computers and the Internet
The people who use computers and the ways they use them have changed substantially over the past 25 years. In the beginning highly educated people, mostly men, in technical professions used computers for work, but over time a much broader range of people are using computers for personal and domestic purposes. This trend is still continuing, and over a shorter time scale has been replicated with the use of the Internet. The paper uses data from four national surveys to document how personal computers and the Internet have become increasingly domesticated since 1995 and to explore the mechanisms for this shift. Now people log on more often from home than from places of employment and do so for pleasure and for personal purposes rather than for their jobs. Analyses comparing veteran Internet users to novices in 1998 and 2000 and analyses comparing the change in use within a single sample between 1995 and 1996 support two complementary explanations for how these technologies have become domesticated. Women, children, and less well-educated individuals are increasingly using computers and the Internet and have a more personal set of motives than well-educated men. In addition, the widespread diffusion of the PC and the Internet and the response of the computing industry to the diversity in consumers has led to a rich set of personal and domestic services
[ "Internet", "highly educated people", "technical professions", "domestic purposes", "national surveys", "personal computers", "veteran Internet users", "novices", "women", "children", "computing industry", "domestic services", "computer domestication", "personal usage", "personal motives", "PC diffusion", "demographics", "online behavior" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "R", "R", "U", "U" ]
1369
Use of Bayesian Belief Networks when combining disparate sources of information in the safety assessment of software-based systems
The paper discusses how disparate sources of information can be combined in the safety assessment of software-based systems. The emphasis is put on an emerging methodology, relevant for intelligent product-support systems, to combine information about disparate evidences systematically based on Bayesian Belief Networks. The objective is to show the link between basic information and the confidence one can have in a system. How one combines the Bayesian Belief Net (BBN) method with a software safety standard (RTCA/DO-178B) for safety assessment of software-based systems is also discussed. Finally, the applicability of the BBN methodology and experiences from cooperative research work together with Kongsberg Defence & Aerospace and Det Norske Veritas, and ongoing research with VTT Automation are presented
[ "Bayesian belief networks", "safety assessment", "software-based systems", "intelligent product-support systems", "software safety standard" ]
[ "P", "P", "P", "P", "P" ]
1394
Subject access to government documents in an era of globalization: intellectual bundling of entities affected by the decisions of supranational organizations
As a result of the growing influence of supranational organizations, there is a need for a new model for subject access to government information in academic libraries. Rulings made by supranational bodies such as the World Trade Organization (WTO) and rulings determined under the auspices of transnational economic agreements such as the North American Free Trade Agreement (NAFTA) often supersede existing law, resulting in obligatory changes to national, provincial, state, and municipal legislation. Just as important is the relationship among private sector companies, third party actors such as nongovernmental organizations (NGOs), and governments. The interaction among the various entities affected by supranational rulings could potentially form the basis of a new model for subject access to government information
[ "government documents", "globalization", "intellectual bundling", "supranational organizations", "academic libraries", "World Trade Organization", "transnational economic agreements", "North American Free Trade Agreement", "municipal legislation", "national legislation", "provincial legislation", "state legislation" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
784
Where tech is cheap [servers]
Talk, consultancy, and support, not tech, are the expensive part of network installations. It's a good job that small-scale servers can either be remotely managed or require little actual management
[ "network installations", "small-scale servers", "management" ]
[ "P", "P", "P" ]
1068
Quantum phase gate for photonic qubits using only beam splitters and postselection
We show that a beam splitter of reflectivity one-third can be used to realize a quantum phase gate operation if only the outputs conserving the number of photons on each side are postselected
[ "quantum phase gate", "photonic qubits", "postselection", "reflectivity", "quantum phase gate operation", "outputs", "multiqubit networks", "postselected quantum gate", "optical quantum gate operations", "photon number conservation", "postselected photon number conserving outputs", "quantum computation", "quantum information processing", "postselected quantum phase gate", "polarization beam splitters" ]
[ "P", "P", "P", "P", "P", "P", "U", "R", "M", "R", "R", "M", "M", "R", "M" ]
699
Novel line conditioner with voltage up/down capability
In this paper, a novel pulsewidth-modulated line conditioner with fast output voltage control is proposed. The line conditioner is made up of an AC chopper with reversible voltage control and a transformer for series voltage compensation. In the AC chopper, a proper switching operation is achieved without the commutation problem. To absorb energy stored in line stray inductance, a regenerative DC snubber can be utilized which has only one capacitor without discharging resistors or complicated regenerative circuit for snubber energy. Therefore, the proposed AC chopper gives high efficiency and reliability. The output voltage of the line conditioner is controlled using a fast sensing technique of the output voltage. It is also shown via some experimental results that the presented line conditioner gives good dynamic and steady-state performance for high quality of the output voltage
[ "pulsewidth-modulated line conditioner", "output voltage control", "AC chopper", "reversible voltage control", "switching operation", "commutation", "line stray inductance", "regenerative DC snubber", "steady-state performance", "series voltage compensation transformer", "dynamic performance" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
1289
Combining PC control and HMI
Integrating PC-based control with human machine interface (HMI) technology can benefit a plant floor system. However, before one decides on PC-based control, there are many things one should consider, especially when using a soft programmable logic controller (PLC) to command the input/output. There are three strategies to integrate a PC-based control system with an HMI: treat the PC running the control application as if it were a PLC; integrate the system using standard PC interfaces; or use application programming interfaces
[ "human machine interface", "programmable logic controller", "PC-based control system", "PC interfaces", "application programming interfaces", "shop floor system" ]
[ "P", "P", "P", "P", "P", "M" ]
1175
Prediction of tool and chip temperature in continuous and interrupted machining
A numerical model based on the finite difference method is presented to predict tool and chip temperature fields in continuous machining and time varying milling processes. Continuous or steady state machining operations like orthogonal cutting are studied by modeling the heat transfer between the tool and chip at the tool-rake face contact zone. The shear energy created in the primary zone, the friction energy produced at the rake face-chip contact zone and the heat balance between the moving chip and stationary tool are considered. The temperature distribution is solved using the finite difference method. Later, the model is extended to milling where the cutting is interrupted and the chip thickness varies with time. The proposed model combines the steady-state temperature prediction in continuous machining with transient temperature evaluation in interrupted cutting operations where the chip and the process change in a discontinuous manner. The mathematical models and simulation results are in satisfactory agreement with experimental temperature measurements reported in the literature
[ "interrupted machining", "numerical model", "finite difference method", "continuous machining", "time varying milling processes", "orthogonal cutting", "heat transfer", "tool-rake face contact zone", "shear energy", "primary zone", "friction energy", "temperature distribution", "tool temperature prediction", "chip temperature prediction", "first-order dynamic system", "thermal properties" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "U", "U" ]
1130
Node-capacitated ring routing
We consider the node-capacitated routing problem in an undirected ring network along with its fractional relaxation, the node-capacitated multicommodity flow problem. For the feasibility problem, Farkas' lemma provides a characterization for general undirected graphs, asserting roughly that there exists such a flow if and only if the so-called distance inequality holds for every choice of distance functions arising from nonnegative node weights. For rings, this (straightforward) result will be improved in two ways. We prove that, independent of the integrality of node capacities, it suffices to require the distance inequality only for distances arising from (0-1-2)-valued node weights, a requirement that will be called the double-cut condition. Moreover, for integer-valued node capacities, the double-cut condition implies the existence of a half-integral multicommodity flow. In this case there is even an integer-valued multicommodity flow that violates each node capacity by at most one. Our approach gives rise to a combinatorial, strongly polynomial algorithm to compute either a violating double-cut or a node-capacitated multicommodity flow. A relation of the problem to its edge-capacitated counterpart will also be explained
[ "node-capacitated ring routing", "node-capacitated routing problem", "undirected ring network", "fractional relaxation", "node-capacitated multicommodity flow problem", "feasibility problem", "undirected graphs", "distance inequality", "distance functions", "nonnegative node weights", "double-cut condition", "integer-valued node capacities", "half-integral multicommodity flow", "integer-valued multicommodity flow", "violating double-cut", "Farkas lemma", "node capacity integrality", "combinatorial strongly polynomial algorithm", "edge-cut criterion" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "U" ]
565
Control of thin film growth in chemical vapor deposition manufacturing systems: a feasibility study
A study is carried out to design and optimize chemical vapor deposition (CVD) systems for material fabrication. Design and optimization of the CVD process is necessary to satisfy strong global demand and ever increasing quality requirements for thin film production. Advantages of computer aided optimization include reduced design turnaround time, flexibility to explore a larger design space and the development and adaptation of automation techniques for design and optimization. A CVD reactor consisting of a vertical impinging jet at atmospheric pressure, for growing titanium nitride films, is studied for thin film deposition. Numerical modeling and simulation are used to determine the rate of deposition and film uniformity over a wide range of design variables and operating conditions. These results are used for system design and optimization. The optimization procedure employs an objective function characterizing film quality, productivity and operational costs based on reactor gas flow rate, susceptor temperature and precursor concentration. Parameter space mappings are used to determine the design space, while a minimization algorithm, such as the steepest descent method, is used to determine optimal operating conditions for the system. The main features of computer aided design and optimization using these techniques are discussed in detail
[ "thin film growth", "chemical vapor deposition", "optimization", "material fabrication", "titanium nitride films", "film quality", "operational costs", "reactor gas flow rate", "susceptor temperature", "precursor concentration", "parameter space mappings", "TiN" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U" ]
598
From FREE to FEE [online advertising market]
As the online advertising market continues to struggle, many online content marketers are wrestling with the issue of how to add at least some level of paid subscription income to their revenue mix in order to reach or improve profitability. Since the business of selling content online is still in its infancy, and many consumers clearly still think of Web content as simply and rightfully free, few roadmaps are available to show the way to effective marketing strategies, but some guiding principles have emerged
[ "online advertising market", "paid subscription income", "selling content online", "marketing strategies" ]
[ "P", "P", "P", "P" ]
1188
It's time to buy
There is an upside to a down economy: over-zealous suppliers are willing to make deals that were unthinkable a few years ago. That's because vendors are experiencing the same money squeeze as manufacturers, which makes the year 2002 the perfect time to invest in new technology. The author states that when negotiating the deal, provisions for unexpected costs, an exit strategy, and even shared risk with the vendor should be on the table
[ "suppliers", "vendor", "money squeeze", "negotiation", "unexpected costs", "exit strategy", "shared risk", "buyers market", "bargaining power" ]
[ "P", "P", "P", "P", "P", "P", "P", "U", "U" ]
1274
Bounded model checking for the universal fragment of CTL
Bounded Model Checking (BMC) has been recently introduced as an efficient verification method for reactive systems. BMC based on SAT methods consists in searching for a counterexample of a particular length and generating a propositional formula that is satisfiable iff such a counterexample exists. This new technique has been introduced by E. Clarke et al. for model checking of linear time temporal logic (LTL). Our paper shows how the concept of bounded model checking can be extended to ACTL (the universal fragment of CTL). The implementation of the algorithm for Elementary Net Systems is described together with the experimental results
[ "bounded model checking", "model checking", "universal fragment", "verification method", "reactive systems", "SAT methods", "propositional formula", "linear time temporal logic", "elementary net systems", "bounded semantics" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "M" ]
1231
Efficient parallel programming on scalable shared memory systems with High Performance Fortran
OpenMP offers a high-level interface for parallel programming on scalable shared memory (SMP) architectures. It provides the user with simple work-sharing directives while it relies on the compiler to generate parallel programs based on thread parallelism. However, the lack of language features for exploiting data locality often results in poor performance since the non-uniform memory access times on scalable SMP machines cannot be neglected. High Performance Fortran (HPF), the de-facto standard for data parallel programming, offers a rich set of data distribution directives in order to exploit data locality, but it has been mainly targeted towards distributed memory machines. In this paper we describe an optimized execution model for HPF programs on SMP machines that avails itself of mechanisms provided by OpenMP for work sharing and thread parallelism, while exploiting data locality based on user-specified distribution directives. Data locality does not only ensure that most memory accesses are close to the executing threads and are therefore faster, but it also minimizes synchronization overheads, especially in the case of unstructured reductions. The proposed shared memory execution model for HPF relies on a small set of language extensions, which resemble the OpenMP work-sharing features. These extensions, together with an optimized shared memory parallelization and execution model, have been implemented in the ADAPTOR HPF compilation system and experimental results verify the efficiency of the chosen approach
[ "parallel programming", "scalable shared memory", "High Performance Fortran", "multiprocessor architectures", "scalable hardware", "shared memory multiprocessor" ]
[ "P", "P", "P", "M", "M", "M" ]
664
The agile revolution [business agility]
There is a new business revolution in the air. The theory is there, the technology is evolving, fast. It is all about agility
[ "business agility", "software design", "software deployment", "organisational structures", "supply chains" ]
[ "P", "U", "U", "U", "U" ]
621
MPEG-4 video object-based rate allocation with variable temporal rates
In object-based coding, bit allocation is performed at the object level and temporal rates of different objects may vary. The proposed algorithm deals with these two issues when coding multiple video objects (MVOs). The proposed algorithm is able to successfully achieve the target bit rate, effectively code arbitrarily shaped MVOs with different temporal rates, and maintain a stable buffer level
[ "object-based rate allocation", "variable temporal rates", "bit allocation", "multiple video objects", "MPEG-4 video coding", "rate-distortion encoding" ]
[ "P", "P", "P", "P", "R", "U" ]
1438
Three-dimensional particle image tracking for dilute particle-liquid flows in a pipe
A three-dimensional (3D) particle image tracking technique was used to study the coarse spherical particle-liquid flows in a pipe. The flow images from both the front view and the normal side view, which was reflected into the front view by a mirror, were recorded with a CCD camera and digitized by a PC with an image grabber card. An image processing program was developed to enhance and segment the flow image, and then to identify the particles. Over 90% of all the particles can be identified and located from the partially overlapped particle images using the circular Hough transform. Then the 3D position of each detected particle was determined by matching its front view image to its side view image. The particle velocity was then obtained by pairing its images in successive video fields. The measurements for the spherical expanded polystyrene particle-oil flows show that the particles, like the spherical bubbles in laminar bubbly flows, tend to conglomerate near the pipe wall and to line up to form the particle clusters. As liquid velocity decreases, the particle clusters disperse and more particles are distributed in the pipe centre region
[ "three-dimensional particle image tracking", "dilute particle-liquid flows", "CCD camera", "Hough transform", "3D position", "spherical bubble", "particle clusters", "two-phase flow", "pipe flow", "stereo-imaging technique", "phase distribution", "spherical expanded polystyrene particle", "Wiener filter", "image segmentation", "region growing technique", "image recognition", "image matching" ]
[ "P", "P", "P", "P", "P", "P", "P", "M", "R", "M", "M", "R", "U", "R", "M", "M", "R" ]
705
Use of extra degrees of freedom in multilevel drives
Multilevel converters with series connection of semiconductors allow power electronics to reach medium voltages (1-10 kV) with relatively standard components. The increase of the number of semiconductors provides extra degrees of freedom, which can be used to improve different characteristics. This paper is focused on variable-speed drives and it is shown that with the proposed multilevel direct torque control strategy (DiCoIF) the tradeoff between the performances of the drive (harmonic distortions, torque dynamics, voltage step gradients, etc.) and the switching frequency of the semiconductors is improved. Then, a slightly modified strategy reducing common-mode voltage and bearing currents is presented
[ "degrees of freedom", "multilevel drives", "series connection", "semiconductors", "power electronics", "medium voltages", "variable-speed drives", "multilevel direct torque control strategy", "harmonic distortions", "torque dynamics", "voltage step gradients", "switching frequency", "bearing currents", "common-mode voltage reduction", "delay estimation", "industrial power systems", "insulated gate bipolar transistors", "state estimation", "fixed-frequency dynamic control", "1 to 10 kV" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "U", "M", "U", "U", "M", "R" ]
740
The Malaysian model
Japan's first third generation service, Foma, is unlikely to be truly attractive to consumers until 2005. That still falls well within the financial planning of its operator Docomo. But where does that leave European 3G operators looking for reassurance? Malaysia, says Simon Marshall
[ "Malaysia", "3G operators", "Maxis Communications", "Telekom Malaysia" ]
[ "P", "P", "U", "M" ]
1315
Traffic engineering with traditional IP routing protocols
Traffic engineering involves adapting the routing of traffic to network conditions, with the joint goals of good user performance and efficient use of network resources. We describe an approach to intradomain traffic engineering that works within the existing deployed base of interior gateway protocols, such as Open Shortest Path First and Intermediate System-Intermediate System. We explain how to adapt the configuration of link weights, based on a networkwide view of the traffic and topology within a domain. In addition, we summarize the results of several studies of techniques for optimizing OSPF/IS-IS weights to the prevailing traffic. The article argues that traditional shortest path routing protocols are surprisingly effective for engineering the flow of traffic in large IP networks
[ "IP routing protocols", "network conditions", "user performance", "network resources", "intradomain traffic engineering", "interior gateway protocols", "OSPF/IS-IS weights", "shortest path routing protocols", "IP networks", "link weights configuration", "traffic routing", "network topology", "TCP", "transmission control protocol", "Open Shortest Path First protocol", "Intermediate System-Intermediate System protocol" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "U", "M", "R", "R" ]
1350
Generalized mosaicing: wide field of view multispectral imaging
We present an approach to significantly enhance the spectral resolution of imaging systems by generalizing image mosaicing. A filter transmitting spatially varying spectral bands is rigidly attached to a camera. As the system moves, it senses each scene point multiple times, each time in a different spectral band. This is an additional dimension of the generalized mosaic paradigm, which has demonstrated yielding high radiometric dynamic range images in a wide field of view, using a spatially varying density filter. The resulting mosaic represents the spectrum at each scene point. The image acquisition is as easy as in traditional image mosaics. We derive an efficient scene sampling rate, and use a registration method that accommodates the spatially varying properties of the filter. Using the data acquired by this method, we demonstrate scene rendering under different simulated illumination spectra. We are also able to infer information about the scene illumination. The approach was tested using a standard 8-bit black/white video camera and a fixed spatially varying spectral (interference) filter
[ "generalized mosaicing", "wide field of view multispectral imaging", "spatially varying spectral bands", "spatially varying density filter", "image acquisition", "scene sampling rate", "registration method", "scene rendering", "simulated illumination spectra", "scene illumination", "hyperspectral imaging", "color balance", "image fusion", "physics-based vision", "image-based rendering" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "U", "M", "U", "M" ]
896
Calculation of the probability of survival of an insurance company with allowance for the rate of return for a Poisson stream of premiums
The probability of survival of an insurance company with the working capital is calculated for a Poisson stream of premiums
[ "insurance company", "survival probability", "return rate", "Poisson premium stream", "probability density function" ]
[ "P", "R", "R", "R", "M" ]
1014
Modelling of complete robot dynamics based on a multi-dimensional, RBF-like neural architecture
A neural network based identification approach of manipulator dynamics is presented. For a structured modelling, RBF-like static neural networks are used in order to represent and adapt all model parameters with their non-linear dependences on the joint positions. The neural architecture is hierarchically organised to reach optimal adjustment to structural a priori knowledge about the identification problem. The model structure is substantially simplified by general system analysis independent of robot type, while many specific features of the utilised experimental robot are also taken into account. A fixed, grid based neuron placement together with application of B-spline polynomial basis functions is utilised favourably for a very effective recursive implementation of the neural architecture. Thus, an online identification of a dynamic model is presented for a complete 6 joint industrial robot
[ "complete robot dynamics", "neural architecture", "manipulator dynamics", "static neural networks", "general system analysis", "B-spline polynomial basis functions", "recursive implementation", "online identification", "dynamic model", "complete 6 joint industrial robot", "multi-dimensional RBF-like neural architecture", "fixed grid based neuron placement", "online learning" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "M" ]
1051
Faking it: simulating dependent types in Haskell
Dependent types reflect the fact that validity of data is often a relative notion by allowing prior data to affect the types of subsequent data. Not only does this make for a precise type system, but also a highly generic one: both the type and the program for each instance of a family of operations can be computed from the data which codes for that instance. Recent experimental extensions to the Haskell type class mechanism give us strong tools to relativize types to other types. We may simulate some aspects of dependent typing by making counterfeit type-level copies of data, with type constructors simulating data constructors and type classes simulating datatypes. This paper gives examples of the technique and discusses its potential
[ "dependent types", "dependent types", "Haskell", "precise type system", "type class mechanism", "counterfeit type-level copies", "type constructors", "data constructors", "datatypes", "data validity", "dependent typing", "functional programming" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "M" ]
1109
The existence condition of gamma-acyclic database schemes with MVDs constraints
It is very important to use database technology for large-scale systems such as ERP and MIS. A good database design may improve the performance of the system. Research shows that a gamma-acyclic database scheme has many good properties; for example, each connected join expression is monotonic, which helps to improve the query performance of the database system. What conditions, then, are needed to generate a gamma-acyclic database scheme for a given relational scheme? In this paper, the sufficient and necessary condition for the existence of gamma-acyclic, join-lossless and dependency-preserving database schemes meeting 4NF is given
[ "existence condition", "gamma-acyclic database schemes", "MVDs constraints", "database technology", "large-scale system", "connected join expression", "query performance", "sufficient and necessary condition" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
933
Real-time estimations of multi-modal frequencies for smart structures
In this paper, various methods for the real-time estimation of multi-modal frequencies are implemented and compared through numerical and experimental tests. These parameter-based frequency estimation methods can be applied to various engineering fields such as communications, radar and adaptive vibration and noise control. Well-known frequency estimation methods are introduced and explained. The Bairstow method is introduced to find the roots of a characteristic equation for the estimation of multi-modal frequencies, and the computational efficiency of the Bairstow method is shown quantitatively. For a simple numerical test, we consider two sinusoids of the same amplitude mixed with various amounts of white noise. The test results show that the auto regressive (AR) and auto regressive and moving average (ARMA) methods are unsuitable in noisy environments. The other methods, apart from the AR method, have fast tracking capability. From the point of view of computational efficiency, the results reveal that the ARMA method is inefficient, while the cascade notch filter method is very effective. The linearized adaptive notch filter and recursive maximum likelihood methods have average performance. Experimental tests are devised to confirm the feasibility of real-time computations and to impose the severe conditions of drastically different amplitudes and of considerable changes of natural frequencies. We have performed experiments to extract the natural frequencies from the vibration signal of wing-like composite plates in real time. The natural frequencies of the specimen are changed by added masses. Notably, the AR method exhibits remarkable performance in spite of the severe conditions. This study will be helpful to anyone who needs a frequency estimation algorithm for real-time applications
[ "real-time estimation", "multi-modal frequencies", "smart structures", "frequency estimation", "noise control", "Bairstow method", "characteristic equation", "ARMA", "cascade notch filter", "linearized adaptive notch filter", "recursive maximum likelihood methods", "real-time computations", "vibration signal", "wing-like composite plates", "frequency estimation algorithm", "real-time applications", "adaptive vibration control", "auto regressive and moving average methods" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
976
Completion to involution and semidiscretisations
We discuss the relation between the completion to involution of linear over-determined systems of partial differential equations with constant coefficients and the properties of differential algebraic equations obtained by their semidiscretisation. For a certain class of "weakly over-determined" systems, we show that the differential algebraic equations do not contain hidden constraints, if and only if the original partial differential system is involutive. We also demonstrate how the formal theory can be used to obtain an existence and uniqueness theorem for smooth solutions of strongly hyperbolic systems and to estimate the drift off the constraints, if an underlying equation is numerically solved. Finally, we show for general linear systems how the index of differential algebraic equations obtained by semidiscretisations can be predicted from the result of a completion of the partial differential system
[ "completion", "involution", "semidiscretisations", "linear over-determined systems", "partial differential equations", "constant coefficients", "differential algebraic equations", "uniqueness theorem", "strongly hyperbolic systems", "index", "matrices" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U" ]
135
Hysteretic threshold logic and quasi-delay insensitive asynchronous design
We introduce the class of hysteretic linear-threshold (HLT) logic functions as a novel extension of linear threshold logic, and prove their general applicability for constructing state-holding Boolean functions. We then demonstrate a fusion of HLT logic with the quasi-delay insensitive style of asynchronous circuit design, complete with logical design examples. Future research directions are also identified
[ "state-holding Boolean functions", "HLT logic", "quasi-delay insensitive style", "asynchronous circuit design", "logic design", "hysteretic linear-threshold logic functions", "digital logic", "CMOS implementation" ]
[ "P", "P", "P", "P", "P", "R", "M", "U" ]
1208
A Virtual Test Facility for the simulation of dynamic response in materials
The Center for Simulating Dynamic Response of Materials at the California Institute of Technology is constructing a virtual shock physics facility for studying the response of various target materials to very strong shocks. The Virtual Test Facility (VTF) is an end-to-end, fully three-dimensional simulation of the detonation of high explosives (HE), shock wave propagation, solid material response to pressure loading, and compressible turbulence. The VTF largely consists of a parallel fluid solver and a parallel solid mechanics package that are coupled together by the exchange of boundary data. The Eulerian fluid code and Lagrangian solid mechanics model interact via a novel approach based on level sets. The two main computational packages are integrated through the use of Pyre, a problem solving environment written in the Python scripting language. Pyre allows application developers to interchange various computational models and solver packages without recompiling code, and it provides standardized access to several data visualization engines and data input mechanisms. In this paper, we outline the main components of the VTF, discuss their integration via Pyre, and describe some recent accomplishments in large-scale simulation using the VTF
[ "Virtual Test Facility", "virtual shock physics facility", "high explosives", "shock wave propagation", "solid material response", "pressure loading", "compressible turbulence", "parallel fluid solver", "parallel solid mechanics", "Pyre", "problem solving environment", "Python scripting language", "data visualization", "shock physics simulation" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
67
Metaschemas for ER, ORM and UML data models: a comparison
This paper provides metaschemas for some of the main database modeling notations used in industry. Two Entity Relationship (ER) notations (Information Engineering and Barker) are examined in detail, as well as Object Role Modeling (ORM) conceptual schema diagrams. The discussion of optionality, cardinality and multiplicity is widened to include Unified Modeling Language (UML) class diagrams. Issues addressed in the metamodel analysis include the normalization impact of non-derived constraints on derived associations, the influence of orthogonality on language transparency, and trade-offs between simplicity and expressibility. To facilitate comparison, the same modeling notation is used to display each metaschema. For this purpose, ORM is used because of its greater expressibility and clarity
[ "metaschemas", "ORM", "UML", "data models", "database modeling notations", "Information Engineering", "Object Role Modeling", "conceptual schema diagrams", "optionality", "cardinality", "multiplicity", "Unified Modeling Language", "class diagrams", "normalization", "orthogonality", "language transparency", "Entity Relationship modeling", "Barker notation" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
618
Blind source separation applied to image cryptosystems with dual encryption
Blind source separation (BSS) is explored to add another encryption level besides the existing encryption methods for image cryptosystems. The transmitted images are covered with a noise image by specific mixing before encryption and then recovered through BSS after decryption. Simulation results illustrate the validity of the proposed method
[ "blind source separation", "image cryptosystems", "dual encryption", "transmitted images", "noise image" ]
[ "P", "P", "P", "P", "P" ]
108
Exploiting randomness in quantum information processing
We consider how randomness can be made to play a useful role in quantum information processing, in particular for decoherence control and the implementation of quantum algorithms. For a two-level system in which the decoherence channel is non-dissipative, we show that decoherence suppression is possible if memory is present in the channel. Random switching between two potentially harmful noise sources can then provide a source of stochastic control. Such random switching can also be used advantageously in the implementation of quantum algorithms
[ "randomness", "quantum information processing", "decoherence control", "quantum algorithms", "two-level system", "random switching", "noise", "stochastic control" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
1270
A comparison of different decision algorithms used in volumetric storm cells classification
Decision algorithms useful in classifying meteorological volumetric radar data are discussed. Such data come from the radar decision support system (RDSS) database of Environment Canada and concern summer storms that formed in that country. Several research groups have used the data compiled by RDSS to verify the utility of chosen methods in volumetric storm cells classification. The paper reviews the experiments that were performed on the data from the RDSS database of Environment Canada and presents the quality of the particular classifiers, expressed by the classification accuracy coefficient. For the five research groups that conducted their experiments in a similar way, it was possible to compare the resulting outputs. The experiments showed that the support vector machine (SVM) method and rough set algorithms which use object oriented reducts for rule generation perform better than other classifiers in classifying volumetric storm data
[ "decision algorithms", "volumetric storm cells classification", "meteorological volumetric radar data", "radar decision support system", "summer storms", "classification accuracy", "support vector machine", "rough set algorithms", "object oriented reducts" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1235
Finding performance bugs with the TNO HPF benchmark suite
High-Performance Fortran (HPF) has been designed to provide portable performance on distributed memory machines. An important aspect of portable performance is the behavior of the available HPF compilers. Ideally, a programmer may expect comparable performance between different HPF compilers, given the same program and the same machine. To test the performance portability between compilers, we have designed a special benchmark suite, called the TNO HPF benchmark suite. It consists of a set of HPF programs that test various aspects of efficient parallel code generation. The benchmark suite consists of a number of template programs that are used to generate test programs with different array sizes, alignments, distributions, and iteration spaces. It ranges from very simple assignments to more complex assignments such as triangular iteration spaces, convex iteration spaces, coupled subscripts, and indirection arrays. We have run the TNO HPF benchmark suite on three compilers: the PREPARE prototype compiler, the PGI-HPF compiler, and the GMD Adaptor HPF compiler. Results show performance differences that can be quite large (up to two orders of magnitude for the same test program). Closer inspection reveals that the origin of most of the differences in performance is due to differences in local enumeration and storage of distributed array elements
[ "benchmark suite", "High-Performance Fortran", "portable performance", "distributed memory machines", "HPF compilers", "performance portability", "parallel compilers", "compiler optimizations" ]
[ "P", "P", "P", "P", "P", "P", "R", "M" ]
660
At your service [agile businesses]
Senior software executives from three of the world's leading software companies, and one smaller, entrepreneurial software developer, explain the impact that web services, business process management and integrated application architectures are having on their product development plans, and share their vision of the roles these products will play in creating agile businesses
[ "agile businesses", "software companies", "web services", "business process management", "integrated application architectures" ]
[ "P", "P", "P", "P", "P" ]
625
Identifying multivariate discordant observations: a computer-intensive approach
The problem of identifying multiple outliers in a multivariate normal sample is approached via successive testing using P-values rather than tabled critical values. Caroni and Prescott (Appl. Statist. 41, p.355, 1992) proposed a generalization of the EDR-ESD procedure of Rosner (Technometrics, 25, 1983). Venter and Viljoen (Comput. Statist. Data Anal. 29, p.261, 1999) introduced a computer intensive method to identify outliers in a univariate outlier situation. We now generalize this method to the multivariate outlier situation and compare the new procedure with that of Caroni and Prescott (Appl. Statist. 41, p.355, 1992)
[ "multivariate discordant observations", "computer-intensive approach", "multiple outliers", "multivariate normal sample", "P-values", "tabled critical values", "univariate outlier", "multivariate outlier", "data analysis", "EDR-ESD procedure", "stepwise testing approach" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "M", "M", "M" ]
1171
Manufacturing data analysis of machine tool errors within a contemporary small manufacturing enterprise
The paper focuses on the determination of manufacturing errors within the contemporary small manufacturing enterprise sector. The manufacturing error diagnosis is achieved through the manufacturing data analysis of the results obtained from the inspection of the component on a co-ordinate measuring machine. This manufacturing data analysis activity adopts a feature-based approach and is conducted through the application of a forward chaining expert system, called the product data analysis distributed diagnostic expert system, which forms part of a larger prototype feedback system entitled the production data analysis framework. The paper introduces the manufacturing error categorisations associated with milling type operations, knowledge acquisition and representation, and the conceptual structure and operating procedure of the prototype manufacturing data analysis facility. The paper concludes with a brief evaluation of the logic employed, through the simulation of manufacturing error scenarios. This prototype manufacturing data analysis expert system provides a valuable aid for the rapid diagnosis and elimination of manufacturing errors on a 3-axis vertical machining centre in an environment where operator expertise is limited
[ "manufacturing data analysis", "machine tool errors", "contemporary small manufacturing enterprise", "inspection", "co-ordinate measuring machine", "feature-based approach", "forward chaining expert system", "product data analysis distributed diagnostic expert system", "milling type operations", "knowledge acquisition", "conceptual structure", "operating procedure", "3-axis vertical machining centre", "fixturing errors", "programming errors", "2 1/2D components", "knowledge representation" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "M", "M", "R" ]
1134
Relationship between strong monotonicity property, P_2-property, and the GUS-property in semidefinite linear complementarity problems
In a recent paper on semidefinite linear complementarity problems, Gowda and Song (2000) introduced and studied the P-property, P_2-property, GUS-property, and strong monotonicity property for linear transformation L: S^n to S^n, where S^n is the space of all symmetric and real n * n matrices. In an attempt to characterize the P_2-property, they raised the following two questions: (i) does the strong monotonicity imply the P_2-property? (ii) does the GUS-property imply the P_2-property? In this paper, we show that the strong monotonicity property implies the P_2-property for any linear transformation and describe an equivalence between these two properties for Lyapunov and other transformations. We show by means of an example that the GUS-property need not imply the P_2-property, even for Lyapunov transformations
[ "strong monotonicity property", "P_2-property", "GUS-property", "semidefinite linear complementarity problems", "linear transformation", "Lyapunov transformations", "symmetric real matrices" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
561
SubSeven's Honey Pot program
A serious security threat today is posed by malicious executables, especially new, unseen malicious executables that often arrive as email attachments. These new malicious executables are created at the rate of thousands every year and pose a serious threat. Current anti-virus systems attempt to detect these new malicious programs with heuristics generated by hand. This approach is costly and often ineffective. We introduce the Trojan Horse SubSeven, its capabilities, and its influence over intrusion detection systems. A Honey Pot program is implemented, simulating the SubSeven server. The Honey Pot program provides feedback and stores data to and from the SubSeven client
[ "SubSeven", "Honey Pot program", "security threat", "malicious executables", "email attachments", "anti-virus systems", "Trojan Horse", "intrusion detection systems" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
1390
Data quality - unlocking the ROI in CRM
While many organisations realise their most valuable asset is their customers, many more fail to realise the importance of auditing, maintaining and updating the information contained in their customer databases. Today's growing awareness of the importance of data quality in relation to CRM and ROI will help change this attitude. In response, CRM vendors will follow suit and begin to differentiate themselves by offering data quality as part of an enterprise-wide data management methodology
[ "CRM", "customer databases", "data management", "customer relationships", "return on investment" ]
[ "P", "P", "P", "M", "U" ]
780
Failures and successes: notes on the development of electronic cash
Between 1997 and 2001, two mid-sized communities in Canada hosted North America's most comprehensive experiment to introduce electronic cash and, in the process, replace physical cash for casual, low-value payments. The technology used was Mondex, and its implementation was supported by all the country's major banks. It was launched with an extensive publicity campaign to promote Mondex not only in the domestic but also in the global market, for which the Canadian implementation was to serve as a "showcase." However, soon after the start of the first field test it became apparent that the new technology did not work smoothly. On the contrary, it created a host of controversies, in areas as varied as computer security, consumer privacy, and monetary policy. In the following years, few of these controversies could be resolved and Mondex could not be established as a widely used payment mechanism. In 2001, the experiment was finally terminated. Using the concepts developed in recent science and technology studies (STS), the article analyzes these controversies as resulting from the difficulties of fitting electronic cash, a new sociotechnical system, into the complex setting of the existing payment system. The story of Mondex not only offers lessons on why technologies fail, but also offers insight into how short-term failures can contribute to long-term transformations. This suggests the need to rethink the dichotomy of success and failure
[ "electronic cash", "Canada", "low-value payments", "Mondex", "major banks", "publicity campaign", "global market", "Canadian implementation", "computer security", "consumer privacy", "monetary policy", "payment mechanism", "science and technology studies", "sociotechnical system", "short-term failures", "long-term transformations" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1029
Effect of insulation layer on transcribability and birefringence distribution in optical disk substrate
As the need for information storage media with high storage density increases, digital video disks (DVDs) with smaller recording marks and thinner optical disk substrates than those of conventional DVDs are being required. Therefore, improving the replication quality of land-groove or pit structure and reducing the birefringence distribution are emerging as important criteria in the fabrication of high-density optical disk substrates. We control the transcribability and distribution of birefringence by inserting an insulation layer under the stamper during injection-compression molding of DVD RAM substrates. The effects of the insulation layer on the geometrical and optical properties, such as transcribability and birefringence distribution, are examined experimentally. The inserted insulation layer is found to be very effective in improving the quality of replication and leveling out the first peak of the gapwise birefringence distribution near the mold wall and reducing the average birefringence value, because the insulation layer retarded the growth of the solidified layer
[ "insulation layer", "transcribability", "birefringence distribution", "optical disk substrate", "information storage media", "high storage density", "digital video disks", "smaller recording marks", "thinner optical disk substrates", "replication quality", "land-groove", "pit structure", "fabrication", "stamper", "injection-compression molding", "DVD RAM substrates", "optical properties", "gapwise birefringence distribution", "mold wall", "geometrical properties", "solidified layer growth retardation", "polyimide thermal insulation layer" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "M" ]