diff --git "a/dev.jsonl" "b/dev.jsonl"
--- "a/dev.jsonl"
+++ "b/dev.jsonl"
@@ -1,500 +1,500 @@
-{"id": "1833", "title": "British Standard 7666 as a framework for geocoding land and property information in the UK", "abstract": "The article examines the role of British Standard 7666 in the development of a national framework for geocoding land and property information in the United Kingdom. The author assesses how local authorities, and other agencies concerned with property and address datasets, are coping with the introduction of British Standard 7666, and examines the prospects and limitations of this development. British Standard 7666 has four parts, comprising specifications for street gazetteer; land and property gazetteer; addresses; and public rights of way. The organisation coordinating the introduction of British Standard 7666, the Improvement and Development Agency (IDeA), is also overseeing the development and maintenance of a National Land and Property Gazetteer (NLPG) based on British Standard 7666. 
The introduction of the new addressing standard has mainly been prompted by Britain's effort to set up a national cadastral service to replace the obsolescent property registration system currently in place", "keyphrases": ["British Standard 7666", "geocoding", "property information", "land information", "UK", "national framework", "United Kingdom", "local authorities", "address datasets", "property datasets", "street gazetteer", "property gazetteer", "land gazetteer", "addresses", "public rights of way", "Improvement and Development Agency", "IDeA", "National Land and Property Gazetteer", "NLPG", "addressing standard", "national cadastral service", "property registration system", "land information systems"]} -{"id": "1876", "title": "The development of CASC [automated theorem proving]", "abstract": "Researchers who make theoretical advances also need some way to demonstrate that an advance really does have general, overall positive consequences for system performance. For this it is necessary to evaluate the system on a set of problems that is sufficiently large and diverse to be somehow representative of the intended application area as a whole. It is only a small step from system evaluation to a communal system competition. The CADE ATP System Competition (CASC) has been run annually since 1996. Any competition is difficult to design and organize in the first instance, and to then run over the years. In order to obtain the full benefits of a competition, a thoroughly organized event, with an unambiguous and motivated design, is necessary. For some issues relevant to the CASC design, inevitable constraints have emerged. For other issues there have been several choices, and decisions have had to be made. 
This paper describes the evolution of CASC, paying particular attention to its design, design changes, and organization", "keyphrases": ["CASC", "system performance", "system evaluation", "automated deduction", "CADE ATP System Competition", "automated theorem proving", "AI", "artificial intelligence", "classical first order logic"]} -{"id": "1605", "title": "A GRASP heuristic for the mixed Chinese postman problem", "abstract": "Arc routing problems (ARPs) consist of finding a traversal on a graph satisfying some conditions related to the links of the graph. In the Chinese postman problem (CPP) the aim is to find a minimum cost tour (closed walk) traversing all the links of the graph at least once. Both the Undirected CPP, where all the links are edges that can be traversed in both ways, and the Directed CPP, where all the links are arcs that must be traversed in a specified way, are known to be polynomially solvable. However, if we deal with a mixed graph (having edges and arcs), the problem turns out to be NP-hard. In this paper, we present a heuristic algorithm for this problem, the so-called Mixed CPP (MCPP), based on greedy randomized adaptive search procedure (GRASP) techniques. The algorithm has been tested and compared with other known and recent methods from the literature on a wide collection of randomly generated instances, with up to 200 nodes and 600 links, producing encouraging computational results. 
As far as we know, this is the best heuristic algorithm for the MCPP, with respect to solution quality, published up to now", "keyphrases": ["mixed Chinese postman problem", "GRASP heuristic", "arc routing problems", "graph traversal", "minimum cost tour", "closed walk", "NP-hard problem", "heuristic algorithm", "greedy randomized adaptive search procedure", "optimization problems", "metaheuristics"]} -{"id": "1640", "title": "Integration is LIMS inspiration", "abstract": "For software manufacturers, blessings come in the form of fast-moving application areas. In the case of LIMS, biotechnology is still in the driving seat, inspiring developers to maintain consistently rapid and creative levels of innovation. Current advancements are no exception. Integration and linking initiatives are still popular and much of the activity appears to be coming from a very productive minority", "keyphrases": ["software manufacturers", "LIMS", "biotechnology"]} -{"id": "151", "title": "Extending CTL with actions and real time", "abstract": "In this paper, we present the logic ATCTL, which is intended to be used for model checking models that have been specified in a lightweight version of the Unified Modelling Language (UML). Elsewhere, we have defined a formal semantics for LUML to describe the models. This paper's goal is to give a specification language for properties that fits LUML; LUML includes states, actions and real time. ATCTL extends CTL with concurrent actions and real time. It is based on earlier extensions of CTL by R. De Nicola and F. Vaandrager (ACTL) (1990) and R. Alur et al. (TCTL) (1993). This makes it easier to adapt existing model checkers to ATCTL. 
To show that we can check properties specified in ATCTL in models specified in LUML, we give a small example using the Kronos model checker", "keyphrases": ["actions", "real time logic", "logic ATCTL", "model checking models", "Unified Modelling Language", "formal semantics", "specification language", "Kronos model checker", "computation tree logic"]} -{"id": "1504", "title": "Designing human-centered distributed information systems", "abstract": "Many computer systems are designed according to engineering and technology principles and are typically difficult to learn and use. The fields of human-computer interaction, interface design, and human factors have made significant contributions to ease of use and are primarily concerned with the interfaces between systems and users, not with the structures that are often more fundamental for designing truly human-centered systems. The emerging paradigm of human-centered computing (HCC)-which has taken many forms-offers a new look at system design. HCC requires more than merely designing an artificial agent to supplement a human agent. The dynamic interactions in a distributed system composed of human and artificial agents-and the context in which the system is situated-are indispensable factors. 
This article presents a methodology for designing human-centered computing systems using electronic medical records (EMR) systems; we have successfully applied this methodology in designing a prototype of a human-centered intelligent flight-surgeon console at NASA Johnson Space Center", "keyphrases": ["human-centered distributed information systems design", "distributed cognition", "artificial agents", "human agents", "multiple analysis levels", "human-computer interaction", "interface design", "human factors", "human-centered computing systems", "human-centered intelligent flight surgeon console", "NASA Johnson Space Center", "electronic medical records systems"]} -{"id": "1541", "title": "The AT89C51/52 flash memory programmers", "abstract": "When faced with a plethora of applications to design, it's essential to have a versatile microcontroller in hand. The author describes the AT89C51/52 microcontrollers. To get you started, he'll describe his inexpensive microcontroller programmer", "keyphrases": ["AT89C51/52", "flash memory programmers", "microcontrollers", "device programmer", "microcontroller programmer"]} -{"id": "1680", "title": "Minimizing weighted number of early and tardy jobs with a common due window involving location penalty", "abstract": "Studies a single machine scheduling problem to minimize the weighted number of early and tardy jobs with a common due window. There are n non-preemptive and simultaneously available jobs. Each job will incur an early (tardy) penalty if it is early (tardy) with respect to the common due window under a given schedule. The window size is a given parameter but the window location is a decision variable. The objective of the problem is to find a schedule that minimizes the weighted number of early and tardy jobs and the location penalty. We show that the problem is NP-complete in the ordinary sense and develop a dynamic programming based pseudo-polynomial algorithm. 
We conduct computational experiments, the results of which show that the performance of the dynamic algorithm is very good in terms of memory requirement and CPU time. We also provide polynomial time algorithms for two special cases", "keyphrases": ["early jobs", "tardy jobs", "common due window", "single machine scheduling problem", "decision variable", "location penalty", "NP-complete problem", "dynamic programming", "pseudo-polynomial algorithm"]} -{"id": "1539", "title": "Comments on some recent methods for the simultaneous determination of polynomial zeros", "abstract": "In this note we give some comments on the recent results concerning a simultaneous method of the fourth-order for finding complex zeros in circular interval arithmetic. The main discussion is directed to a rediscovered iterative formula and its modification, presented recently in Sun and Kosmol, (2001). The presented comments include some critical parts of the papers Petkovic, Trickovic, Herceg, (1998) and Sun and Kosmol, (2001) which treat the same subject", "keyphrases": ["polynomial", "zeros", "complex zeros", "circular interval arithmetic", "iterative formula"]} -{"id": "1913", "title": "A six-degree-of-freedom precision motion stage", "abstract": "This article presents the design and performance evaluation of a six-degree-of-freedom piezoelectrically actuated fine motion stage that will be used for three dimensional error compensation of a long-range translation mechanism. Development of a single element, piezoelectric linear displacement actuator capable of translations of 1.67 mu m with 900 V potential across the electrodes and under a 27.4 N axial load and 0.5 mm lateral distortion is presented. Finite element methods have been developed and used to evaluate resonant frequencies of the stage platform and the complete assembly with and without a platform payload. 
In general, an error of approximately 10.0% between the finite element results and the experimentally measured values was observed. The complete fine motion stage provided approximately +or-0.93 mu m of translation and +or-38.0 mu rad of rotation in all three planes of motion using an excitation range of 1000 V. An impulse response indicating a fundamental mode resonance at 162 Hz was measured with a 0.650 kg payload rigidly mounted to the top of the stage", "keyphrases": ["six-degree-of-freedom precision motion stage", "performance evaluation", "design", "piezoelectrically actuated fine motion stage", "3D error compensation", "long-range translation mechanism", "single element piezoelectric linear displacement actuator", "finite element methods", "resonant frequency", "stage platform", "platform payload", "impulse response", "fundamental mode resonance", "1.67 micron", "900 V", "1000 V", "0.93 to -0.93 micron", "162 Hz", "650.0 gm"]} -{"id": "191", "title": "Linear, parameter-varying control and its application to a turbofan engine", "abstract": "This paper describes application of parameter-dependent control design methods to a turbofan engine. Parameter-dependent systems are linear systems, whose state-space descriptions are known functions of time-varying parameters. The time variation of each of the parameters is not known in advance, but is assumed to be measurable in real-time. Three linear, parameter-varying (LPV) approaches to control design are discussed. The first method is based on linear fractional transformations which relies on the small gain theorem for bounds on performance and robustness. The other methods make use of either a single (SQLF) or parameter-dependent (PDQLF) quadratic Lyapunov function to bound the achievable level of performance. The latter two techniques are used to synthesize controllers for a high-performance turbofan engine. 
A LPV model of the turbofan engine is constructed from Jacobian linearizations at fixed power codes for control design. The control problem is formulated as a model matching problem in the H/sub infinity / and LPV framework. The objective is decoupled command response of the closed-loop system to pressure and rotor speed requests. The performance of linear, H/sub infinity / point designs are compared with the SQLF and PDQLF controllers. Nonlinear simulations indicate that the controller synthesized using the SQLF approach is slightly more conservative than the PDQLF controller. Nonlinear simulations with the SQLF and PDQLF controllers show very robust designs that achieve all desired performance objectives", "keyphrases": ["turbofan engine", "linear parameter-varying control", "parameter-dependent control design methods", "state-space descriptions", "time-varying parameters", "linear fractional transformations", "small gain theorem", "performance bounds", "robustness bounds", "single quadratic Lyapunov function", "parameter-dependent quadratic Lyapunov function", "Jacobian linearizations", "decoupled command response", "closed-loop system", "model matching problem", "H/sub infinity / framework", "nonlinear simulations", "very robust designs"]} -{"id": "1638", "title": "The chemical brotherhood", "abstract": "It has always been more difficult for chemistry to keep up in the Internet age but a new language could herald a new era for the discipline. The paper discusses CML, or chemical mark-up language. The eXtensible Mark-up Language provides a universal format for structured documents and data on the Web and so offers a way for scientists and others to carry a wide range of information types across the net in a transparent way. 
All that is needed is an XML browser", "keyphrases": ["chemistry", "Internet", "CML", "chemical mark-up language", "eXtensible Mark-up Language", "structured document format", "World Wide Web", "XML browser"]} -{"id": "1760", "title": "Dihedral congruence primes and class fields of real quadratic fields", "abstract": "We show that for a real quadratic field F the dihedral congruence primes with respect to F for cusp forms of weight k and quadratic nebentypus are essentially the primes dividing expressions of the form epsilon /sub +//sup k-1/+or-1 where epsilon /sub +/ is a totally positive fundamental unit of F. This extends work of Hida. Our results allow us to identify a family of (ray) class fields of F which are generated by torsion points on modular abelian varieties", "keyphrases": ["dihedral congruence primes", "class fields", "real quadratic fields", "quadratic nebentypus", "torsion points", "modular abelian varieties", "class field theory"]} -{"id": "1725", "title": "Cutting the cord [wireless health care]", "abstract": "More and more healthcare executives are electing to cut the cord to their existing computer systems by implementing mobile technology. The allure of information anywhere, anytime is intoxicating, demonstrated by the cell phones and personal digital assistants (PDAs) that adorn today's professionals. The utility and convenience of these devices is undeniable. But what is the best strategy for implementing a mobile solution within a healthcare enterprise, be it large or small-and under what circumstances? What types of healthcare workers benefit most from mobile technology? And how state-of-the-art is security for wireless applications and devices? 
These are the questions that healthcare executives are asking-and should be asking-as they evaluate mobile solutions", "keyphrases": ["healthcare", "mobile computing", "wireless computing", "security"]} -{"id": "1461", "title": "Adaptive multiresolution approach for solution of hyperbolic PDEs", "abstract": "This paper establishes an innovative and efficient multiresolution adaptive approach combined with high-resolution methods, for the numerical solution of a single or a system of partial differential equations. The proposed methodology is unconditionally bounded (even for hyperbolic equations) and dynamically adapts the grid so that higher spatial resolution is automatically allocated to domain regions where strong gradients are observed, thus possessing the two desired properties of a numerical approach: stability and accuracy. Numerical results for five test problems are presented which clearly show the robustness and cost effectiveness of the proposed method", "keyphrases": ["multiresolution adaptive approach", "high-resolution methods", "numerical solution", "hyperbolic partial differential equations", "dynamic grid adaptation", "unconditionally bounded methodology", "spatial resolution", "strong gradients", "stability", "accuracy", "robustness", "cost effectiveness"]} -{"id": "1659", "title": "Mobile commerce: transforming the vision into reality", "abstract": "This editorial preface investigates current developments in mobile commerce (M-commerce) and proposes an integrated architecture that supports business and consumer needs in an optimal way to successfully implement M-commerce business processes. The key line of thought is based on the heuristic observation that customers will not want to receive M-commerce offerings to their mobile telephones. As a result, a pull as opposed to a push approach becomes a necessary requirement to conduct M-commerce. 
In addition, M-commerce has to rely on local, regional, demographic and many other variables to be truly effective. Both observations necessitate an M-commerce architecture that allows the coherent integration of enterprise-level systems as well as the aggregation of product and service offerings from many different and partially competing parties into a collaborative M-commerce platform. The key software component within this integrated architecture is an event management engine to monitor, detect, store, process and measure information about outside events that are relevant to all participants in M-commerce", "keyphrases": ["M-commerce", "mobile commerce", "integrated architecture", "consumer needs", "business needs", "mobile telephones", "pull approach", "collaborative platform", "event management engine"]} -{"id": "148", "title": "Axioms for branching time", "abstract": "Logics of general branching time, or historical necessity, have long been studied but important axiomatization questions remain open. Here the difficulties of finding axioms for such logics are considered and ideas for solving some of the main open problems are presented. A new, more expressive logical account is also given to support Peirce's prohibition on truth values being attached to the contingent future", "keyphrases": ["axioms", "branching time", "truth values", "temporal logic"]} -{"id": "1558", "title": "Orthogonality of the Jacobi polynomials with negative integer parameters", "abstract": "It is well known that the Jacobi polynomials P/sub n//sup ( alpha , beta )/(x) are orthogonal with respect to a quasi-definite linear functional whenever alpha , beta , and alpha + beta + 1 are not negative integer numbers. Recently, Sobolev orthogonality for these polynomials has been obtained for alpha a negative integer and beta not a negative integer and also for the case alpha = beta negative integer numbers. 
In this paper, we give a Sobolev orthogonality for the Jacobi polynomials in the remainder cases", "keyphrases": ["orthogonality", "quasi-definite linear functional", "Sobolev orthogonality", "Jacobi polynomials", "negative integer parameters"]} -{"id": "1892", "title": "Closed-loop persistent identification of linear systems with unmodeled dynamics and stochastic disturbances", "abstract": "The essential issues of time complexity and probing signal selection are studied for persistent identification of linear time-invariant systems in a closed-loop setting. By establishing both upper and lower bounds on identification accuracy as functions of the length of observation, size of unmodeled dynamics, and stochastic disturbances, we demonstrate the inherent impact of unmodeled dynamics on identification accuracy, reduction of time complexity by stochastic averaging on disturbances, and probing capability of full rank periodic signals for closed-loop persistent identification. These findings indicate that the mixed formulation, in which deterministic uncertainty of system dynamics is blended with random disturbances, is beneficial to reduction of identification complexity", "keyphrases": ["closed-loop persistent identification", "unmodeled dynamics", "linear time-invariant systems", "upper bounds", "lower bounds", "identification accuracy", "full rank periodic signals", "stochastic disturbances", "time complexity", "probing signal selection"]} -{"id": "1744", "title": "Convergence of Toland's critical points for sequences of DC functions and application to the resolution of semilinear elliptic problems", "abstract": "We prove that if a sequence (f/sub n/)/sub n/ of DC functions (difference of two convex functions) converges to a DC function f in some appropriate way and if u/sub n/ is a critical point of f/sub n/, in the sense described by Toland (1978, 1979), and is such that (u/sub n/)/sub n/ converges to u, then u is a critical point of f, still in Toland's sense. 
We also build a new algorithm which searches for this critical point u and then apply it in order to compute the solution of a semilinear elliptic equation", "keyphrases": ["critical point convergence", "DC function sequences", "semilinear elliptic problems", "convex function difference", "semilinear elliptic equation"]} -{"id": "1701", "title": "Estimation of 3-D left ventricular deformation from medical images using biomechanical models", "abstract": "The quantitative estimation of regional cardiac deformation from three-dimensional (3-D) image sequences has important clinical implications for the assessment of viability in the heart wall. We present here a generic methodology for estimating soft tissue deformation which integrates image-derived information with biomechanical models, and apply it to the problem of cardiac deformation estimation. The method is image modality independent. The images are segmented interactively and then initial correspondence is established using a shape-tracking approach. A dense motion field is then estimated using a transversely isotropic, linear-elastic model, which accounts for the muscle fiber directions in the left ventricle. The dense motion field is in turn used to calculate the deformation of the heart wall in terms of strain in cardiac specific directions. The strains obtained using this approach in open-chest dogs before and after coronary occlusion, exhibit a high correlation with strains produced in the same animals using implanted markers. Further, they show good agreement with previously published results in the literature. 
This proposed method provides quantitative regional 3-D estimates of heart deformation", "keyphrases": ["3-D left ventricular deformation estimation", "medical diagnostic imaging", "biomechanical models", "regional cardiac deformation", "quantitative estimation", "transversely isotropic linear-elastic model", "cardiac specific directions", "open-chest dogs", "muscle fiber directions", "generic methodology", "interactively segmented images", "3-D image sequences", "nonrigid motion estimation", "magnetic resonance imaging", "left ventricular motion estimation"]} -{"id": "1779", "title": "Maybe it's not too late to join the circus: books for midlife career management", "abstract": "Midcareer librarians looking for career management help on the bookshelf face thousands of choices. This article reviews thirteen popular career self-help books. The reviewed books cover various aspects of career management and provide information on which might be best suited for particular goals, including career change, career tune-up, and personal and professional self-evaluation. The comments reflect issues of interest to midcareer professionals", "keyphrases": ["midlife career management", "librarians", "career self-help books", "career change", "professional self-evaluation", "personal self-evaluation", "libraries"]} -{"id": "1817", "title": "Nonlinear adaptive control via sliding-mode state and perturbation observer", "abstract": "The paper presents a nonlinear adaptive controller (NAC) for single-input single-output feedback linearisable nonlinear systems. A sliding-mode state and perturbation observer is designed to estimate the system states and perturbation which includes the combined effect of system nonlinearities, uncertainties and external disturbances. The NAC design does not require the details of the nonlinear system model and full system states. 
It possesses an adaptation capability to deal with system parameter uncertainties, unmodelled system dynamics and external disturbances. The convergence of the observer and the stability analysis of the controller/observer system are given. The proposed control scheme is applied for control of a synchronous generator, in comparison with a state-feedback linearising controller (FLC). Simulation study is carried out based on a single-generator infinite-bus power system to show the performance of the controller/observer system", "keyphrases": ["nonlinear adaptive control", "sliding-mode state observer", "perturbation observer", "NAC", "SISO feedback linearisable nonlinear systems", "parameter uncertainties", "unmodelled system dynamics", "external disturbances", "convergence", "synchronous generator control", "state-feedback linearising controller", "FLC", "single-generator infinite-bus power system"]} -{"id": "1485", "title": "Telemedicine in the management of a cervical dislocation by a mobile neurosurgeon", "abstract": "Neurosurgical teams, who are normally located in specialist centres, frequently use teleradiology to make a decision about the transfer of a patient to the nearest neurosurgical department. This decision depends on the type of pathology, the clinical status of the patient and the prognosis. If the transfer of the patient is not possible, for example because of an unstable clinical status, a mobile neurosurgical team may be used. We report a case which was dealt with in a remote French military airborne surgical unit, in the Republic of Chad. The unit, which provides health-care to the French military personnel stationed there, also provides free medical care for the local population. It conducts about 100 operations each month. The unit comprises two surgeons (an orthopaedic and a general surgeon), one anaesthetist, two anaesthetic nurses, one operating room nurse, two nurses, three paramedics and a secretary. 
The civilian patient presented with unstable cervical trauma. A mobile neurosurgeon operated on her, and used telemedicine before, during and after surgery", "keyphrases": ["cervical dislocation management", "mobile neurosurgeon", "teleradiology", "telemedicine", "remote French military airborne surgical unit", "Republic of Chad", "health care", "French military personnel", "civilian patient", "unstable cervical trauma", "surgery"]} -{"id": "1852", "title": "The design and performance evaluation of alternative XML storage strategies", "abstract": "This paper studies five strategies for storing XML documents including one that leaves documents in the file system, three that use a relational database system, and one that uses an object manager. We implement and evaluate each approach using a number of XQuery queries. A number of interesting insights are gained from these experiments and a summary of the advantages and disadvantages of the approaches is presented", "keyphrases": ["XML document storage", "file system", "relational database system", "object manager", "performance evaluation", "XQuery queries"]} -{"id": "1478", "title": "The effects of work pace on within-participant and between-participant keying force, electromyography, and fatigue", "abstract": "A laboratory study was conducted to determine the effects of work pace on typing force, electromyographic (EMG) activity, and subjective discomfort. We found that as participants typed faster, their typing force and finger flexor and extensor EMG activity increased linearly. There was also an increase in subjective discomfort, with a sharp threshold between participants' self-selected pace and their maximum typing speed. The results suggest that participants self-select a typing pace that maximizes typing speed and minimizes discomfort. 
The fastest typists did not produce significantly more finger flexor EMG activity but did produce proportionately less finger extensor EMG activity compared with the slower typists. We hypothesize that fast typists may use different muscle recruitment patterns that allow them to be more efficient than slower typists at striking the keys. In addition, faster typists do not experience more discomfort than slow typists. These findings show that the relative pace of typing is more important than actual typing speed with regard to discomfort and muscle activity. These results suggest that typists may benefit from skill training to increase maximum typing speed. Potential applications of this research include skill training for typists", "keyphrases": ["work pace effect", "EMG activity", "subjective discomfort", "finger flexor", "typing speed", "discomfort", "typists", "muscle recruitment patterns", "keying force", "skill training"]} -{"id": "1784", "title": "CyberEthics bibliography 2002: a select list of recent works", "abstract": "Included in the 2002 annual bibliography update is a select list of recent books and conference proceedings that have been published since 2000. Also included is a select list of special issues of journals and periodicals that were recently published. For additional lists of recently published books and articles, see ibid. (June 2000, June 2001)", "keyphrases": ["CyberEthics bibliography", "2002 annual bibliography", "recent books", "conference proceedings", "special issues", "journals", "periodicals"]} -{"id": "175", "title": "Diagnostic expert system using non-monotonic reasoning", "abstract": "The objective of this work is to develop an expert system for cucumber disorder diagnosis using non-monotonic reasoning to handle the situation when the system cannot reach a conclusion. One reason for this situation is when the information is incomplete. Another reason is when the domain knowledge itself is incomplete. 
Another reason is when the information is inconsistent. This method maintains the truth of the system in case of changing a piece of information. The proposed method uses two types of non-monotonic reasoning namely: default reasoning and reasoning in the presence of inconsistent information to achieve its goal", "keyphrases": ["diagnostic expert system", "nonmonotonic reasoning", "cucumber disorder diagnosis", "incomplete information", "inconsistent information", "truth maintenance", "default reasoning", "agriculture"]} -{"id": "1520", "title": "Uniform hyperbolic polynomial B-spline curves", "abstract": "This paper presents a new kind of uniform splines, called hyperbolic polynomial B-splines, generated over the space Omega =span{sinh t, cosh t, t/sup k-3/, t/sup k-3/, t/sup k-4/, ..., t 1} in which k is an arbitrary integer larger than or equal to 3. Hyperbolic polynomial B-splines share most of the properties of B-splines in polynomial space. We give subdivision formulae for this new kind of curve and then prove that they have variation diminishing properties and the control polygons of the subdivisions converge. Hyperbolic polynomial B-splines can handle freeform curves as well as remarkable curves such as the hyperbola and the catenary. The generation of tensor product surfaces using these new splines is straightforward. Examples of such tensor product surfaces: the saddle surface, the catenary cylinder, and a certain kind of ruled surface are given", "keyphrases": ["uniform hyperbolic polynomial B-spline curves", "arbitrary integer", "subdivision formulae", "control polygons", "subdivisions", "freeform curves", "hyperbola", "catenary", "tensor product surface generation", "saddle surface", "catenary cylinder", "ruled surface"]} -{"id": "1565", "title": "On lag windows connected with Jacobi polynomials", "abstract": "Lag windows whose corresponding spectral windows are Jacobi polynomials or sums of Jacobi polynomials are introduced. 
The bias and variance of their spectral density estimators are investigated and their window bandwidth and characteristic exponent are determined", "keyphrases": ["lag windows", "Jacobi polynomials", "spectral windows", "spectral density estimators", "window bandwidth", "characteristic exponent"]} -{"id": "1699", "title": "Time-domain reconstruction for thermoacoustic tomography in a spherical geometry", "abstract": "Reconstruction-based microwave-induced thermoacoustic tomography in a spherical configuration is presented. Thermoacoustic waves from biological tissue samples excited by microwave pulses are measured by a wide-band unfocused ultrasonic transducer, which is set on a spherical surface enclosing the sample. Sufficient data are acquired from different directions to reconstruct the microwave absorption distribution. An exact reconstruction solution is derived and approximated to a modified backprojection algorithm. Experiments demonstrate that the reconstructed images agree well with the original samples. The spatial resolution of the system reaches 0.5 mm", "keyphrases": ["medical diagnostic imaging", "thermoacoustic tomography", "time-domain reconstruction", "modified backprojection algorithm", "exact reconstruction solution", "biological tissue samples", "wide-band unfocused ultrasonic transducer", "spherical surface enclosing sample", "reconstructed images", "system spatial resolution", "spherical geometry", "0.5 mm"]} -{"id": "1621", "title": "Current-mode fully-programmable piece-wise-linear block for neuro-fuzzy applications", "abstract": "A new method to implement an arbitrary piece-wise-linear characteristic in current mode is presented. Each of the breaking points and each slope is separately controllable. As an example a block that implements an N-shaped piece-wise-linearity has been designed. The N-shaped block operates in the subthreshold region and uses only ten transistors. 
These characteristics make it especially suitable for large arrays of neuro-fuzzy systems where the number of transistors and power consumption per cell is an important concern. A prototype of this block has been fabricated in a 0.35 mu m CMOS technology. The functionality and programmability of this circuit have been verified through experimental results", "keyphrases": ["arbitrary piece-wise-linear characteristic", "current mode", "breaking points", "separately controllable", "N-shaped piece-wise-linearity", "VLSI", "subthreshold region", "neuro-fuzzy systems", "power consumption", "CMOS", "0.35 micron"]}
-{"id": "1664", "title": "Disappointment reigns [retail IT]", "abstract": "CPFR remains at the forefront of CIOs' minds, but a number of barriers, such as secretive corporate cultures and spotty data integrity, stand between retail organizations and true supply-chain collaboration. CIOs remain vexed at these obstacles, as was evidenced at a roundtable discussion by retail and consumer-goods IT leaders at the Retail Systems 2002 conference, held in Chicago by the consultancy MoonWatch Media Inc., Newton Upper Falls, Mass. Other annoyances discussed by retail CIOs include poorly designed business processes and retail's poor image with the IT talent emerging from school into the job market", "keyphrases": ["retail", "MoonWatch Media", "Retail Systems 2002 conference", "CIOs", "collaborative planning forecasting and replenishment"]}
-{"id": "1598", "title": "A decision support model for selecting product/service benefit positionings", "abstract": "The art (and science) of successful product/service positioning generally hinges on the firm's ability to select a set of attractively priced consumer benefits that are: valued by the buyer, distinctive in one or more respects, believable, deliverable, and sustainable (under actual or potential competitive abilities to imitate, neutralize, or overcome) in the target markets that the firm selects. 
For many years, the ubiquitous quadrant chart has been used to provide a simple graph of product/service benefits (usually called product/service attributes) described in terms of consumers' perceptions of the importance of attributes (to brand/supplier choice) and the performance of competing firms on these attributes. This paper describes a model that extends the quadrant chart concept to a decision support system that optimizes a firm's market share for a specified product/service. In particular, we describe a decision support model that utilizes relatively simple marketing research data on consumers' judged benefit importances, and supplier performances on these benefits to develop message components for specified target buyers. A case study is used to illustrate the model. The study deals with developing advertising message components for a relatively new entrant in the US air shipping market. We also discuss, more briefly, management reactions to application of the model to date, and areas for further research and model extension", "keyphrases": ["product/service benefit positionings", "decision support model", "attractively priced consumer benefits", "quadrant chart", "simple graph", "product/service attributes", "brand/supplier choice", "market share optimization", "marketing research data", "consumer judged benefit importances", "message components", "advertising message components", "US air shipping market", "management reactions", "advertising", "greedy heuristic", "optimal message design"]} -{"id": "188", "title": "Sampled-data implementation of a gain scheduled controller", "abstract": "A continuous-time gain-scheduled controller must be transformed to a corresponding discrete-time controller for sampled-data implementation. We show that certain linearization properties of a continuous-time gain scheduled controller are inherited by its sampled-data implementation. 
We also show that a similar relationship exists for multi-rate gain scheduled controllers arising in flight control applications", "keyphrases": ["gain scheduled controller", "sampled-data implementation", "continuous-time gain-scheduled controller", "discrete-time controller", "linearization properties", "multi-rate gain scheduled controllers", "flight control applications"]} -{"id": "1786", "title": "A humane tool for aiding computer science advisors, computer science students, and parents", "abstract": "Over the past few years, the computer science department faculty at Baylor has observed that some students who perform adequately during the freshman and sophomore years have substantial difficulty during the junior and senior years of study. Baylor University is an institution committed to being caring of its students. The objective for this study grew out of these two realities. There are three objectives of this research. One objective is to identify students, no later than the sophomore year, who are less likely to succeed as computer science majors. A second objective is to accomplish this identification by using data from seniors majoring in computer science. A third objective is to begin to use this information at the end of their sophomore year when meeting with a computer science faculty advisor. A regression study is conducted on the data from all students classified as seniors, majoring in computer science in May 2001, showing grades in six freshman and sophomore courses, and showing grades for at least five junior or senior level computer science courses. These students and their course performance data constituted the study sample", "keyphrases": ["humane tool", "computer science advisors", "computer science students", "parents", "Baylor University", "student care", "sophomore year", "computer science majors", "regression study", "course performance data"]} -{"id": "1815", "title": "Control of integral processes with dead-time. 1. 
Disturbance observer-based 2 DOF control scheme", "abstract": "A disturbance observer-based control scheme (a version of 2 DOF internal model control) which is very effective in controlling integral processes with dead time is presented. The controller can be designed to reject ramp disturbances as well as step disturbances and even arbitrary disturbances. When the plant model is available only two parameters are left to tune. One is the time constant of the set-point response and the other is the time constant of the disturbance response. The latter is tuned according to the compromise between disturbance response and robustness. This control scheme has a simple, clear, easy-to-design, easy-to-implement structure and good performance. It is compared to the best results (so far) using some simulation examples", "keyphrases": ["integral processes", "dead-time", "disturbance observer-based 2 DOF control scheme", "2 DOF internal model control", "ramp disturbances rejection", "set-point response", "time constant", "disturbance response", "robustness"]} -{"id": "1487", "title": "Assessment of prehospital chest pain using telecardiology", "abstract": "Two hundred general practitioners were equipped with a portable electrocardiograph which could transmit a 12-lead electrocardiogram (ECG) via a telephone line. A cardiologist was available 24 h a day for an interactive teleconsultation. In a 13 month period there were 5073 calls to the telecardiology service and 952 subjects with chest pain were identified. The telecardiology service allowed the general practitioners to manage 700 cases (74%) themselves; further diagnostic tests were requested for 162 patients (17%) and 83 patients (9%) were sent to the hospital emergency department. In the last group a cardiological diagnosis was confirmed in 60 patients and refuted in 23. Seven patients in whom the telecardiology service failed to detect a cardiac problem were hospitalized in the subsequent 48 h. 
The telecardiology service showed a sensitivity of 97.4%, a specificity of 89.5% and a diagnostic accuracy of 86.9% for chest pain. Telemedicine could be a useful tool in the diagnosis of chest pain in primary care", "keyphrases": ["prehospital chest pain assessment", "telecardiology", "general practitioners", "portable electrocardiograph", "electrocardiogram transmission", "telephone line", "interactive teleconsultation", "patients", "diagnostic tests", "hospital emergency department", "sensitivity", "specificity", "diagnostic accuracy", "primary care", "13 month"]} -{"id": "1850", "title": "The n-tier hub technology", "abstract": "During 2001, the Enterprise Engineering Laboratory at George Mason University was contracted by the Boeing Company to develop an eHub capability for aerospace suppliers in Taiwan. In a laboratory environment, the core technology was designed, developed, and tested, and now a large first-tier aerospace supplier in Taiwan is commercializing the technology. The project objective was to provide layered network and application services for transporting XML-based business transaction flows across multi-tier, heterogeneous data processing environments. This paper documents the business scenario, the eHub application, and the network transport mechanisms that were used to build the n-tier hub. In contrast to most eHubs, this solution takes the point of view of suppliers, pushing data in accordance with supplier requirements; hence, enhancing the probability of supplier adoption. 
The unique contribution of this project is the development of an eHub that meets the needs of small and medium enterprises (SMEs) and first-tier suppliers", "keyphrases": ["n-tier hub technology", "aerospace suppliers", "Boeing Company", "Taiwan", "XML-based business transaction flows", "multi-tier heterogeneous data processing environments", "business scenario", "network transport mechanisms", "supplier adoption", "small and medium enterprises", "first-tier suppliers"]} -{"id": "1908", "title": "Explicit solutions for transcendental equations", "abstract": "A simple method to formulate an explicit expression for the roots of any analytic transcendental function is presented. The method is based on Cauchy's integral theorem and uses only basic concepts of complex integration. A convenient method for numerically evaluating the exact expression is presented. The application of both the formulation and evaluation of the exact expression is illustrated for several classical root finding problems", "keyphrases": ["analytic functions", "transcendental equations", "Cauchy integral theorem", "complex integration", "root finding", "singularity", "polynomial", "Fourier transform"]} -{"id": "1623", "title": "Transmission of real-time video over IP differentiated services", "abstract": "Multimedia applications require high bandwidth and guaranteed quality of service (QoS). The current Internet, which provides 'best effort' services, cannot meet the stringent QoS requirements for delivering MPEG videos. It is proposed that MPEG frames are transported through various service models of DiffServ. 
Performance analysis and simulation results show that the proposed approach can not only guarantee QoS but can also achieve high bandwidth utilisation", "keyphrases": ["IP differentiated services", "real-time video transmission", "multimedia applications", "quality of service", "QoS guarantees", "Internet", "MPEG video", "DiffServ", "high bandwidth utilisation"]} -{"id": "1666", "title": "Airline base schedule optimisation by flight network annealing", "abstract": "A system for rigorous airline base schedule optimisation is described. The architecture of the system reflects the underlying problem structure. The architecture is hierarchical consisting of a master problem for logical aircraft schedule optimisation and a sub-problem for schedule evaluation. The sub-problem is made up of a number of component sub-problems including connection generation, passenger choice modelling, passenger traffic allocation by simulation and revenue and cost determination. Schedule optimisation is carried out by means of simulated annealing of flight networks. The operators for the simulated annealing process are feasibility preserving and form a complete set of operators", "keyphrases": ["airline base schedule optimisation", "flight network annealing", "system architecture", "hierarchical architecture", "master problem", "logical aircraft schedule optimisation", "schedule evaluation", "connection generation", "passenger choice modelling", "passenger traffic allocation", "cost determination", "simulated annealing", "operators", "time complexity"]} -{"id": "177", "title": "Turning telecommunications call details to churn prediction: a data mining approach", "abstract": "As deregulation, new technologies, and new competitors open up the mobile telecommunications industry, churn prediction and management has become of great concern to mobile service providers. 
A mobile service provider wishing to retain its subscribers needs to be able to predict which of them may be at risk of changing services and will make those subscribers the focus of customer retention efforts. In response to the limitations of existing churn-prediction systems and the unavailability of customer demographics in the mobile telecommunications provider investigated, we propose, design, and experimentally evaluate a churn-prediction technique that predicts churning from subscriber contractual information and call pattern changes extracted from call details. This proposed technique is capable of identifying potential churners at the contract level for a specific prediction time-period. In addition, the proposed technique incorporates the multi-classifier class-combiner approach to address the challenge of a highly skewed class distribution between churners and non-churners. The empirical evaluation results suggest that the proposed call-behavior-based churn-prediction technique exhibits satisfactory predictive effectiveness when more recent call details are employed for the churn prediction model construction. Furthermore, the proposed technique is able to demonstrate satisfactory or reasonable predictive power within the one-month interval between model construction and churn prediction. 
Using a previous demographics-based churn-prediction system as a reference, the lift factors attained by our proposed technique appear largely satisfactory", "keyphrases": ["telecommunications call details", "mobile telecommunications industry", "mobile service providers", "deregulation", "customer retention efforts", "customer demographics", "subscriber contractual information", "call pattern changes", "multi-classifier class-combiner approach", "skewed class distribution", "lift factors", "decision tree induction"]} -{"id": "1522", "title": "Waltzing through Port 80 [Web security]", "abstract": "Web services follow the trusting model of the Internet, but allow ever more powerful payloads to travel between businesses and consumers. Before you leap online, the author advises to scan the security concerns and the available fixes. He looks at how we define and store Web services and incorporate them into business processes", "keyphrases": ["Web services", "Internet", "trust", "data security", "business processes"]} -{"id": "1567", "title": "Asymptotic expansions for the zeros of certain special functions", "abstract": "We derive asymptotic expansions for the zeros of the cosine-integral Ci(x) and the Struve function H/sub 0/(x), and extend the available formulae for the zeros of Kelvin functions. Numerical evidence is provided to illustrate the accuracy of the expansions", "keyphrases": ["asymptotic expansions", "zeros", "cosine-integral", "Struve function", "Kelvin functions", "accuracy"]} -{"id": "18", "title": "Differential and integral calculus on discrete time series data", "abstract": "It has been found that discontinuity plays a crucial role in natural evolutions (Lin 1998). 
In this presentation, we will generalize the idea of integration and differentiation, which we developed in calculus, to the study of time series in the hope that the problem of outliers and discontinuities can be resolved more successfully than simply deleting the outliers and avoiding discontinuities from the overall data analysis. In general, appearances of outliers tend to mean existence of discontinuities, explosive growth or decline in the evolution. At the same time, our approach can be employed to partially overcome the problem of not having enough data values in any available time series. At the end, we will look at some real-life problems of prediction in order to see the power of this new approach", "keyphrases": ["natural evolutions", "integration", "differentiation", "time series", "outliers", "prediction"]}
-{"id": "1828", "title": "Exploiting structure in quantified formulas", "abstract": "We study the computational problem \"find the value of the quantified formula obtained by quantifying the variables in a sum of terms.\" The \"sum\" can be based on any commutative monoid, the \"quantifiers\" need only satisfy two simple conditions, and the variables can have any finite domain. This problem is a generalization of the problem \"given a sum-of-products of terms, find the value of the sum\" studied by R.E. Stearns and H.B. Hunt III (1996). A data structure called a \"structure tree\" is defined which displays information about \"subproblems\" that can be solved independently during the process of evaluating the formula. Some formulas have \"good\" structure trees which enable certain generic algorithms to evaluate the formulas in significantly less time than by brute force evaluation. By \"generic algorithm,\" we mean an algorithm constructed from uninterpreted function symbols, quantifier symbols, and monoid operations. The algebraic nature of the model facilitates a formal treatment of \"local reductions\" based on the \"local replacement\" of terms. 
Such local reductions \"preserve formula structure\" in the sense that structure trees with nice properties transform into structure trees with similar properties. These local reductions can also be used to transform hierarchically specified problems with useful structure into hierarchically specified problems having similar structure", "keyphrases": ["quantified formulas", "structure exploitation", "commutative monoid", "data structure", "structure tree", "satisfiability problems", "constraint satisfaction problems", "dynamic programming", "computational complexity", "generic algorithms", "function symbols", "quantifier symbols", "monoid operations", "hierarchically specified problems"]}
-{"id": "1746", "title": "The exact solution of coupled thermoelectroelastic behavior of piezoelectric laminates", "abstract": "Exact solutions for static analysis of thermoelectroelastic laminated plates are presented. In this analysis, a new concise procedure for the analytical solution of composite laminated plates with piezoelectric layers is developed. A simple eigenvalue formula in real number form is directly developed from the basic coupled piezoelectric differential equations and the difficulty of treating imaginary eigenvalues is avoided. The solution is defined in the trigonometric series and can be applied to thin and thick plates. Numerical studies are conducted on a five-layer piezoelectric plate and the complexity of stresses and deformations under combined loading is illustrated. 
The results could be used as a benchmark for assessing any numerical solution by approximate approaches such as the finite element method while also providing useful physical insight into the behavior of piezoelectric plates in a thermal environment", "keyphrases": ["exact solution", "coupled thermoelectroelastic behavior", "piezoelectric laminates", "thermoelectroelastic laminated plates", "analytical solution", "composite laminated plates", "piezoelectric layers", "eigenvalue formula", "real number form", "coupled piezoelectric differential equations", "trigonometric series", "thin plates", "thick plates", "five-layer piezoelectric plate", "numerical study", "stresses", "deformations", "combined loading", "finite element method", "thermal environment"]} -{"id": "1703", "title": "Statistical analysis of nonlinearly reconstructed near-infrared tomographic images. II. Experimental interpretation", "abstract": "For pt. I see ibid., vol. 21, no. 7, p. 755-63 (2002). Image error analysis of a diffuse near-infrared tomography (NIR) system has been carried out on simulated data using a statistical approach described in pt. I of this paper (Pogue et al., 2002). The methodology is used here with experimental data acquired on phantoms with a prototype imaging system intended for characterizing breast tissue. Results show that imaging performance is not limited by random measurement error, but rather by calibration issues. The image error over the entire field of view is generally not minimized when an accurate homogeneous estimate of the phantom properties is available; however, local image error over a target region of interest (ROI) is reduced. The image reconstruction process which includes a Levenberg-Marquardt style regularization provides good minimization of the objective function, yet its reduction is not always correlated with an overall image error decrease. 
Minimization of the bias in an ROI which contains localized changes in the optical properties can be achieved through five to nine iterations of the algorithm. Precalibration of the algorithm through statistical evaluation of phantom studies may provide a better measure of the image accuracy than that implied by minimization of the standard objective function", "keyphrases": ["medical diagnostic imaging", "nonlinearly reconstructed near-infrared tomographic images", "image error", "algorithm precalibration", "hemoglobin", "random measurement error", "target region of interest", "accurate homogeneous estimate", "phantom properties", "Levenberg-Marquardt style regularization", "bias minimization", "algorithm iterations", "objective function minimization"]} -{"id": "1890", "title": "Robustness of trajectories with finite time extent", "abstract": "The problem of estimating perturbation bounds of finite trajectories is considered. The trajectory is assumed to be generated by a linear system with uncertainty characterized in terms of integral quadratic constraints. It is shown that such perturbation bounds can be obtained as the solution to a nonconvex quadratic optimization problem, which can be addressed using Lagrange relaxation. The result can be used in robustness analysis of hybrid systems and switched dynamical systems", "keyphrases": ["trajectories robustness", "finite time extent", "perturbation bounds", "linear system", "uncertainty", "integral quadratic constraints", "nonconvex quadratic optimization problem", "Lagrange relaxation", "robustness analysis", "hybrid systems", "switched dynamical systems"]} -{"id": "1911", "title": "Pulmonary perfusion patterns and pulmonary arterial pressure", "abstract": "Uses artificial intelligence methods to determine whether quantitative parameters describing the perfusion image can be synthesized to make a reasonable estimate of the pulmonary arterial (PA) pressure measured at angiography. 
Radionuclide perfusion images were obtained in 120 patients with normal chest radiographs who also underwent angiographic PA pressure measurement within 3 days of the radionuclide study. An artificial neural network (ANN) was constructed from several image parameters describing statistical and boundary characteristics of the perfusion images. With use of a leave-one-out cross-validation technique, this method was used to predict the PA systolic pressure in cases on which the ANN had not been trained. A Pearson correlation coefficient was determined between the predicted and measured PA systolic pressures. ANN predictions correlated with measured pulmonary systolic pressures (r=0.846, P<.001). The accuracy of the predictions was not influenced by the presence of pulmonary embolism. None of the 51 patients with predicted PA pressures of less than 29 mm Hg had pulmonary hypertension at angiography. All 13 patients with predicted PA pressures greater than 48 mm Hg had pulmonary hypertension at angiography. Meaningful information regarding PA pressure can be derived from noninvasive radionuclide perfusion scanning. 
The use of image analysis in concert with artificial intelligence methods helps to reveal physiologic information not readily apparent at visual image inspection", "keyphrases": ["pulmonary perfusion patterns", "angiographic pulmonary arterial pressure measurement", "artificial neural network predictions", "accuracy", "pulmonary embolism", "pulmonary hypertension", "noninvasive radionuclide perfusion scanning", "image analysis", "physiologic information", "visual image inspection", "image parameters", "statistical characteristics", "boundary characteristics", "leave-one-out cross-validation technique", "pulmonary arterial systolic pressure", "Pearson correlation coefficient", "artificial intelligence methods", "quantitative parameters", "perfusion image", "angiography", "radionuclide perfusion images", "patients", "normal chest radiographs", "29 Pa", "48 Pa"]} -{"id": "1583", "title": "Cutting through the confusion [workflow & content management]", "abstract": "Information management vendors are rushing to re-position themselves and put a portal spin on their products, says ITNET's Graham Urquhart. The result is confusion, with a range of different definitions and claims clouding the true picture", "keyphrases": ["ITNET", "portals", "collaboratively", "workflow"]} -{"id": "1682", "title": "Data mining efforts increase business productivity and efficiency", "abstract": "The use and acquisition of information is a key part of the way any business makes money. Data mining technologies provide greater insight into how this information can be better used and more effectively acquired. Steven Kudyba, an expert in the field of data mining technologies, shares his expertise in an interview", "keyphrases": ["data mining", "productivity", "efficiency"]} -{"id": "1463", "title": "Computational complexity of probabilistic disambiguation", "abstract": "Recent models of natural language processing employ statistical reasoning for dealing with the ambiguity of formal grammars. 
In this approach, statistics, concerning the various linguistic phenomena of interest, are gathered from actual linguistic data and used to estimate the probabilities of the various entities that are generated by a given grammar, e.g., derivations, parse-trees and sentences. The extension of grammars with probabilities makes it possible to state ambiguity resolution as a constrained optimization formula, which aims at maximizing the probability of some entity that the grammar generates given the input (e.g., maximum probability parse-tree given some input sentence). The implementation of these optimization formulae in efficient algorithms, however, does not always proceed smoothly. In this paper, we address the computational complexity of ambiguity resolution under various kinds of probabilistic models. We provide proofs that some frequently occurring problems of ambiguity resolution are NP-complete. These problems are encountered in various applications, e.g., language understanding for text- and speech-based applications. Assuming the common model of computation, this result implies that, for many existing probabilistic models it is not possible to devise tractable algorithms for solving these optimization problems", "keyphrases": ["natural language processing", "statistical reasoning", "formal grammars", "statistics", "computational complexity", "probabilistic disambiguation", "NP-completeness results", "parsing problems", "speech processing", "state ambiguity resolution", "constrained optimization formula", "probabilistic models", "language understanding"]}
-{"id": "1762", "title": "Laguerre pseudospectral method for nonlinear partial differential equations", "abstract": "The Laguerre Gauss-Radau interpolation is investigated. Some approximation results are obtained. As an example, the Laguerre pseudospectral scheme is constructed for the BBM equation. The stability and the convergence of the proposed scheme are proved. 
The numerical results show the high accuracy of this approach", "keyphrases": ["Laguerre pseudospectral method", "nonlinear partial differential equations", "Laguerre Gauss-Radau interpolation", "approximation results", "BBM equation", "stability", "numerical results", "nonlinear differential equations"]} -{"id": "1727", "title": "Linguistic knowledge and new technologies", "abstract": "Modern language studies are characterized by a variety of forms, ways, and methods of their development. In this connection, it is necessary to specify the problem of the development of their internal differentiation and classification, which lead to the formation of specific areas knowledge. An example of such an area is speechology-a field of science belonging to fundamental, theoretical, and applied linguistics", "keyphrases": ["modern language studies", "internal differentiation", "internal classification", "speechology", "applied linguistics", "theoretical linguistics", "fundamental linguistics", "linguistic knowledge"]} -{"id": "1849", "title": "An active functionality service for e-business applications", "abstract": "Service based architectures are a powerful approach to meet the fast evolution of business rules and the corresponding software. An active functionality service that detects events and involves the appropriate business rules is a critical component of such a service-based middleware architecture. In this paper we present an active functionality service that is capable of detecting events in heterogeneous environments, it uses an integral ontology-based approach for the semantic interpretation of heterogeneous events and data, and provides notifications through a publish/subscribe notification mechanism. 
The power of this approach is illustrated with the help of an auction application and through the personalization of car and driver portals in Internet-enabled vehicles", "keyphrases": ["active functionality service", "e-business applications", "business rules", "software", "event detection", "service-based middleware architecture", "heterogeneous environments", "ontology based approach", "semantic interpretation", "publish/subscribe notification mechanism", "auction application", "personalized car portals", "personalized driver portals", "Internet-enabled vehicles"]} -{"id": "1831", "title": "Fast broadcasting and gossiping in radio networks", "abstract": "We establish an O(n log/sup 2/ n) upper bound on the time for deterministic distributed broadcasting in multi-hop radio networks with unknown topology. This nearly matches the known lower bound of Omega (n log n). The fastest previously known algorithm for this problem works in time O(n/sup 3/2/). Using our broadcasting algorithm, we develop an O(n/sup 3/2/ log/sup 2/ n) algorithm for gossiping in the same network model", "keyphrases": ["fast broadcasting", "upper bound", "deterministic distributed broadcasting", "gossiping", "radio networks"]} -{"id": "1874", "title": "E - a brainiac theorem prover", "abstract": "We describe the superposition-based theorem prover E. E is a sound and complete prover for clausal first order logic with equality. Important properties of the prover include strong redundancy elimination criteria, the DISCOUNT loop proof procedure, a very flexible interface for specifying search control heuristics, and an efficient inference engine. 
We also discuss the strengths and weaknesses of the system", "keyphrases": ["brainiac theorem prover", "CASC", "superposition-based theorem prover", "E automatic theorem prover", "rewriting", "completeness", "soundness", "clausal first order logic", "equality", "strong redundancy elimination criteria", "DISCOUNT", "CADE ATP System Competitions", "loop proof procedure", "search control heuristics", "inference engine"]} -{"id": "1889", "title": "Sliding mode dynamics in continuous feedback control for distributed discrete-event scheduling", "abstract": "A continuous feedback control approach for real-time scheduling of discrete events is presented motivated by the need for control theoretic techniques to analyze and design such systems in distributed manufacturing applications. These continuous feedback control systems exhibit highly nonlinear and discontinuous dynamics. Specifically, when the production demand in the manufacturing system exceeds the available resource capacity then the control system \"chatters\" and exhibits sliding modes. This sliding mode behavior is advantageously used in the scheduling application by allowing the system to visit different schedules within an infinitesimal region near the sliding surface. In the paper, an analytical model is developed to characterize the sliding mode dynamics. This model is then used to design controllers in the sliding mode domain to improve the effectiveness of the control system to \"search\" for schedules with good performance. 
Computational results indicate that the continuous feedback control approach can provide near-optimal schedules and that it is computationally efficient compared to existing scheduling techniques", "keyphrases": ["sliding mode dynamics", "continuous feedback control", "distributed discrete-event scheduling", "real-time scheduling", "control theoretic techniques", "distributed manufacturing applications", "highly nonlinear discontinuous dynamics", "production demand", "resource capacity"]} -{"id": "153", "title": "On the relationship between omega -automata and temporal logic normal forms", "abstract": "We consider the relationship between omega -automata and a specific logical formulation based on a normal form for temporal logic formulae. While this normal form was developed for use with execution and clausal resolution in temporal logics, we show how it can represent, syntactically, omega -automata in a high-level way. Technical proofs of the correctness of this representation are given", "keyphrases": ["omega -automata", "temporal logic normal forms", "logical formulation", "clausal resolution", "program correctness"]} -{"id": "1506", "title": "Intelligent control of life support for space missions", "abstract": "Future manned space operations will include a greater use of automation than we currently see. For example, semiautonomous robots and software agents will perform difficult tasks while operating unattended most of the time. As these automated agents become more prevalent, human contact with them will occur more often and become more routine, so designing these automated agents according to the principles of human-centered computing is important. We describe two cases of semiautonomous control software developed and fielded in test environments at the NASA Johnson Space Center. 
This software operated continuously at the JSC and interacted closely with humans for months at a time", "keyphrases": ["life support", "software agents", "semiautonomous robots", "space missions", "intelligent control", "manned space operations", "automation", "automated agents", "semiautonomous control software", "NASA Johnson Space Center", "crew air regeneration", "crew water recovery", "human intervention"]} -{"id": "1543", "title": "RISCy business. Part 1: RISC projects by Cornell students", "abstract": "The author looks at several projects that Cornell University students entered in the Atmel Design 2001 contest. Those covered include a vertical plotter; BiLines, an electronic game; a wireless Internet pager; Cooking Coach; Barbie's zip drive; and a model train controller", "keyphrases": ["Atmel's Design Logic 2001 contest", "RISC projects", "Cornell students", "vertical plotter", "BiLines", "electronic game", "wireless Internet pager", "Cooking Coach", "Barbie's zip drive", "model train controller"]} -{"id": "1607", "title": "A solvable queueing network model for railway networks and its validation and applications for the Netherlands", "abstract": "The performance of new railway networks cannot be measured or simulated, as no detailed train schedules are available. Railway infrastructure and capacities are to be determined long before the actual traffic is known. This paper therefore proposes a solvable queueing network model to compute performance measures of interest without requiring train schedules (timetables). Closed form expressions for mean delays are obtained. New network designs, traffic scenarios, and capacity expansions can so be evaluated. A comparison with real delay data for the Netherlands supports the practical value of the model. 
A special Dutch cargo-line application is included", "keyphrases": ["railway networks", "solvable queueing network model", "Netherlands", "railway infrastructure", "railway capacities", "performance measures", "closed form expressions", "mean delays", "network designs", "traffic scenarios", "capacity expansions", "Dutch cargo-line application"]} -{"id": "1642", "title": "Development and validation of user-adaptive navigation and information retrieval tools for an intranet portal organizational memory information system", "abstract": "Based on previous research and properties of organizational memory, a conceptual model for navigation and retrieval functions in an intranet portal organizational memory information system was proposed, and two human-centred features (memory structure map and history-based tool) were developed to support user's navigation and retrieval in a well-known organizational memory. To test two hypotheses concerning the validity of the conceptual model and two human-centred features, an experiment was conducted with 30 subjects. Testing of the two hypotheses indicated the following: (1) the memory structure map's users showed 29% better performance in navigation, and (2) the history-based tool's users outperformed by 34% in identifying information. 
The results of the study suggest that a conceptual model and two human-centred features could be used in a user-adaptive interface design to improve user's performance in an intranet portal organizational memory information system", "keyphrases": ["user-adaptive navigation", "information retrieval tools", "intranet portal", "organizational memory information system", "conceptual model", "human factors", "memory structure map", "history-based tool", "experiment", "user-adaptive interface design", "user performance"]} -{"id": "1766", "title": "A note on vector cascade algorithm", "abstract": "The focus of this paper is on the relationship between accuracy of multivariate refinable vector and vector cascade algorithm. We show that, if the vector cascade algorithm (1.5) with isotropic dilation converges to a vector-valued function with regularity, then the initial function must satisfy the Strang-Fix conditions", "keyphrases": ["vector cascade algorithm", "multivariate refinable vector", "matrix algebra", "isotropic dilation", "vector-valued function", "Strang-fix conditions"]} -{"id": "1723", "title": "Positive productivity, better billing [health care]", "abstract": "Workflow software provides the right communication solution for hospital specialists, and delivers an unexpected financial boost too", "keyphrases": ["health care", "San Francisco General Hospital", "ProVation MD", "workflow software"]} -{"id": "1808", "title": "Nonlinearities in NARX polynomial models: representation and estimation", "abstract": "It is shown how nonlinearities are mapped in NARX polynomial models. General expressions are derived for the gain and eigenvalue functions in terms of the regressors and coefficients of NARX models. Such relationships are useful in grey-box identification problems. 
The results are illustrated using simulated and real data", "keyphrases": ["NARX polynomial model nonlinearities", "nonlinearity representation", "nonlinearity estimation", "gain functions", "eigenvalue functions", "regressors", "grey-box identification problems", "nonlinear autoregressive exogenous-input polynomial model"]} -{"id": "1467", "title": "Utilizing Web-based case studies for cutting-edge information services issues", "abstract": "This article reports on a pilot study conducted by the Academic Libraries of the 21st Century project team to determine whether the benefits of the case study method as a training framework for change initiatives could successfully transfer from the traditional face-to-face format to a virtual format. Methods of developing the training framework, as well as the benefits, challenges, and recommendations for future strategies gained from participant feedback are outlined. The results of a survey administered to chat session registrants are presented in three sections: (1) evaluation of the training framework; (2) evaluation of participants' experiences in the virtual environment; and (3) a comparison of participants' preference of format. The overall participant feedback regarding the utilization of the case study method in a virtual environment for professional development and collaborative problem solving is very positive", "keyphrases": ["Web-based case studies", "cutting-edge information services", "academic libraries", "training", "change initiatives", "survey", "virtual environment", "professional development", "Internet", "collaborative problem solving"]} -{"id": "1686", "title": "Internet infrastructure and the emerging information society: an appraisal of the Internet backbone industry", "abstract": "This paper examines the real constraints to the expansion of all encumbering and all pervasive information technology in our contemporary society. Perhaps the U.S. 
Internet infrastructure is the most appropriate to examine since it is U.S. technology that has led the world into the Internet age. In this context, this paper reviews the state of the U.S. Internet backbone that will lead us into the information society of the future by facilitating massive data transmission", "keyphrases": ["Internet infrastructure", "Internet service providers", "users", "backbone companies", "local telephone companies"]} -{"id": "1915", "title": "Multichannel scaler for general statistical analysis of dynamic light scattering", "abstract": "A four channel scaler for counting applications has been designed and built using a standard high transfer rate parallel computer interface bus parallel data card. The counter section is based on standard complex programmable logic device integrated circuits. With a 200 MHz Pentium based host PC a sustained counting and data transfer with channel widths as short as 200 ns for a single channel is realized. The use of the multichannel scaler is demonstrated in dynamic light scattering experiments. The recorded traces are analyzed with wavelet and other statistical techniques to obtain transient changes in the properties of the scattered light", "keyphrases": ["multichannel scaler", "general statistical analysis", "dynamic light scattering", "correlation spectroscopy", "optical spectroscopic techniques", "photon signal statistical properties", "four channel scaler", "standard high transfer rate parallel computer interface", "interface bus parallel data card", "complex programmable logic device", "standard CPLD ICs", "Pentium based host PC", "windowed Fourier transform", "200 MHz", "200 ns"]} -{"id": "1603", "title": "Exploiting structure in adaptive dynamic programming algorithms for a stochastic batch service problem", "abstract": "The purpose of this paper is to illustrate the importance of using structural results in dynamic programming algorithms. 
We consider the problem of approximating optimal strategies for the batch service of customers at a service station. Customers stochastically arrive at the station and wait to be served, incurring a waiting cost and a service cost. Service of customers is performed in groups of a fixed service capacity. We investigate the structure of cost functions and establish some theoretical results including monotonicity of the value functions. Then, we use our adaptive dynamic programming monotone algorithm that uses structure to preserve monotonicity of the estimates at each iteration to approximate the value functions. Since the problem with homogeneous customers can be solved optimally, we have a means of comparison to evaluate our heuristic. Finally, we compare our algorithm to classical forward dynamic programming methods", "keyphrases": ["stochastic batch service problem", "adaptive dynamic programming algorithms", "structural results", "optimal strategy approximation", "service station", "waiting cost", "service cost", "fixed service capacity", "cost function structure", "value function monotonicity", "inventory theory"]} -{"id": "1646", "title": "The limits of shape constancy: point-to-point mapping of perspective projections of flat figures", "abstract": "The present experiments investigate point-to-point mapping of perspective transformations of 2D outline figures under diverse viewing conditions: binocular free viewing, monocular perspective with 2D cues masked by an optic tunnel, and stereoptic viewing through an optic tunnel. The first experiment involved upright figures, and served to determine baseline point-to-point mapping accuracy, which was found to be very good. Three shapes were used: square, circle and irregularly round. The main experiment, with slanted figures, involved only two shapes-square and irregularly shaped-shown at several slant degrees. 
Despite the accumulated evidence for shape constancy when the outline of perspective projections is considered, metric perception of the inner structure of such projections was quite limited. Systematic distortions were found, especially with more extreme slants, and attributed to the joint effect of several factors: anchors, 3D information, and slant underestimation. Contradictory flatness cues did not detract from performance, while stereoptic information improved it", "keyphrases": ["shape constancy", "point-to-point mapping", "flat figure perspective projections", "experiments", "2D outline figures", "diverse viewing conditions", "binocular free viewing", "monocular perspective", "2D cues", "optic tunnel", "stereoptic viewing", "3D shape perception", "human factors", "3D information displays", "anchors", "3D information", "slant underestimation"]} -{"id": "1928", "title": "Solution of a Euclidean combinatorial optimization problem by the dynamic-programming method", "abstract": "A class of Euclidean combinatorial optimization problems is selected that can be solved by the dynamic programming method. The problem of allocation of servicing enterprises is solved as an example", "keyphrases": ["Euclidean combinatorial optimization problem", "dynamic programming method"]} -{"id": "157", "title": "Automatic extraction of eye and mouth fields from a face image using eigenfeatures and ensemble networks", "abstract": "This paper presents a novel algorithm for the extraction of the eye and mouth (facial features) fields from 2D gray level images. Eigenfeatures are derived from the eigenvalues and eigenvectors of the binary edge data set constructed from eye and mouth fields. Such eigenfeatures are ideal features for finely locating fields efficiently. 
The eigenfeatures are extracted from a set of the positive and negative training samples for facial features and are used to train a multilayer perceptron (MLP) whose output indicates the degree to which a particular image window contains the eyes or the mouth within itself. An ensemble network consisting of a multitude of independent MLPs was used to enhance the generalization performance of a single MLP. It was experimentally verified that the proposed algorithm is robust against facial size and even slight variations of the pose", "keyphrases": ["eye field extraction", "mouth field extraction", "face feature extraction", "2D gray level images", "eigenvalues", "eigenvectors", "binary edge data set", "training samples", "multilayer perceptron", "generalization", "experiment", "eigenfeatures", "ensemble neural networks"]} -{"id": "1502", "title": "Mining open answers in questionnaire data", "abstract": "Surveys are important tools for marketing and for managing customer relationships; the answers to open-ended questions, in particular, often contain valuable information and provide an important basis for business decisions. The summaries that human analysts make of these open answers, however, tend to rely too much on intuition and so aren't satisfactorily reliable. Moreover, because the Web makes it so easy to take surveys and solicit comments, companies are finding themselves inundated with data from questionnaires and other sources. Handling it all manually would be not only cumbersome but also costly. Thus, devising a computer system that can automatically mine useful information from open answers has become an important issue. We have developed a survey analysis system that works on these principles. 
The system mines open answers through two statistical learning techniques: rule learning (which we call rule analysis) and correspondence analysis", "keyphrases": ["natural language response analysis", "survey analysis", "text mining system", "questionnaire data", "statistical learning techniques", "rule analysis", "correspondence analysis", "open answer mining"]} -{"id": "1547", "title": "New projection-type methods for monotone LCP with finite termination", "abstract": "In this paper we establish two new projection-type methods for the solution of the monotone linear complementarity problem (LCP). The methods are a combination of the extragradient method and the Newton method, in which the active set strategy is used and only one linear system of equations with lower dimension is solved at each iteration. It is shown that under the assumption of monotonicity, these two methods are globally and linearly convergent. Furthermore, under a nondegeneracy condition they have a finite termination property. Finally, the methods are extended to solving the monotone affine variational inequality problem", "keyphrases": ["projection-type methods", "monotone LCP", "finite termination", "monotone linear complementarity problem", "extragradient method", "Newton method", "active set strategy", "linear system of equations", "iteration", "monotonicity", "convergence", "nondegeneracy condition", "monotone affine variational inequality problem", "matrix", "vector"]} -{"id": "1835", "title": "Establishing an urban digital cadastre: analytical reconstruction of parcel boundaries", "abstract": "A new method for generating a spatially accurate, legally supportive and operationally efficient cadastral database of the urban cadastral reality is described. 
The definition and compilation of an accurate cadastral database (achieving a standard deviation smaller than 0.1 m) is based on an analytical reconstruction of cadastral boundaries rather than on the conventional field reconstruction process. The new method is based on GPS control points and traverse networks for providing the framework; the old field books for defining the links between the various original ground features; and a geometrical and cadastral adjustment process as the conceptual basis. A pilot project that was carried out in order to examine and evaluate the new method is described", "keyphrases": ["urban digital cadastre", "analytical reconstruction", "parcel boundaries", "spatially accurate cadastral database", "urban cadastral reality", "standard deviation", "field reconstruction process", "GPS control points", "traverse networks", "old field books", "ground features", "cadastral adjustment process", "land information systems", "LIS", "geographic information systems"]} -{"id": "1870", "title": "Robust control of nonlinear systems with parametric uncertainty", "abstract": "Probabilistic robustness analysis and synthesis for nonlinear systems with uncertain parameters are presented. Monte Carlo simulation is used to estimate the likelihood of system instability and violation of performance requirements subject to variations of the probabilistic system parameters. Stochastic robust control synthesis searches the controller design parameter space to minimize a cost that is a function of the probabilities that design criteria will not be satisfied. The robust control design approach is illustrated by a simple nonlinear example. 
A modified feedback linearization control is chosen as controller structure, and the design parameters are searched by a genetic algorithm to achieve the tradeoff between stability and performance robustness", "keyphrases": ["robust control", "nonlinear systems", "parametric uncertainty", "probabilistic robustness analysis", "probabilistic robustness synthesis", "uncertain parameters", "Monte Carlo simulation", "system instability", "performance requirements violation", "stochastic control synthesis", "modified feedback linearization control", "genetic algorithm", "input-to-state stability"]} -{"id": "173", "title": "Stock market trading rule discovery using technical charting heuristics", "abstract": "In this case study in knowledge engineering and data mining, we implement a recognizer for two variations of the 'bull flag' technical charting heuristic and use this recognizer to discover trading rules on the NYSE Composite Index. Out-of-sample results indicate that these rules are effective", "keyphrases": ["stock market trading", "rule discovery", "technical charting heuristics", "financial expert system", "case study", "knowledge engineering", "data mining", "NYSE Composite Index", "out-of-sample results"]} -{"id": "1526", "title": "GK-DEVS: Geometric and kinematic DEVS formalism for simulation modeling of 3-dimensional multi-component systems", "abstract": "A combined discrete/continuous simulation methodology based on the DEVS (discrete event system specification) formalism is presented in this paper that satisfies the simulation requirements of 3-dimensional and dynamic systems with multi-components. We propose a geometric and kinematic DEVS (GK-DEVS) formalism that is able to describe the geometric and kinematic structure of a system and its continuous state dynamics as well as the interaction among the multi-components. 
To establish one model having dynamic behavior and a particular hierarchical structure, the atomic and the coupled model of the conventional DEVS are merged into one model in the proposed formalism. For simulation of the continuous motion of 3-D components, the sequential state set is partitioned into the discrete and the continuous state set and the rate of change function over the continuous state set is employed. Although modified from the conventional DEVS formalism, the GK-DEVS formalism preserves a hierarchical, modular modeling fashion and a coupling scheme. Furthermore, for the GK-DEVS model simulation, we propose an abstract simulation algorithm, called a GK-Simulator, in which data and control are separated and events are scheduled not globally but hierarchically so that an object-oriented principle is satisfied. The proposed GK-DEVS formalism and the GK-Simulator algorithm have been applied to the simulation of a flexible manufacturing system consisting of a 2-axis lathe, a 3-axis milling machine, and a vehicle-mounted robot", "keyphrases": ["GK-DEVS", "kinematic DEVS", "geometric DEVS", "simulation modeling", "3 dimensional multi-component systems", "combined discrete/continuous simulation methodology", "simulation requirements", "continuous state dynamics", "dynamic behavior", "continuous motion", "sequential state set", "abstract simulation algorithm", "GK-Simulator", "object-oriented principle", "flexible manufacturing system", "2-axis lathe", "3-axis milling machine", "vehicle-mounted robot"]} -{"id": "1563", "title": "A distance between elliptical distributions based in an embedding into the Siegel group", "abstract": "This paper describes two different embeddings of the manifolds corresponding to many elliptical probability distributions with the informative geometry into the manifold of positive-definite matrices with the Siegel metric, generalizing a result published previously elsewhere. 
These new general embeddings are applicable to a wide class of elliptical probability distributions, in which the normal, t-Student and Cauchy are specific examples. A lower bound for the Rao distance is obtained, which is itself a distance, and, through these embeddings, a number of statistical tests of hypothesis are derived", "keyphrases": ["elliptical distributions", "Siegel group", "manifolds embeddings", "informative geometry", "positive-definite matrices", "elliptical probability distributions", "lower bound"]} -{"id": "1627", "title": "Blind identification of non-stationary MA systems", "abstract": "A new adaptive algorithm for blind identification of time-varying MA channels is derived. This algorithm proposes the use of a novel system of equations derived by combining the third- and fourth-order statistics of the output signals of MA models. This overdetermined system of equations has the important property that it can be solved adaptively because of their symmetries via an overdetermined recursive instrumental variable-type algorithm. This algorithm shows good behaviour in arbitrary noisy environments and good performance in tracking time-varying systems", "keyphrases": ["blind identification", "time-varying channels", "nonstationary systems", "adaptive algorithm", "fourth-order statistics", "third-order statistics", "MA models", "overdetermined recursive algorithm", "recursive instrumental variable algorithm", "arbitrary noisy environments", "tracking", "iterative algorithms", "additive Gaussian noise", "higher-order statistics"]} -{"id": "1811", "title": "Adaptive tracking controller design for robotic systems using Gaussian wavelet networks", "abstract": "An adaptive tracking control design for robotic systems using Gaussian wavelet networks is proposed. 
A Gaussian wavelet network with accurate approximation capability is employed to approximate the unknown dynamics of robotic systems by using an adaptive learning algorithm that can learn the parameters of the dilation and translation of Gaussian wavelet functions. Depending on the finite number of wavelet basis functions which result in inevitable approximation errors, a robust control law is provided to guarantee the stability of the closed-loop robotic system that can be proved by Lyapunov theory. Finally, the effectiveness of the Gaussian wavelet network-based control approach is illustrated through comparative simulations on a six-link robot manipulator", "keyphrases": ["adaptive tracking controller design", "robotic systems", "Gaussian wavelet networks", "accurate approximation capability", "unknown dynamics", "adaptive learning algorithm", "approximation errors", "robust control law", "closed-loop system", "Lyapunov theory", "six-link robot manipulator"]} -{"id": "1483", "title": "Hypothesis-based concept assignment in software maintenance", "abstract": "Software maintenance accounts for a significant proportion of the lifetime cost of a software system. Software comprehension is required in many parts of the maintenance process and is one of the most expensive activities. Many tools have been developed to help the maintainer reduce the time and cost of this task, but of the numerous tools and methods available one group has received relatively little attention: those using plausible reasoning to address the concept assignment problem. We present a concept assignment method for COBOL II: hypothesis-based concept assignment (HB-CA). An implementation of a prototype tool is described, and results from a comprehensive evaluation using commercial COBOL II sources are summarised. 
In particular, we identify areas of a standard maintenance process where such methods would be appropriate, and discuss the potential cost savings that may result", "keyphrases": ["hypothesis-based concept assignment", "software maintenance", "lifetime cost", "COBOL II", "scalability"]} -{"id": "1854", "title": "Software Technology: looking for quality accountants", "abstract": "Software Technology wants to turn 23 years of reselling experience in the legal business into an asset in the accounting market", "keyphrases": ["Software Technology", "reselling", "accounting market"]} -{"id": "1782", "title": "Exploring the sabbatical or other leave as a means of energizing a career", "abstract": "This article challenges librarians to create leaves that will not only inspire professional growth but also renewal. It presents a framework for developing a successful leave, incorporating useful advice from librarians at Concordia University (Montreal). As food for thought, the article offers examples of specific options meant to encourage professionals to explore their own creative ideas. Finally, a central theme of this article is that a midlife leave provides one with the perfect opportunity to take stock of oneself in order to define future career directions. Midlife is a time when rebel forces, feisty protestors from within, often insist on being heard. It is a time, in other words, when professionals often long to break loose from the stress \"to do far more, in less time\" (Barner, 1994). Escaping from current job constraints into a world of creative endeavor, when well-executed, is a superb means of invigorating a career stuck in gear and discovering a fresh perspective from which to view one's profession. 
To ignite renewal, midcareer is the perfect time to grant one's imagination free rein", "keyphrases": ["sabbatical leave", "career", "librarians", "professional growth", "library staff", "midlife leave"]} -{"id": "1894", "title": "Switching controller design via convex polyhedral Lyapunov functions", "abstract": "We propose a systematic switching control design method for a class of nonlinear discrete time hybrid systems. The novelty of the adopted approach is in the fact that unlike conventional control the control burden is shifted to a logical level thus creating the need for the development of new analysis/design methods", "keyphrases": ["switching controller design", "convex polyhedral Lyapunov functions", "nonlinear discrete time hybrid systems", "systematic design method"]} -{"id": "1869", "title": "Stability and L/sub 2/ gain properties of LPV systems", "abstract": "Stability and L/sub 2/ gain properties of linear parameter-varying systems are obtained under assumed bounds on either the maximum or average value of the parameter rate", "keyphrases": ["stability", "L/sub 2/ gain properties", "linear parameter-varying systems", "parameter rate", "Gronwall-Bellman inequality", "gain scheduled control"]} -{"id": "1742", "title": "A sufficient condition for optimality in nondifferentiable invex programming", "abstract": "A sufficient optimality condition is established for a nonlinear programming problem without differentiability assumption on the data wherein Clarke's (1975) generalized gradient is used to define invexity", "keyphrases": ["nondifferentiable invex programming", "sufficient optimality condition", "nonlinear programming problem", "generalized gradient", "invexity", "locally Lipschitz function", "semiconvex function"]} -{"id": "1707", "title": "Tactical airborne reconnaissance goes dual-band and beyond", "abstract": "Multispectral imaging technologies are satisfying the need for a \"persistent\" look at the battlefield. 
We highlight the need to persistently monitor a battlefield to determine exactly who and what is there. For example, infrared imaging can be used to expose the fuel status of an aircraft on the runway. A daytime, visible-spectrum image of the same aircraft would offer information about external details, such as the plane's markings and paint scheme. A dual-band camera enables precision image registration by fusion and frequently yields more information than is possible by evaluating the images separately", "keyphrases": ["tactical airborne reconnaissance", "multispectral imaging technologies", "battlefield", "infrared imaging", "fuel status", "aircraft", "daytime visible-spectrum image", "dual-band camera", "precision image registration", "sensor fusion"]} -{"id": "1740", "title": "Verification of ideological classifications-a statistical approach", "abstract": "The paper presents a statistical method of verifying ideological classifications of votes. Parliamentary votes, preclassified by an expert (on a chosen subset), are verified at an assumed significance level by seeking the most likely match with the actual vote results. Classifications that do not meet the requirements defined are rejected. The results obtained can be applied in the ideological dimensioning algorithms, enabling ideological identification of dimensions obtained", "keyphrases": ["ideological classifications", "statistical approach", "parliamentary votes", "significance level", "ideological dimensioning algorithms", "ideological space", "bootstrap"]} -{"id": "1705", "title": "The use of visual search for knowledge gathering in image decision support", "abstract": "This paper presents a new method of knowledge gathering for decision support in image understanding based on information extracted from the dynamics of saccadic eye movements. 
The framework involves the construction of a generic image feature extraction library, from which the feature extractors that are most relevant to the visual assessment by domain experts are determined automatically through factor analysis. The dynamics of the visual search are analyzed by using the Markov model for providing training information to novices on how and where to look for image features. The validity of the framework has been evaluated in a clinical scenario whereby the pulmonary vascular distribution on Computed Tomography images was assessed by experienced radiologists as a potential indicator of heart failure. The performance of the system has been demonstrated by training four novices to follow the visual assessment behavior of two experienced observers. In all cases, the accuracy of the students improved from near random decision making (33%) to accuracies ranging from 50% to 68%", "keyphrases": ["pulmonary vascular distribution", "experienced radiologists", "heart failure indicator", "visual assessment behavior", "experienced observers", "student accuracy", "Markov model", "training information", "image features", "domain experts", "saccadic eye movements dynamics", "near random decision making", "medical diagnostic imaging"]} -{"id": "1896", "title": "The dynamics of a railway freight wagon wheelset with dry friction damping", "abstract": "We investigate the dynamics of a simple model of a wheelset that supports one end of a railway freight wagon by springs with linear characteristics and dry friction dampers. The wagon runs on an ideal, straight and level track with constant speed. The lateral dynamics in dependence on the speed is examined. We have included stick-slip and hysteresis in our model of the dry friction and assume that Coulomb's law holds during the slip phase. 
It is found that the action of dry friction completely changes the bifurcation diagram, and that the longitudinal component of the dry friction damping forces destabilizes the wagon", "keyphrases": ["dynamics", "railway freight wagon wheelset", "dry friction damping", "linear characteristics", "lateral dynamics", "stick-slip", "hysteresis", "Coulomb law", "bifurcation diagram", "longitudinal component"]} -{"id": "1519", "title": "Structural invariance of spatial Pythagorean hodographs", "abstract": "The structural invariance of the four-polynomial characterization for three-dimensional Pythagorean hodographs introduced by Dietz et al. (1993), under arbitrary spatial rotations, is demonstrated. The proof relies on a factored-quaternion representation for Pythagorean hodographs in three-dimensional Euclidean space-a particular instance of the \"PH representation map\" proposed by Choi et al. (2002)-and the unit quaternion description of spatial rotations. This approach furnishes a remarkably simple derivation for the polynomials u(t), upsilon (t), p(t), q(t) that specify the canonical form of a rotated Pythagorean hodograph, in terms of the original polynomials u(t), upsilon (t), p(t), q(t) and the angle theta and axis n of the spatial rotation. 
The preservation of the canonical form of PH space curves under arbitrary spatial rotations is essential to their incorporation into computer-aided design and manufacturing applications, such as the contour machining of free-form surfaces using a ball-end mill and real-time PH curve CNC interpolators", "keyphrases": ["structural invariance", "four-polynomial characterization", "spatial Pythagorean hodographs", "3D Pythagorean hodographs", "arbitrary spatial rotations", "factored quaternion representation", "3D Euclidean space", "PH representation map", "unit quaternion description", "spatial rotations", "CAD/CAM", "contour machining", "free-form surfaces", "ball-end mill", "real-time PH curve CNC interpolators"]} -{"id": "1618", "title": "Optimal learning for patterns classification in RBF networks", "abstract": "A proposed modification of the structure of the radial basis function (RBF) network by introducing the weight matrix to the input layer (in contrast to the direct connection of the input to the hidden layer of a conventional RBF) so that the training space in the RBF network is adaptively separated by the resultant decision boundaries and class regions is reported. The training of this weight matrix is carried out as for a single-layer perceptron together with the clustering process. 
In this way the network is capable of dealing with complicated problems, which have a high degree of interference in the training data, and achieves a higher classification rate than current RBF-based classifiers", "keyphrases": ["pattern classification", "optimal learning", "RBF networks", "radial basis function network", "weight matrix training", "input layer", "training space", "decision boundaries", "class regions", "single-layer perceptron", "clustering process", "classification rate improvement"]} -{"id": "1625", "title": "Use of fuzzy weighted autocorrelation function for pitch extraction from noisy speech", "abstract": "An investigation is presented into the feasibility of incorporating a fuzzy weighting scheme into the calculation of an autocorrelation function for pitch extraction. Simulation results reveal that the proposed method provides better robustness against background noise than the conventional approaches for extracting pitch period in a noisy environment", "keyphrases": ["pitch extraction", "noisy speech", "fuzzy weighting scheme", "autocorrelation function", "simulation results", "background noise", "speech analysis-synthesis system", "average magnitude difference function", "cepstrum method"]} -{"id": "1660", "title": "A regularized conjugate gradient method for symmetric positive definite system of linear equations", "abstract": "A class of regularized conjugate gradient methods is presented for solving the large sparse system of linear equations of which the coefficient matrix is an ill-conditioned symmetric positive definite matrix. The convergence properties of these methods are discussed in depth, and the best possible choices of the parameters involved in the new methods are investigated in detail. 
Numerical computations show that the new methods are more efficient and robust than both classical relaxation methods and classical conjugate direction methods", "keyphrases": ["regularized conjugate gradient method", "symmetric positive definite system", "linear equations", "large sparse system", "coefficient matrix", "convergence properties", "classical relaxation methods", "classical conjugate direction methods", "ill-conditioned linear system"]} -{"id": "171", "title": "Education, training and development policies and practices in medium-sized companies in the UK: do they really influence firm performance?", "abstract": "This paper sets out to examine the relationship between training and firm performance in middle-sized UK companies. It recognises that there is evidence that \"high performance work practices\" appear to be associated with better performance in large US companies, but argues that this relationship is less likely to be present in middle-sized companies. The paper's key contribution is to justify the wider concept of education, training and development (ETD) as applicable to such companies. It then finds that clusters of some ETD variables do appear to be associated with better middle-sized company performance", "keyphrases": ["medium-sized UK companies", "training", "firm performance", "education", "development policies", "high performance work practices", "ETD variable clusters", "human resources"]} -{"id": "1524", "title": "Organizational design, information transfer, and the acquisition of rent-producing resources", "abstract": "Within the resource-based view of the firm, a dynamic story has emerged in which the knowledge accumulated over the history of a firm and embedded in organizational routines and structures influences the firm's ability to recognize the value of new resources and capabilities. 
This paper explores how firms can select organizational designs that increase the likelihood that they will recognize and value rent-producing resources and capabilities. A computational model is developed to study the tension between an organization's desire to explore its environment for new capabilities and the organization's need to exploit existing capabilities. Support is provided for the proposition that integration, both externally and internally, is an important source of dynamic capability. The model provides greater insight into the tradeoffs between these two forms of integration and suggests when one form may be preferred over another. In particular, evidence is provided that in uncertain environments, the ability to explore possible alternatives is critical while in more certain environments, the ability to transfer information internally is paramount", "keyphrases": ["organizational design", "information transfer", "rent-producing resources", "computational model", "uncertain environments", "probability", "certain environments", "social networks", "business strategy", "investments"]} -{"id": "1561", "title": "Self-validating integration and approximation of piecewise analytic functions", "abstract": "Let an analytic or a piecewise analytic function on a compact interval be given. We present algorithms that produce enclosures for the integral or the function itself. Under certain conditions on the representation of the function, this is done with the minimal order of numbers of operations. 
The integration algorithm is implemented and numerical comparisons to non-validating integration software are presented", "keyphrases": ["self-validating integration", "self-validating approximation", "compact interval", "enclosures", "minimal order", "integration algorithm", "complex interval arithmetic", "piecewise analytic functions"]} -{"id": "1780", "title": "Migrating to public librarianship: depart on time to ensure a smooth flight", "abstract": "Career change can be a difficult, time-consuming, and anxiety-laden process for anyone contemplating this important decision. The challenges faced by librarians considering the move from academic to public librarianship can be equally and significantly demanding. To most outsiders, at least on the surface, it may appear to be a quick and easy transition to make, but some professional librarians recognize the distinct differences between these areas of librarianship. Although the ubiquitous nature of technology has brought the various work responsibilities of academic and public librarians closer together during the last decade, there remain key differences in job-related duties and the work environments. These dissimilarities pose meaningful hurdles to leap for academic librarians wishing to migrate to the public sector. The paper considers the variations between academic and public librarianship", "keyphrases": ["public librarianship", "career change", "academic library", "public library", "professional librarians", "library technology", "work responsibilities", "job-related duties", "work environments"]} -{"id": "1738", "title": "Nurture the geek in you [accounting on the Internet]", "abstract": "When chartered accountants focus on IT, it's not simply because we think technology is neat. We keep on top of tech trends and issues because it helps us do our jobs well. 
We need to know how to best manage and implement the wealth of technology systems within our client base or employer, as well as to determine on an ongoing basis how evolving technologies might affect business strategies, threats and opportunities. One way to stay current with technology is by monitoring the online drumbeat. Imagine the Internet as an endless conversation of millions of chattering voices, each focusing on a multitude of topics and issues. It's not surprising that a great deal of the information relates to technology itself, and if you learn how to tune in to the drumbeat, you can keep yourself informed", "keyphrases": ["chartered accountants", "Internet", "information technology", "Slashdot", "Techdirt", "The Register", "Dan Gillmor's Wournal", "Daypop Top 40", "RISKS", "SecurityFocus", "TechWeb"]} -{"id": "1813", "title": "LMI approach to digital redesign of linear time-invariant systems", "abstract": "A simple design methodology for the digital redesign of static state feedback controllers by using linear matrix inequalities is presented. The proposed method provides close matching of the states between the original continuous-time system and those of the digitally redesigned system with a guaranteed stability. Specifically, the digital redesign problem is reformulated as linear matrix inequalities (LMIs) and solved by a numerical optimisation technique. The main feature of the proposed method is that the closed-loop stability of the digitally redesigned system is explicitly guaranteed within the design procedure using the LMI-based approach. 
A numerical example of the position control of a simple crane system is presented", "keyphrases": ["LMI approach", "digital redesign", "linear time-invariant systems", "design methodology", "linear matrix inequalities", "continuous-time system", "guaranteed stability", "numerical optimisation technique", "closed-loop stability", "position control", "crane system"]} -{"id": "1481", "title": "Impact of aviation highway-in-the-sky displays on pilot situation awareness", "abstract": "Thirty-six pilots (31 men, 5 women) were tested in a flight simulator on their ability to intercept a pathway depicted on a highway-in-the-sky (HITS) display. While intercepting and flying the pathway, pilots were required to watch for traffic outside the cockpit. Additionally, pilots were tested on their awareness of speed, altitude, and heading during the flight. Results indicated that the presence of a flight guidance cue significantly improved flight path awareness while intercepting the pathway, but significant practice effects suggest that a guidance cue might be unnecessary if pilots are given proper training. The amount of time spent looking outside the cockpit while using the HITS display was significantly less than when using conventional aircraft instruments. Additionally, awareness of flight information present on the HITS display was poor. Actual or potential applications of this research include guidance for the development of perspective flight display standards and as a basis for flight training requirements", "keyphrases": ["flight simulator", "pilots", "highway-in-the-sky display", "cockpit", "flight guidance", "human factors", "situation awareness", "flight path awareness", "aircraft display"]} -{"id": "1856", "title": "Tax forms: CD or not CD?", "abstract": "The move from CD to the Web looks unstoppable. 
Besides counting how many thousands of electronic tax forms they offer, vendors are rapidly moving those documents to the Web", "keyphrases": ["electronic tax forms", "Web", "ATX Forms Zillion Forms", "CCH Perform Plus H", "Kleinrock Forms Library Plus", "Nelco LaserLibrarian II", "RIA eForm", "STF Services Superform", "Universal Tax Systems Forms Complete"]} -{"id": "155", "title": "Fuzzy non-homogeneous Markov systems", "abstract": "In this paper the theory of fuzzy logic and fuzzy reasoning is combined with the theory of Markov systems and the concept of a fuzzy non-homogeneous Markov system is introduced for the first time. This is an effort to deal with the uncertainty introduced in the estimation of the transition probabilities and the input probabilities in Markov systems. The asymptotic behaviour of the fuzzy Markov system and its asymptotic variability is considered and given in closed analytic form. Moreover, the asymptotically attainable structures of the system are estimated also in a closed analytic form under some realistic assumptions. The importance of this result lies in the fact that in most cases the traditional methods for estimating the probabilities can not be used due to lack of data and measurement errors. The introduction of fuzzy logic into Markov systems represents a powerful tool for taking advantage of the symbolic knowledge that the experts of the systems possess", "keyphrases": ["fuzzy nonhomogeneous Markov systems", "fuzzy logic", "fuzzy reasoning", "uncertainty", "transition probabilities", "input probabilities", "asymptotic variability", "measurement errors", "symbolic knowledge", "probability theory"]} -{"id": "1500", "title": "DAML+OIL: an ontology language for the Semantic Web", "abstract": "By all measures, the Web is enormous and growing at a staggering rate, which has made it increasingly difficult-and important-for both people and programs to have quick and accurate access to Web information and services. 
The Semantic Web offers a solution, capturing and exploiting the meaning of terms to transform the Web from a platform that focuses on presenting information, to a platform that focuses on understanding and reasoning with information. To support Semantic Web development, the US Defense Advanced Research Projects Agency launched the DARPA Agent Markup Language (DAML) initiative to fund research in languages, tools, infrastructure, and applications that make Web content more accessible and understandable. Although the US government funds DAML, several organizations-including US and European businesses and universities, and international consortia such as the World Wide Web Consortium-have contributed to work on issues related to DAML's development and deployment. We focus on DAML's current markup language, DAML+OIL, which is a proposed starting point for the W3C's Semantic Web Activity's Ontology Web Language (OWL). We introduce DAML+OIL syntax and usage through a set of examples, drawn from a wine knowledge base used to teach novices how to build ontologies", "keyphrases": ["Semantic Web", "DARPA Agent Markup Language", "DAML+OIL", "Ontology Web Language", "syntax", "wine knowledge base"]} -{"id": "1545", "title": "Pontryagin maximum principle of optimal control governed by fluid dynamic systems with two point boundary state constraint", "abstract": "We study the optimal control problem subject to the semilinear equation with a state constraint. We prove certain theorems and give examples of state constraints so that the maximum principle holds. 
The main difficulty of the problem lies in the sensitivity analysis of the state with respect to the control, which is complicated by the unboundedness and nonlinearity of an operator", "keyphrases": ["Pontryagin maximum principle", "optimal control", "fluid dynamics", "semilinear equation", "state constraints"]} -{"id": "1601", "title": "Solving the multiple competitive facilities location problem", "abstract": "In this paper we propose five heuristic procedures for the solution of the multiple competitive facilities location problem. A franchise of several facilities is to be located in a trade area where competing facilities already exist. The objective is to maximize the market share captured by the franchise as a whole. We perform extensive computational tests and conclude that a two-step heuristic procedure combining simulated annealing and an ascent algorithm provides the best solutions", "keyphrases": ["multiple competitive facilities location problem", "heuristic procedures", "facilities franchise", "market share maximization", "computational tests", "two-step heuristic procedure", "simulated annealing", "ascent algorithm"]} -{"id": "1644", "title": "An experimental evaluation of comprehensibility aspects of knowledge structures derived through induction techniques: a case study of industrial fault diagnosis", "abstract": "Machine induction has been extensively used in order to develop knowledge bases for decision support systems and predictive systems. The extent to which developers and domain experts can comprehend these knowledge structures and gain useful insights into the basis of decision making has become a challenging research issue. This article examines the knowledge structures generated by the C4.5 induction technique in a fault diagnostic task and proposes to use a model of human learning in order to guide the process of making comprehensible the results of machine induction. 
The model of learning is used to generate hierarchical representations of diagnostic knowledge by adjusting the level of abstraction and varying the goal structures between 'shallow' and 'deep' ones. Comprehensibility is assessed in a global way in an experimental comparison where subjects are required to acquire the knowledge structures and transfer to new tasks. This method of addressing the issue of comprehensibility appears promising especially for machine induction techniques that are rather inflexible with regard to the number and sorts of interventions allowed to system developers", "keyphrases": ["experimental evaluation", "knowledge structure comprehensibility aspects", "induction techniques", "case study", "industrial fault diagnosis", "knowledge bases", "decision support systems", "predictive systems", "C4.5 induction technique", "industrial plants", "human learning model", "diagnostic knowledge representations"]} -{"id": "1837", "title": "A review of methodologies used in research on cadastral development", "abstract": "World-wide, much attention has been given to cadastral development. As a consequence of experiences made during recent decades, several authors have stated the need for research in the domain of cadastre and proposed methodologies to be used. The paper contributes to the acceptance of research methodologies needed for cadastral development, and thereby enhances theory in the cadastral domain. The paper reviews nine publications on cadastre and identifies the methodologies used. The review focuses on the institutional, social, political and economic aspects of cadastral development, rather than on the technical aspects. The main conclusion is that the methodologies used are largely those of the social sciences. That agrees with the notion that cadastre relates as much to people and institutions, as it relates to land, and that cadastral systems are shaped by social, political and economic conditions, as well as technology. 
Since the geodetic survey profession has been the keeper of the cadastre, geodetic surveyors will have to deal ever more with social science matters, a fact that universities will have to consider", "keyphrases": ["cadastral development methodologies", "cadastre", "research methodologies", "political aspects", "economic aspects", "social sciences", "economic conditions", "geodetic survey profession", "geodetic surveyors", "land registration", "case study"]} -{"id": "1872", "title": "TPTP, CASC and the development of a semantically guided theorem prover", "abstract": "The first-order theorem prover SCOTT has been through a series of versions over some ten years. The successive provers, while retaining the same underlying technology, have used radically different algorithms and shown wide differences of behaviour. The development process has depended heavily on experiments with problems from the TPTP library and has been sharpened by participation in CASC each year since 1997. We outline some of the difficulties inherent in designing and refining a theorem prover as complex as SCOTT, and explain our experimental methodology. While SCOTT is not one of the systems which have been highly optimised for CASC, it does help to illustrate the influence of both CASC and the TPTP library on contemporary theorem proving research", "keyphrases": ["TPTP library", "Semantically Constrained Otter", "proof searches", "CASC", "semantically guided theorem prover", "first-order theorem prover", "SCOTT", "experimental methodology"]} -{"id": "1759", "title": "On the p-adic Birch, Swinnerton-Dyer Conjecture for non-semistable reduction", "abstract": "In this paper, we examine the Iwasawa theory of elliptic curves E with additive reduction at an odd prime p. By extending Perrin-Riou's theory to certain nonsemistable representations, we are able to convert Kato's zeta-elements into p-adic L-functions. 
This allows us to deduce the cotorsion of the Selmer group over the cyclotomic Z/sub p/-extension of Q, and thus prove an inequality in the p-adic Birch and Swinnerton-Dyer conjecture at primes p whose square divides the conductor of E", "keyphrases": ["p-adic Birch", "Swinnerton-Dyer conjecture", "nonsemistable reduction", "Iwasawa theory", "elliptic curves", "additive reduction", "Perrin-Riou's theory", "p-adic L-functions", "cotorsion", "Selmer group", "cyclotomic Z/sub p/-extension"]} -{"id": "1799", "title": "Steady-state mean-square error analysis of the cross-correlation and constant modulus algorithm in a MIMO convolutive system", "abstract": "The cross-correlation and constant modulus algorithm (CC-CMA) has been proven to be an effective approach in the problem of joint blind equalisation and source separation in a multi-input and multi-output system. In the paper, the steady-state mean-square error performance of CC-CMA in a noise-free environment is studied, and a new expression is derived based on the energy preservation approach of Mai and Sayed (2000). Simulation studies are undertaken to support the analysis", "keyphrases": ["MIMO convolutive system", "Steady-state mean-square error analysis", "cross-correlation", "constant modulus algorithm", "joint blind equalisation", "source separation", "multi-input multi-output system", "noise-free environment", "energy preservation approach", "CC-CMA"]} -{"id": "1465", "title": "P systems with symport/antiport rules: the traces of objects", "abstract": "We continue the study of those P systems where the computation is performed by the communication of objects, that is, systems with symport and antiport rules. Instead of the (number of) objects collected in a specified membrane, as the result of a computation we consider the itineraries of a certain object through membranes, during a halting computation, written as a coding of the string of labels of the visited membranes. 
The family of languages generated in this way is investigated with respect to its place in the Chomsky hierarchy. When the (symport and antiport) rules are applied in a conditional manner, promoted or inhibited by certain objects which should be present in the membrane where a rule is applied, then a characterization of recursively enumerable languages is obtained; the power of systems with the rules applied freely is only partially described", "keyphrases": ["P systems", "object communication", "object traces", "antiport rules", "symport rules", "itineraries", "halting computation", "label string coding", "languages", "Chomsky hierarchy", "recursively enumerable languages"]} -{"id": "1498", "title": "John McCarthy: father of AI", "abstract": "If John McCarthy, the father of AI, were to coin a new phrase for \"artificial intelligence\" today, he would probably use \"computational intelligence.\" McCarthy is not just the father of AI, he is also the inventor of the Lisp (list processing) language. The author considers McCarthy's conception of Lisp and discusses McCarthy's recent research that involves elaboration tolerance, creativity by machines, free will of machines, and some improved ways of doing situation calculus", "keyphrases": ["John McCarthy", "father of AI", "artificial intelligence", "computational intelligence", "Lisp", "list processing", "elaboration tolerance", "creativity", "free will", "situation calculus"]} -{"id": "1764", "title": "Two-scale curved element method for elliptic problems with small periodic coefficients", "abstract": "This paper is concerned with the second order elliptic problems with small periodic coefficients on a bounded domain with a curved boundary. A two-scale curved element method which couples linear elements and isoparametric elements is proposed. The error estimate is obtained over the given smooth domain. 
Furthermore an additive Schwarz method is provided for the isoparametric element method", "keyphrases": ["two-scale curved element method", "elliptic problems", "small periodic coefficients", "second order elliptic problems", "bounded domain", "curved boundary", "linear elements", "isoparametric elements", "error estimate", "additive Schwarz method", "isoparametric element method"]} -{"id": "1721", "title": "Dueling platforms [healthcare network servers]", "abstract": "Many large hospitals and healthcare systems have grown accustomed to the reliability of mainframe architecture, although tighter operating budgets, coupled with advances in client/server technology, have led to more office and clinical applications being moved off mainframes. But Evanston Northwestern Healthcare wasn't ready to get rid of its IBM OS 390 mainframe just yet. While a number of new clinical applications are being installed on two brand new IBM servers, Evanston Northwestern Healthcare will retain its favored hospital billing system and let it reside on the organization's mainframe, as it has since 1982", "keyphrases": ["network servers", "Evanston Northwestern Healthcare", "IBM OS 390 mainframe", "Leapfrog Group", "computerized physician order entry system"]} -{"id": "1917", "title": "Design and modeling of an interval-based ABR flow control protocol", "abstract": "A novel flow control protocol is presented for availability bit rate (ABR) service in asynchronous transfer mode (ATM) networks. This scheme features periodic explicit rate feedback that enables precise allocation of link bandwidth and buffer space on a hop-by-hop basis to guarantee maximum throughput, minimum cell loss, and high resource efficiency. With the inclusion of resource management cell synchronization and consolidation algorithms, this protocol is capable of controlling point-to-multipoint ABR services within a unified framework. 
The authors illustrate the modeling of a single ABR connection, the interaction between multiple ABR connections, and the constraints applicable to flow control decisions. A loss-free flow control mechanism is presented for high-speed ABR connections using a fluid traffic model. Supporting algorithms and ATM signaling procedures are specified, together with linear system modeling, numerical analysis, and simulation results, which demonstrate its performance and cost benefits in high-speed backbone networking scenarios", "keyphrases": ["interval-based ABR flow control protocol", "modeling", "design", "availability bit rate service", "ATM networks", "periodic explicit rate feedback", "link bandwidth allocation", "buffer space allocation", "maximum throughput", "minimum cell loss", "high resource efficiency", "resource management cell synchronization algorithms", "resource management cell consolidation algorithms", "point-to-multipoint services", "flow control decisions", "loss-free flow control mechanism", "high-speed ABR connections", "fluid traffic model", "signaling", "linear system modeling", "numerical analysis", "simulation", "high-speed backbone networking scenarios"]} -{"id": "1679", "title": "Project scheduling under time dependent costs-a branch and bound algorithm", "abstract": "In a given project network, execution of each activity in normal duration requires utilization of certain resources. If faster execution of an activity is desired then additional resources at extra cost would be required. Given a project network, the cost structure for each activity and a planning horizon, the project compression problem is concerned with the determination of an optimal schedule for performing each activity while satisfying given restrictions and minimizing the total cost of project execution. The paper considers the project compression problem with time dependent cost structure for each activity. 
The planning horizon is divided into several regular time intervals over which the cost structure of an activity may vary. But the cost structure of the activities remains the same within a time interval. The objective is to find an optimal project schedule minimizing the total project cost. We present a mathematical model for this problem, develop some heuristics and an exact branch and bound algorithm. Using simulated problems we provide an insight into the computational performances of heuristics and the branch and bound algorithm", "keyphrases": ["project scheduling", "time dependent costs", "branch and bound algorithm", "project network", "planning horizon", "project compression problem", "optimal schedule", "heuristics"]} -{"id": "1684", "title": "E-learning on the college campus: a help or hindrance to students learning objectives: a case study", "abstract": "If you know how to surf the World Wide Web, have used email before, and can learn how to send an email attachment, then learning how to interact in an online course should not be difficult at all. In a way to find out, I decided to offer two identical courses, one of which would be offered online and the other the \"traditional way\". I wanted to see how students would fare with identical material provided in each course. I wanted their anonymous feedback, when the course was over", "keyphrases": ["distance education", "William Paterson University", "e-learning"]} -{"id": "168", "title": "Nurturing clients' trust to encourage engagement success during the customization of ERP systems", "abstract": "Customization is a crucial, lengthy, and costly aspect in the successful implementation of ERP systems, and has, accordingly, become a major specialty of many vendors and consulting companies. The study examines how such companies can increase their clients' perception of engagement success through increased client trust that is brought about through responsive and dependable customization. 
Survey data from ERP customization clients show that, as hypothesized, clients' trust influenced their perception of engagement success with the company. The data also show that clients' trust in the customization company was increased when the company behaved in accordance with client expectations by being responsive, and decreased when the company behaved in a manner that contradicted these expectations by not being dependable. Responses to an open-ended question addendum attached to the survey corroborated the importance of responsiveness and dependability. Implications for customization companies and research on trust are discussed", "keyphrases": ["client trust", "engagement success", "customization", "ERP systems", "enterprise resource planning systems", "vendors", "consulting companies", "perceived responsiveness", "MRP II implementation", "integrity", "benevolence", "dependability"]} -{"id": "1578", "title": "Records role in e-business", "abstract": "Records management standards are now playing a key role in e-business strategy", "keyphrases": ["e-business strategy", "records management"]} -{"id": "1829", "title": "Improved approximation of Max-Cut on graphs of bounded degree", "abstract": "Let alpha approximately=0.87856 denote the best approximation ratio currently known for the Max-Cut problem on general graphs. We consider a semidefinite relaxation of the Max-Cut problem, round it using the random hyperplane rounding technique of M.X. Goemans and D.P. Williamson (1995), and then add a local improvement step. We show that for graphs of degree at most Delta , our algorithm achieves an approximation ratio of at least alpha + epsilon , where epsilon >0 is a constant that depends only on Delta . Using computer assisted analysis, we show that for graphs of maximal degree 3 our algorithm obtains an approximation ratio of at least 0.921, and for 3-regular graphs the approximation ratio is at least 0.924. 
We note that for the semidefinite relaxation of Max-Cut used by Goemans and Williamson the integrality gap is at least 1/0.885, even for 2-regular graphs", "keyphrases": ["Max-Cut approximation", "semidefinite relaxation", "approximation ratio", "computer assisted analysis", "2-regular graphs", "bounded degree graph", "best approximation ratio"]} -{"id": "1702", "title": "Reconstruction of time-varying 3-D left-ventricular shape from multiview X-ray cineangiocardiograms", "abstract": "This paper reports on the clinical application of a system for recovering the time-varying three-dimensional (3-D) left-ventricular (LV) shape from multiview X-ray cineangiocardiograms. Considering that X-ray cineangiocardiography is still commonly employed in clinical cardiology and computational costs for 3-D recovery and visualization are rapidly decreasing, it is meaningful to develop a clinically applicable system for 3-D LV shape recovery from X-ray cineangiocardiograms. The system is based on a previously reported closed-surface method of shape recovery from two-dimensional occluding contours with multiple views. To apply the method to \"real\" LV cineangiocardiograms, user-interactive systems were implemented for preprocessing, including detection of LV contours, calibration of the imaging geometry, and setting of the LV model coordinate system. The results for three real LV angiographic image sequences are presented, two with fixed multiple views (using supplementary angiography) and one with rotating views. 3-D reconstructions utilizing different numbers of views were compared and evaluated in terms of contours manually traced by an experienced radiologist. The performance of the preprocesses was also evaluated, and the effects of variations in user-specified parameters on the final 3-D reconstruction results were shown to be sufficiently small. 
These experimental results demonstrate the potential usefulness of combining multiple views for 3-D recovery from \"real\" LV cineangiocardiograms", "keyphrases": ["medical diagnostic imaging", "time-varying 3-D left-ventricular shape reconstruction", "multiview X-ray cineangiocardiograms", "clinical cardiology", "two-dimensional occluding contours", "arterial septal defect", "B-spline", "computational costs", "user-interactive systems", "angiographic image sequences", "fixed multiple views", "experienced radiologist", "user-specified parameters variations"]} -{"id": "1747", "title": "On a general constitutive description for the inelastic and failure behavior of fibrous laminates. II. Laminate theory and applications", "abstract": "For pt. I see ibid., pp. 1159-76. The two papers report systematically a constitutive description for the inelastic and strength behavior of laminated composites reinforced with various fiber preforms. The constitutive relationship is established micromechanically, through layer-by-layer analysis. Namely, only the properties of the constituent fiber and matrix materials of the composites are required as input data. In the previous part lamina theory was presented. Three fundamental quantities of the laminae, i.e. the internal stresses generated in the constituent fiber and matrix materials and the instantaneous compliance matrix, with different fiber preform (including woven, braided, and knitted fabric) reinforcements were explicitly obtained by virtue of the bridging micromechanics model. In this paper, the laminate stress analysis is shown. The purpose of this analysis is to determine the load shared by each lamina in the laminate, so that the lamina theory can be applied. Incorporation of the constitutive equations into an FEM software package is illustrated. A number of application examples are given to demonstrate the efficiency of the constitutive theory. 
The predictions made include: failure envelopes of multidirectional laminates subjected to biaxial in-plane loads, thermomechanical cycling stress-strain curves of a titanium metal matrix composite laminate, S-N curves of multilayer knitted fabric reinforced laminates under tensile fatigue, and bending load-deflection plots and ultimate bending strengths of laminated braided fabric reinforced beams subjected to lateral loads", "keyphrases": ["general constitutive description", "inelastic behavior", "failure behavior", "fibrous laminates", "laminate theory", "strength behavior", "composites", "fiber preforms", "micromechanics", "layer-by-layer analysis", "internal stresses", "matrix materials", "instantaneous compliance matrix", "stress analysis", "load", "FEM software package", "failure envelopes", "multidirectional laminates", "biaxial in-plane loads", "thermomechanical cycling stress-strain curves", "titanium metal matrix composite laminate", "S-N curves", "multilayer knitted fabric reinforced laminates", "tensile fatigue", "bending load deflection plots", "ultimate bending strengths", "laminated braided fabric reinforced beams", "lateral loads"]} -{"id": "1891", "title": "On trajectory and force tracking control of constrained mobile manipulators with parameter uncertainty", "abstract": "Studies the trajectory and force tracking control problem of mobile manipulators subject to holonomic and nonholonomic constraints with unknown inertia parameters. Adaptive controllers are proposed based on a suitable reduced dynamic model, the defined reference signals and the mixed tracking errors. The proposed controllers not only ensure the entire state of the system to asymptotically converge to the desired trajectory but also ensure the constraint force to asymptotically converge to the desired force. 
A detailed numerical example is presented to illustrate the developed methods", "keyphrases": ["trajectory control", "force tracking control", "constrained mobile manipulators", "parameter uncertainty", "holonomic constraints", "nonholonomic constraints", "adaptive controllers", "reduced dynamic model", "mixed tracking errors", "asymptotic convergence", "position control", "mobile robots"]} -{"id": "19", "title": "Decentralized adaptive output feedback stabilization for a class of interconnected systems with unknown bound of uncertainties", "abstract": "The problem of adaptive decentralized stabilization for a class of linear time-invarying large-scale systems with nonlinear interconnectivity and uncertainties is discussed. The bounds of uncertainties are assumed to be unknown. For such uncertain dynamic systems, an adaptive decentralized controller is presented. The resulting closed-loop systems are asymptotically stable in theory. Moreover, an adaptive decentralized control scheme is given. The scheme ensures the closed-loop systems exponentially practically stable and can be used in practical engineering. Finally, simulations show that the control scheme is effective", "keyphrases": ["adaptive decentralized stabilization", "closed-loop systems", "uncertain dynamic systems", "robust control", "large scale systems"]} -{"id": "1909", "title": "Breast MR imaging with high spectral and spatial resolutions: preliminary experience", "abstract": "The authors evaluated magnetic resonance (MR) imaging with high spectral and spatial resolutions (HSSR) of water and fat in breasts of healthy volunteers (n=6) and women with suspicious lesions (n=6). Fat suppression, edge delineation, and image texture were improved on MR images derived from HSSR data compared with those on conventional MR images. 
HSSR MR imaging data acquired before and after contrast medium injection showed spectrally inhomogeneous changes in the water resonances in small voxels that were not detectable with conventional MR imaging", "keyphrases": ["breast magnetic resonance imaging", "high spectral spatial resolutions", "healthy volunteers", "edge delineation", "image texture", "magnetic resonance images", "magnetic resonance imaging data", "contrast medium injection", "water resonances", "small voxels", "women", "suspicious lesions", "fat suppression"]} -{"id": "1667", "title": "Combining constraint programming and linear programming on an example of bus driver scheduling", "abstract": "Provides details of a successful application where the column generation algorithm was used to combine constraint programming and linear programming. In the past, constraint programming and linear programming were considered to be two competing technologies that solved similar types of problems. Both these technologies had their strengths and weaknesses. The paper shows that the two technologies can be combined together to extract the strengths of both these technologies. Details of a real-world application to optimize bus driver duties are given. This system was developed by ILOG for a major software house in Japan using ILOG-Solver and ILOG-CPLEX, constraint programming and linear programming C/C++ libraries", "keyphrases": ["constraint programming", "linear programming", "bus driver scheduling", "column generation algorithm", "ILOG", "ILOG-Solver", "ILOG-CPLEX", "C/C++ libraries"]} -{"id": "1622", "title": "Error resilient intra refresh scheme for H.26L stream", "abstract": "Recently much attention has been focused on video streaming through IP-based networks. An error resilient RD intra macro-block refresh scheme for H.26L Internet video streaming is introduced. 
Various channel simulations have proved that this scheme is more effective than those currently adopted in H.26L", "keyphrases": ["H.26L video streaming", "Internet", "IP-based networks", "error resilient scheme", "intra macro-block refresh scheme", "channel simulations", "RD intra refresh scheme", "video communication", "RDerr scheme", "RDall scheme"]} -{"id": "176", "title": "Knowledge model reuse: therapy decision through specialisation of a generic decision model", "abstract": "We present the definition of the therapy decision task and its associated Heuristic Multi-Attribute (HM) solving method, in the form of a KADS-style specification. The goal of the therapy decision task is to identify the ideal therapy, for a given patient, in accordance with a set of objectives of a diverse nature constituting a global therapy-evaluation framework in which considerations such as patient preferences and quality-of-life results are integrated. We give a high-level overview of this task as a specialisation of the generic decision task, and additional decomposition methods for the subtasks involved. These subtasks possess some reflective capabilities for reasoning about self-models, particularly the learning subtask, which incrementally corrects and refines the model used to assess the effects of the therapies. This work illustrates the process of reuse in the framework of AI software development methodologies such as KADS-CommonKADS in order to obtain new (more specialised but still generic) components for the analysis libraries developed in this context. In order to maximise reuse benefits, where possible, the therapy decision task and HM method have been defined in terms of regular components from the earlier-mentioned libraries. 
To emphasise the importance of using a rigorous approach to the modelling of domain and method ontologies, we make extensive use of the semi-formal object-oriented analysis notation UML, together with its associated constraint language OCL, to illustrate the ontology of the decision method and the corresponding specific one of the therapy decision domain, the latter being a refinement via inheritance of the former", "keyphrases": ["knowledge model reuse", "therapy decision task", "KADS-style specification", "global therapy-evaluation framework", "patient preferences", "reasoning", "learning subtask", "software development methodologies", "CommonKADS", "ontologies", "object-oriented analysis notation", "UML", "constraint language", "OCL", "generic decision model specialisation", "Heuristic Multi-Attribute solving method"]} -{"id": "1566", "title": "A numerical C/sup 1/-shadowing result for retarded functional differential equations", "abstract": "This paper gives a numerical C/sup 1/-shadowing between the exact solutions of a functional differential equation and its numerical approximations. The shadowing result is obtained by comparing exact solutions with numerical approximation which do not share the same initial value. Behavior of stable manifolds of functional differential equations under numerics will follow from the shadowing result", "keyphrases": ["numerical C/sup 1/-shadowing", "exact solutions", "numerical approximations", "stable manifolds", "retarded functional differential equations"]} -{"id": "1523", "title": "Process specialization: defining specialization for state diagrams", "abstract": "A precise definition of specialization and inheritance promises to be as useful in organizational process modeling as it is in object modeling. It would help us better understand, maintain, reuse, and generate process models. 
However, even though object-oriented analysis and design methodologies take full advantage of the object specialization hierarchy, the process specialization hierarchy is not supported in major process representations, such as the state diagram, data flow diagram, and UML representations. Partly underlying this lack of support is an implicit assumption that we can always specialize a process by treating it as \"just another object.\" We argue in this paper that this is not so straightforward as it might seem; we argue that a process-specific approach must be developed. We propose such an approach in the form of a set of transformations which, when applied to a process description, always result in specialization. We illustrate this approach by applying it to the state diagram representation and demonstrate that this approach to process specialization is not only theoretically possible, but shows promise as a method for categorizing and analyzing processes. We point out apparent inconsistencies between our notion of process specialization and existing work on object specialization but show that these inconsistencies are superficial and that the definition we provide is compatible with the traditional notion of specialization", "keyphrases": ["process specialization", "state diagrams", "inheritance", "organizational process modeling", "object-oriented analysis", "object specialization hierarchy", "process representation", "object-oriented design"]} -{"id": "1787", "title": "The theory of information reversal", "abstract": "The end of the industrial age coincides with the advent of the information society as the next model of social and economic organization, which brings about significant changes in the way modern man conceives work and the social environment. 
The functional basis of the new model is pivoted upon the effort to formulate the theory on the violent reversal of the basic relationship between man and information, and isolate it as one of the components for the creation of the new electronic reality. The objective of the theory of reversal is to effectively contribute to the formulation of a new definition consideration in regards to the concept of the emerging information society. In order to empirically apply the theory of reversal, we examine a case study based on the example of the digital library", "keyphrases": ["information reversal theory", "information society", "industrial age", "social organization", "economic organization", "case study", "digital library", "information systems"]} -{"id": "1851", "title": "Supporting global user profiles through trusted authorities", "abstract": "Personalization generally refers to making a Web site more responsive to the unique and individual needs of each user. We argue that for personalization to work effectively, detailed and interoperable user profiles should be globally available for authorized sites, and these profiles should dynamically reflect changes in user interests. Creating user profiles from user click-stream data seems to be an effective way of generating detailed and dynamic user profiles. However, a user profile generated in this way is available only on the computer where the user accesses his browser, and is inaccessible when the same user works on a different computer. On the other hand, integration of the Internet with telecommunication networks has made it possible for the users to connect to the Web with a variety of mobile devices as well as desktops. This requires that user profiles should be available to any desktop or mobile device on the Internet that users choose to work with. In this paper, we address these problems through the concept of \"trusted authority\". 
A user agent at the client side captures the user click stream and dynamically generates a navigational history 'log' file in Extensible Markup Language (XML). This log file is then used to produce 'user profiles' in a resource description framework (RDF). A user's right to privacy is provided through the Platform for Privacy Preferences (P3P) standard. User profiles are uploaded to the trusted authority and served next time the user connects to the Web", "keyphrases": ["global user profiles", "trusted authorities", "personalization", "Web site", "Internet", "telecommunication networks", "mobile device", "user agent", "user click stream", "navigational history log file", "XML", "resource description framework", "privacy", "Platform for Privacy Preferences standard", "namespace qualifier", "globally unique user ID/password identification"]} -{"id": "1814", "title": "Control of integral processes with dead-time. 2. Quantitative analysis", "abstract": "For part 1, see ibid., p.285-90, (2002). Several different control schemes for integral processes with dead time resulted in the same disturbance response. It has already been shown that such a response is subideal. Hence, it is necessary to quantitatively analyse the achievable specifications and the robust stability regions. The control parameter can be quantitatively determined with a compromise between the disturbance response and the robustness. Four specifications: (normalised) maximum dynamic error, maximum decay rate, (normalised) control action bound and approximate recovery time are used to characterise the step-disturbance response. 
It is shown that any attempt to obtain a (normalised) dynamic error less than tau /sub m/ is impossible and a sufficient condition on the (relative) gain-uncertainty bound is square root (3)/2", "keyphrases": ["integral processes", "dead-time", "quantitative analysis", "disturbance response", "robust stability regions", "robustness", "maximum dynamic error", "maximum decay rate", "control action bound", "approximate recovery time", "step-disturbance response", "sufficient condition", "gain-uncertainty bound"]} -{"id": "1486", "title": "Hand-held digital video-camera for eye examination and follow-up", "abstract": "We developed a hand-held digital colour video-camera for eye examination in primary care. The device weighed 550 g. It featured a charge-coupled device (CCD) and corrective optics. Both colour video and digital still images could be taken. The video-camera was connected to a PC with software for database storage, image processing and telecommunication. We studied 88 normal subjects (38 male, 50 female), aged 7-62 years. It was not necessary to use mydriatic eye drops for pupillary dilation. Satisfactory digital images of the whole face and the anterior eye were obtained. The optic disc and the central part of the ocular fundus could also be recorded. Image quality of the face and the anterior eye were excellent; image quality of the optic disc and macula were good enough for tele-ophthalmology. 
Further studies are needed to evaluate the usefulness of the equipment in different clinical conditions", "keyphrases": ["hand-held digital colour video camera", "eye examination", "primary care", "charge-coupled device", "corrective optics", "digital still images", "colour video images", "PC", "software", "database storage", "image processing", "telecommunication", "normal subjects", "whole face", "anterior eye", "optic disc", "ocular fundus", "image quality", "tele-ophthalmology", "clinical conditions", "follow-up"]} -{"id": "152", "title": "Linear tense logics of increasing sets", "abstract": "We provide an extension of the language of linear tense logic with future and past connectives F and P, respectively, by a modality that quantifies over the points of some set which is assumed to increase in the course of time. In this way we obtain a general framework for modelling growth qualitatively. We develop an appropriate logical system, prove a corresponding completeness and decidability result and discuss the various kinds of flow of time in the new context. We also consider decreasing sets briefly", "keyphrases": ["linear tense logic", "future and past connectives", "logical system", "completeness", "decidability", "decreasing sets", "temporal reasoning"]} -{"id": "1542", "title": "The open-source HCS project", "abstract": "Despite the rumors, the HCS II project is not dead. In fact, HCS has been licensed and is now an open-source project. In this article, the author brings us up to speed on the HCS II project's past, present, and future. The HCS II is an expandable, standalone, network-based (RS-485), intelligent-node, industrial-oriented supervisory control (SC) system intended for demanding home control applications. 
The HCS incorporates direct and remote digital inputs and outputs, direct and remote analog inputs and outputs, real time or Boolean decision event triggering, X10 transmission and reception, infrared remote control transmission and reception, remote LCDs, and a master console. Its program is compiled on a PC with the XPRESS compiler and then downloaded to the SC where it runs independently of the PC", "keyphrases": ["HCS II", "supervisory control system", "home control", "network-based"]} -{"id": "1507", "title": "Ethnography, customers, and negotiated interactions at the airport", "abstract": "In the late 1990s, tightly coordinated airline schedules unraveled owing to massive delays resulting from inclement weather, overbooked flights, and airline operational difficulties. As schedules slipped, the delayed departures and late arrivals led to systemwide breakdowns, customers missed their connections, and airline work activities fell further out of sync. In offering possible answers, we emphasize the need to consider the customer as participant, following the human-centered computing model. Our study applied ethnographic methods to understand the airline system domain and the nature of airline delays, and it revealed the deficiencies of the airline production system model of operations. The research insights that led us to shift from a production and marketing system perspective to a customer-as-participant view might appear obvious to some readers. However, we do not know of any airline that designs its operations and technologies around any other model than the production and marketing system view. 
Our human-centered analysis used ethnographic methods to gather information, offering new insight into airline delays and suggesting effective ways to improve operations reliability", "keyphrases": ["human-centered computing model", "customer trajectories", "airports", "employees", "ethnography", "negotiated interactions", "airline delays", "airline production system operations model", "customer-as-participant view", "operations reliability"]} -{"id": "1643", "title": "Effectiveness of user testing and heuristic evaluation as a function of performance classification", "abstract": "For different levels of user performance, different types of information are processed and users will make different types of errors. Based on the error's immediate cause and the information being processed, usability problems can be classified into three categories. They are usability problems associated with skill-based, rule-based and knowledge-based levels of performance. In this paper, a user interface for a Web-based software program was evaluated with two usability evaluation methods, user testing and heuristic evaluation. The experiment discovered that the heuristic evaluation with human factor experts is more effective than user testing in identifying usability problems associated with skill-based and rule-based levels of performance. User testing is more effective than heuristic evaluation in finding usability problems associated with the knowledge-based level of performance. 
The practical application of this research is also discussed in the paper", "keyphrases": ["user testing", "heuristic evaluation", "performance classification", "user performance", "usability", "knowledge-based performance levels", "skill-based performance levels", "user interface", "Web-based software", "experiment", "human factors", "rule-based performance levels"]} -{"id": "1606", "title": "Single machine earliness-tardiness scheduling with resource-dependent release dates", "abstract": "This paper deals with the single machine earliness and tardiness scheduling problem with a common due date and resource-dependent release dates. It is assumed that the cost of resource consumption of a job is a non-increasing linear function of the job release date, and this function is common for all jobs. The objective is to find a schedule and job release dates that minimize the total resource consumption, and earliness and tardiness penalties. It is shown that the problem is NP-hard in the ordinary sense even if the due date is unrestricted (the number of jobs that can be scheduled before the due date is unrestricted). An exact dynamic programming (DP) algorithm for small and medium size problems is developed. A heuristic algorithm for large-scale problems is also proposed and the results of a computational comparison between heuristic and optimal solutions are discussed", "keyphrases": ["single machine earliness-tardiness scheduling", "resource-dependent release dates", "common due date", "job resource consumption cost", "nonincreasing linear function", "job release date", "total resource consumption minimization", "NP-hard problem", "exact dynamic programming algorithm", "medium size problems", "small size problems", "heuristic algorithm", "large-scale problems", "polynomial time algorithm"]} -{"id": "1875", "title": "The design and implementation of VAMPIRE", "abstract": "We describe VAMPIRE: a high-performance theorem prover for first-order logic. 
As our description is mostly targeted to the developers of such systems and specialists in automated reasoning, it focuses on the design of the system and some key implementation features. We also analyze the performance of the prover at CASC-JC", "keyphrases": ["VAMPIRE", "high-performance theorem prover", "first-order logic", "automated reasoning", "performance evaluation", "CASC-JC", "resolution theorem proving"]} -{"id": "1830", "title": "Approximation of pathwidth of outerplanar graphs", "abstract": "There exists a polynomial time algorithm to compute the pathwidth of outerplanar graphs, but the large exponent makes this algorithm impractical. In this paper, we give an algorithm that, given a biconnected outerplanar graph G, finds a path decomposition of G of pathwidth at most twice the pathwidth of G plus one. To obtain the result, several relations between the pathwidth of a biconnected outerplanar graph and its dual are established", "keyphrases": ["pathwidth approximation", "outerplanar graphs", "polynomial time algorithm", "biconnected outerplanar graph", "path decomposition"]} -{"id": "1888", "title": "L/sub 2/ model reduction and variance reduction", "abstract": "We examine certain variance properties of model reduction. The focus is on L/sub 2/ model reduction, but some general results are also presented. These general results can be used to analyze various other model reduction schemes. The models we study are finite impulse response (FIR) and output error (OE) models. We compare the variance of two estimated models. The first one is estimated directly from data and the other one is computed by reducing a high order model, by L/sub 2/ model reduction. In the FIR case we show that it is never better to estimate the model directly from data, compared to estimating it via L/sub 2/ model reduction of a high order FIR model. 
For OE models we show that the reduced model has the same variance as the directly estimated one if the reduced model class used contains the true system", "keyphrases": ["L/sub 2/ model reduction", "variance reduction", "finite impulse response models", "FIR models", "output error models", "identification"]} -{"id": "1462", "title": "Non-linear analysis of nearly saturated porous media: theoretical and numerical formulation", "abstract": "A formulation for a porous medium saturated with a compressible fluid undergoing large elastic and plastic deformations is presented. A consistent thermodynamic formulation is proposed for the two-phase mixture problem; thus preserving a straightforward and robust numerical scheme. A novel feature is the specification of the fluid compressibility in terms of a volumetric logarithmic strain, which is energy conjugated to the fluid pressure in the entropy inequality. As a result, the entropy inequality is used to separate three different mechanisms representing the response: effective stress response according to Terzaghi in the solid skeleton, fluid pressure response to compressibility of the fluid, and dissipative Darcy flow representing the interaction between the two phases. The paper is concluded with a couple of numerical examples that display the predictive capabilities of the proposed formulation. 
In particular, we consider results for the kinematically linear theory as compared to the kinematically non-linear theory", "keyphrases": ["nearly saturated porous media", "nonlinear analysis", "compressible fluid", "large elastic deformations", "large plastic deformations", "consistent thermodynamic formulation", "two-phase mixture problem", "robust numerical scheme", "fluid compressibility", "volumetric logarithmic strain", "fluid pressure", "entropy inequality", "effective stress response", "solid skeleton", "fluid pressure response", "dissipative Darcy flow", "predictive capabilities", "kinematically linear theory", "kinematically nonlinear theory"]} -{"id": "1726", "title": "Two-layer model for the formation of states of the hidden Markov chains", "abstract": "Procedures for the formation of states of the hidden Markov models are described. Formant amplitudes and frequencies are used as state features. The training strategy is presented that allows one to calculate the parameters of conditional probabilities of the generation of a given formant set by a given hidden state with the help of the maximum likelihood method", "keyphrases": ["hidden Markov models", "formant amplitudes", "formant frequencies", "state features", "conditional probabilities", "hidden state", "maximum likelihood method"]} -{"id": "1763", "title": "Numerical studies of 2D free surface waves with fixed bottom", "abstract": "The motion of surface waves under the effect of bottom is a very interesting and challenging phenomenon in nature. We use the boundary integral method to compute and analyze this problem. In the linear analysis, the linearized equations have bounded error increase under some compatible conditions. This contributes to the cancellation of instable Kelvin-Helmholtz terms. Under the effect of bottom, the existence of equations is hard to determine, but given some limitations it proves true. 
These limitations are that the swing of interfaces should be small enough, and the distance between surface and bottom should be large enough. In order to maintain the stability of computation, some compatible relationship must be satisfied. In the numerical examples, simulations of standing waves and breaking waves are calculated. In the case of a shallow bottom, we found that the behavior of the waves is rather singular", "keyphrases": ["numerical studies", "2D free surface waves", "boundary integral method", "linear analysis", "linearized equations", "instable Kelvin-Helmholtz terms"]} -{"id": "1848", "title": "Contracting in the days of ebusiness", "abstract": "Putting electronic business on a sound foundation (model-theoretically as well as technologically) is a central challenge for research as well as commercial development. This paper concentrates on the discovery and negotiation phase of concluding an agreement based on a contract. We present a methodology for moving seamlessly from a many-to-many relationship in the discovery phase to a one-to-one relationship in the contract negotiation phase. Making the content of contracts persistent is achieved by reconstructing contract templates by means of mereologic (logic of the whole-part relation). Possibly nested sub-structures of the contract template are taken as a basis for negotiation in a dialogical way. 
For the negotiation itself the contract templates are extended by implications (logical) and sequences (topical)", "keyphrases": ["electronic business", "discovery phase", "contracting", "many-to-many relationship", "one-to-one relationship", "contract negotiation phase", "mereologic", "contract templates", "nested sub-structure", "sequences", "implications"]} -{"id": "1910", "title": "Breast cancer: effectiveness of computer-aided diagnosis-observer study with independent database of mammograms", "abstract": "Evaluates the effectiveness of a computerized classification method as an aid to radiologists reviewing clinical mammograms for which the diagnoses were unknown to both the radiologists and the computer. Six mammographers and six community radiologists participated in an observer study. These 12 radiologists interpreted, with and without the computer aid, 110 cases that were unknown to both the 12 radiologist observers and the trained computer classification scheme. The radiologists' performances in differentiating between benign and malignant masses without and with the computer aid were evaluated with receiver operating characteristic (ROC) analysis. Two-tailed P values were calculated for the Student t test to indicate the statistical significance of the differences in performances with and without the computer aid. When the computer aid was used, the average performance of the 12 radiologists improved, as indicated by an increase in the area under the ROC curve (A/sub z/) from 0.93 to 0.96 (P<.001), by an increase in partial area under the ROC curve (/sub 0.90/A'/sub z/) from 0.56 to 0.72 (P<.001), and by an increase in sensitivity from 94% to 98% (P=.022). No statistically significant difference in specificity was found between readings with and those without computer aid ( Delta = -0.014; P=.46; 95% CI: -0.054, 0.026), where Delta is difference in specificity. 
When we analyzed results from the mammographers and community radiologists as separate groups, a larger improvement was demonstrated for the community radiologists. Computer-aided diagnosis can potentially help radiologists improve their diagnostic accuracy in the task of differentiating between benign and malignant masses seen on mammograms", "keyphrases": ["computerized classification method", "clinical mammograms", "observer study", "breast cancer", "computer-aided diagnosis", "independent database", "trained computer classification scheme", "radiologist observers", "benign masses", "malignant masses", "receiver operating characteristic analysis", "two-tailed P values", "Student t test", "statistical significance", "performances", "average performance", "receiver operating characteristic curve", "diagnostic accuracy", "computer aid", "mammographers", "community radiologists"]} -{"id": "192", "title": "New Jersey African American women writers and their publications: a study of identification from written and oral sources", "abstract": "This study examines the use of written sources, and personal interviews and informal conversations with individuals from New Jersey's religious, political, and educational community to identify African American women writers in New Jersey and their intellectual output. The focus on recognizing the community as an oral repository of history and then tapping these oral sources for collection development and acquisition purposes is supported by empirical and qualitative evidence. 
Findings indicate that written sources are so limited that information professionals must rely on oral sources to uncover local writers and their publications", "keyphrases": ["New Jersey African American women writers", "written sources", "personal interviews", "informal conversations", "intellectual output", "oral repository", "history", "collection development", "local writers", "special collections"]} -{"id": "1683", "title": "Unlocking the potential of videoconferencing", "abstract": "I propose in this paper to show, through a number of case studies, that videoconferencing is user-friendly, cost-effective, time-effective and life-enhancing for people of all ages and abilities and that it requires only a creative and imaginative approach to unlock its potential. I believe that these benefits need not, and should not, be restricted to the education sector. My examples will range from simple storytelling, through accessing international experts, professional development and distance learning in a variety of forms, to the use of videoconferencing for virtual meetings and planning sessions. In some cases, extracts from the reactions and responses of the participants will be included to illustrate the impact of the medium", "keyphrases": ["videoconferencing", "benefits", "case studies", "education"]} -{"id": "1724", "title": "A winning combination [wireless health care]", "abstract": "Three years ago, the Institute of Medicine (IOM) reported that medical errors result in at least 44,000 deaths each year-more than deaths from highway accidents, breast cancer or AIDS. That report, and others which placed serious errors as high as 98,000 annually, served as a wake-up call for healthcare providers such as the CareGroup Healthcare System Inc., a Boston-area healthcare network that is the second largest integrated delivery system in the northeastern United States. 
With annual revenues of $1.2B, CareGroup provides primary care and specialty services to more than 1,000,000 patients. CareGroup combined wireless technology with the Web to create a provider order entry (POE) system designed to reduce the frequency of costly medical mistakes. The POE infrastructure includes InterSystems Corporation's CACHE database, Dell Computer C600 laptops and Cisco Systems' Aironet 350 wireless networks", "keyphrases": ["CareGroup Healthcare System", "healthcare network", "wireless", "medical errors", "provider order entry", "InterSystems Corporation CACHE database", "Cisco Systems Aironet 350 wireless networks", "Dell Computer C600 laptops"]} -{"id": "1761", "title": "Superconvergence of discontinuous Galerkin method for nonstationary hyperbolic equation", "abstract": "For the first order nonstationary hyperbolic equation taking the piecewise linear discontinuous Galerkin solver, we prove that under the uniform rectangular partition, such a discontinuous solver, after postprocessing, can have two and a half approximative order, which is half an order higher than the optimal estimate by P. Lesaint and P. Raviart (1974) under the rectangular partition", "keyphrases": ["superconvergence of discontinuous Galerkin method", "nonstationary hyperbolic equation", "piecewise linear discontinuous Galerkin solver", "rectangular partition", "approximative order"]} -{"id": "1681", "title": "One and two facility network design revisited", "abstract": "The one facility one commodity network design problem (OFOC) with nonnegative flow costs considers the problem of sending d units of flow from a source to a destination where arc capacity is purchased in batches of C units. The two facility problem (TFOC) is similar, but capacity can be purchased either in batches of C units or one unit. Flow costs are zero. These problems are known to be NP-hard. 
We describe an exact O(n/sup 3/3/sup n/) algorithm for these problems based on the repeated use of a bipartite matching algorithm. We also present a better lower bound of Omega (n/sup 2k*/) for an earlier Omega (n/sup 2k/) algorithm described in the literature where k = [d/C] and k* = min{k, [(n - 2)/2]}. The matching algorithm is faster than this one for k >or= [(n - 2)/2]. Finally, we provide another reformulation of the problem that is quasi integral. This property could be useful in designing a modified version of the simplex method to solve the problem using a sequence of pivots with integer extreme solutions, referred to as the integral simplex method in the literature", "keyphrases": ["one facility one commodity network design problem", "two facility network design", "nonnegative flow costs", "flow costs", "NP-hard problems", "exact algorithm", "bipartite matching algorithm", "lower bound", "quasi integral", "pivots", "integral simplex method"]} -{"id": "1538", "title": "A heuristic approach to resource locations in broadband networks", "abstract": "In broadband networks, such as ATM, the importance of dynamic migration of data resources is increasing because of its potential to improve performance especially for transaction processing. In environments with migratory data resources, it is necessary to have mechanisms to manage the locations of each data resource. In this paper, we present an algorithm that makes use of system state information and heuristics to manage locations of data resources in a distributed network. In the proposed algorithm, each site maintains information about state of other sites with respect to each data resource of the system and uses it to find: (1) a subset of sites likely to have the requested data resource; and (2) the site where the data resource is to be migrated from the current site. The proposed algorithm enhances its effectiveness by continuously updating system state information stored at each site. 
It focuses on reducing the overall average time delay needed by the transaction requests to locate and access the migratory data resources. We evaluated the performance of the proposed algorithm and also compared it with one of the existing location management algorithms, by simulation studies under several system parameters such as the frequency of requests generation, frequency of data resource migrations, network topology and scale of network. The experimental results show the effectiveness of the proposed algorithm in all cases", "keyphrases": ["broadband networks", "ATM", "resource locations", "heuristics", "distributed network", "data resource migrations", "network topology"]} -{"id": "1912", "title": "A novel preterm respiratory mechanics active simulator to test the performances of neonatal pulmonary ventilators", "abstract": "A patient active simulator is proposed which is capable of reproducing values of the parameters of pulmonary mechanics of healthy newborns and preterm pathological infants. The implemented prototype is able to: (a) let the operator choose the respiratory pattern, times of apnea, episodes of cough, sobs, etc., (b) continuously regulate and control the parameters characterizing the pulmonary system; and, finally, (c) reproduce the attempt of breathing of a preterm infant. Taking into account both the limitation due to the chosen application field and the preliminary autocalibration phase automatically carried out by the proposed device, accuracy and reliability on the order of 1% is estimated. The previously indicated value has to be considered satisfactory in light of the field of application and the small values of the simulated parameters. 
Finally, the achieved metrological characteristics allow the described neonatal simulator to be adopted as a reference device to test performances of neonatal ventilators and, more specifically, to measure the time elapsed between the occurrence of a potentially dangerous condition to the patient and the activation of the corresponding alarm of the tested ventilator", "keyphrases": ["preterm respiratory mechanics active simulator", "neonatal pulmonary ventilators", "patient active simulator", "healthy newborns", "preterm pathological infants", "apnea times", "autocalibration phase", "accuracy", "reliability", "respiratory diseases", "ventilatory support", "intensive care equipment", "electronic unit", "pneumatic/mechanical unit", "software control", "double compartment model", "artificial trachea", "pressure transducer", "variable clamp resistance", "upper airway resistance", "compliance"]} -{"id": "190", "title": "On the design of gain-scheduled trajectory tracking controllers [AUV application]", "abstract": "A new methodology is proposed for the design of trajectory tracking controllers for autonomous vehicles. The design technique builds on gain scheduling control theory. An application is made to the design of a trajectory tracking controller for a prototype autonomous underwater vehicle (AUV). The effectiveness and advantages of the new control laws derived are illustrated in simulation using a full set of non-linear equations of motion of the vehicle", "keyphrases": ["gain-scheduled trajectory tracking controller design", "autonomous vehicles", "gain scheduling control theory", "autonomous underwater vehicle", "control laws", "nonlinear equations of motion"]} -{"id": "1639", "title": "New hub gears up for algorithmic exchange", "abstract": "Warwick University in the UK is on the up and up. 
Sometimes considered a typical 1960s, middle-of-the-road redbrick institution (not known for their distinction), the 2001 UK Research Assessment Exercise (RAE) shows its research to be the fifth most highly-rated in the country, with outstanding standards in the sciences. This impressive performance has rightly given Warwick a certain amount of muscle, which it is flexing rather effectively, aided by a snappy approach to making things happen that leaves some older institutions standing. The result is a brand new Centre for Scientific Computing (CSC), launched within a couple of years of its initial conception", "keyphrases": ["Warwick University Centre for Scientific Computing"]} -{"id": "1641", "title": "Development through gaming", "abstract": "Mainstream observers commonly underestimate the role of fringe activities in propelling science and technology. Well-known examples are how wars have fostered innovation in areas such as communications, cryptography, medicine and aerospace; and how erotica has been a major factor in pioneering visual media, from the first printed books to photography, cinematography, videotape, or the latest online video streaming. The article aims to be a sampler of a less controversial, but still often underrated, symbiosis between scientific computing and computing for leisure and entertainment", "keyphrases": ["computer games", "scientific computing", "leisure", "entertainment", "graphics"]} -{"id": "1604", "title": "Improving supply-chain performance by sharing advance demand information", "abstract": "In this paper, we analyze how sharing advance demand information (ADI) can improve supply-chain performance. We consider two types of ADI, aggregated ADI (A-ADI) and detailed ADI (D-ADI). 
With A-ADI, customers share with manufacturers information about whether they will place an order for some product in the next time period, but do not share information about which product they will order and which of several potential manufacturers will receive the order. With D-ADI, customers additionally share information about which product they will order, but which manufacturer will receive the order remains uncertain. We develop and solve mathematical models of supply chains where ADI is shared. We derive exact expressions and closed-form approximations for expected costs, expected base-stock levels, and variations of the production quantities. We show that both the manufacturer and the customers benefit from sharing ADI, but that sharing ADI increases the bullwhip effect. We also show that under certain conditions it is optimal to collect ADI from either none or all of the customers. We study two supply chains in detail: a supply chain with an arbitrary number of products that have identical demand rates, and a supply chain with two products that have arbitrary demand rates. For these two supply chains, we analyze how the values of A-ADI and D-ADI depend on the characteristics of the supply chain and on the quality of the shared information, and we identify conditions under which sharing A-ADI and D-ADI can significantly reduce cost. 
Our results can be used by decision makers to analyze the cost savings that can be achieved by sharing ADI and help them to determine if sharing ADI is beneficial for their supply chains", "keyphrases": ["supply-chain performance improvement", "advance demand information", "aggregated ADI", "detailed ADI", "information sharing", "manufacturing", "mathematical models", "closed-form approximations", "expected costs", "expected base-stock levels", "production quantity variations", "bullwhip effect", "arbitrary product number", "identical demand rates", "arbitrary demand rates", "shared information quality", "decision makers", "cost savings", "forecasting"]} -{"id": "150", "title": "Model checking games for branching time logics", "abstract": "This paper defines and examines model checking games for the branching time temporal logic CTL*. The games employ a technique called focus which enriches sets by picking out one distinguished element. This is necessary to avoid ambiguities in the regeneration of temporal operators. The correctness of these games is proved, and optimizations are considered to obtain model checking games for important fragments of CTL*. A game based model checking algorithm that matches the known lower and upper complexity bounds is sketched", "keyphrases": ["model checking games", "branching time logics", "temporal logic", "temporal operators", "complexity bounds"]} -{"id": "1540", "title": "Adaptive thinning for bivariate scattered data", "abstract": "This paper studies adaptive thinning strategies for approximating a large set of scattered data by piecewise linear functions over triangulated subsets. Our strategies depend on both the locations of the data points in the plane, and the values of the sampled function at these points - adaptive thinning. 
All our thinning strategies remove data points one by one, so as to minimize an estimate of the error that results by the removal of a point from the current set of points (this estimate is termed \"anticipated error\"). The thinning process generates subsets of \"most significant\" points, such that the piecewise linear interpolants over the Delaunay triangulations of these subsets approximate progressively the function values sampled at the original scattered points, and such that the approximation errors are small relative to the number of points in the subsets. We design various methods for computing the anticipated error at reasonable cost, and compare and test the performance of the methods. It is proved that for data sampled from a convex function, with the strategy of convex triangulation, the actual error is minimized by minimizing the best performing measure of anticipated error. It is also shown that for data sampled from certain quadratic polynomials, adaptive thinning is equivalent to thinning which depends only on the locations of the data points - nonadaptive thinning. Based on our numerical tests and comparisons, two practical adaptive thinning algorithms are proposed for thinning large data sets, one which is more accurate and another which is faster", "keyphrases": ["adaptive thinning", "scattered data", "piecewise linear functions", "triangulated subsets", "error", "Delaunay triangulations", "convex function"]} -{"id": "1505", "title": "Modeling and simulating practices, a work method for work systems design", "abstract": "Work systems involve people engaging in activities over time-not just with each other, but also with machines, tools, documents, and other artifacts. These activities often produce goods, services, or-as is the case in the work system described in this article-scientific data. Work systems and work practice evolve slowly over time. 
The integration and use of technology, the distribution and collocation of people, organizational roles and procedures, and the facilities where the work occurs largely determine this evolution", "keyphrases": ["work practice simulation", "work practice modeling", "work system design method", "complex system interactions", "human activity", "communication", "collaboration", "teamwork", "tool usage", "workspace usage", "problem solving", "learning behavior"]} -{"id": "1877", "title": "Strong completeness of lattice-valued logic", "abstract": "This paper shows strong completeness of the system L for lattice valued logic given by S. Titani (1999), in which she formulates a lattice-valued set theory by introducing the logical implication which represents the order relation on the lattice. Syntax and semantics concerned are described and strong completeness is proved", "keyphrases": ["strong completeness", "lattice-valued set theory", "order relation", "semantics", "syntax", "lattice-valued logic"]} -{"id": "1832", "title": "A linear time algorithm for recognizing regular Boolean functions", "abstract": "A positive (or monotone) Boolean function is regular if its variables are naturally ordered, left to right, by decreasing strength, so that shifting the nonzero component of any true vector to the left always yields another true vector. This paper considers the problem of recognizing whether a positive function f is regular, where f is given by min T(f) (the set of all minimal true vectors of f). We propose a simple linear time (i.e., O(n|min T(f)|)-time) algorithm for it. This improves upon the previous algorithm by J.S. Provan and M.O. Ball (1988) which requires O(n/sup 2/|min T(f)|) time. 
As a corollary, we also present an O(n(n+|min T(f)|))-time algorithm for the recognition problem of 2-monotonic functions", "keyphrases": ["linear time algorithm", "regular Boolean functions", "monotone Boolean function", "nonzero component", "true vector", "positive function", "2-monotonic functions"]} -{"id": "1719", "title": "The UPS as network management tool", "abstract": "Uninterrupted power supplies (UPS), or battery backup systems, once provided a relatively limited, although important, function-continual battery support to connected equipment in the event of a power failure. However, yesterday's \"battery in a box\" has evolved into a sophisticated network power management tool that can monitor and actively correct many of the problems that might plague a healthy network. This new breed of UPS system provides such features as automatic voltage regulation, generous runtimes and unattended system shutdown, and now also monitors and automatically restarts critical services and operating systems if they lock up or otherwise fail", "keyphrases": ["uninterrupted power supplies", "network power management", "unattended system shutdown", "automatic voltage regulation"]} -{"id": "174", "title": "The BIOGENES system for knowledge-based bioprocess control", "abstract": "The application of knowledge-based control systems in the area of biotechnological processes has become increasingly popular over the past decade. This paper outlines the structure of the advanced knowledge-based part of the BIOGENES Copyright control system for the control of bioprocesses such as the fed-batch Saccharomyces cerevisiae cultivation. First, a brief overview of all the tasks implemented in the knowledge-based level including process data classification, qualitative process state identification and supervisory process control is given. 
The procedures performing the on-line identification of metabolic states and supervisory process control (setpoint calculation and control strategy selection) are described in more detail. Finally, the performance of the system is discussed using results obtained from a number of experimental cultivation runs in a laboratory unit", "keyphrases": ["BIOGENES system", "knowledge-based bioprocess control", "biotechnological processes", "fed-batch Saccharomyces cerevisiae cultivation", "process data classification", "qualitative process state identification", "supervisory process control", "online identification", "metabolic states", "experiment"]} -{"id": "1564", "title": "Asymptotic normality for the K/sub phi /-divergence goodness-of-fit tests", "abstract": "In this paper for a wide class of goodness-of-fit statistics based K/sub phi /-divergences, the asymptotic normality is established under the assumption n/m/sub n/ to a in (0, infinity ), where n denotes sample size and m/sub n/ the number of cells. This result is extended to contiguous alternatives to study asymptotic efficiency", "keyphrases": ["asymptotic normality", "asymptotic efficiency", "K/sub phi /-divergence goodness-of-fit tests"]} -{"id": "1521", "title": "Optimal multi-degree reduction of Bezier curves with constraints of endpoints continuity", "abstract": "Given a Bezier curve of degree n, the problem of optimal multi-degree reduction (degree reduction of more than one degree) by a Bezier curve of degree m (mor=0) orders can be preserved at two endpoints respectively. The method in the paper performs multi-degree reduction at one time and does not need stepwise computing. When applied to multi-degree reduction with endpoint continuity of any order, the MDR by L/sub 2/ obtains the best least squares approximation. 
Comparison with another method of multi-degree reduction (MDR by L/sub infinity /), which achieves the nearly best uniform approximation with respect to L/sub infinity / norm, is also given. The approximate effect of the MDR by L/sub 2/ is better than that of the MDR by L/sub infinity /. Explicit approximate error analysis of the multi-degree reduction methods is presented", "keyphrases": ["optimal multi-degree reduction", "Bezier curves", "endpoint continuity constraints", "approximate method", "explicit solution", "endpoint interpolation", "least squares approximation", "uniform approximation", "explicit approximate error analysis"]} -{"id": "1698", "title": "Exact frequency-domain reconstruction for thermoacoustic tomography. I. Planar geometry", "abstract": "We report an exact and fast Fourier-domain reconstruction algorithm for thermoacoustic tomography in a planar configuration assuming thermal confinement and constant acoustic speed. The effects of the finite size of the detector and the finite length of the excitation pulse are explicitly included in the reconstruction algorithm. The algorithm is numerically and experimentally verified. 
We also demonstrate that the blurring caused by the finite size of the detector surface is the primary limiting factor on the resolution and that it can be compensated for by deconvolution", "keyphrases": ["medical diagnostic imaging", "exact frequency-domain reconstruction", "planar configuration", "thermal confinement", "constant acoustic speed", "blurring", "finite detector surface size", "primary limiting factor", "deconvolution", "resolution limitation", "excitation pulse", "reconstruction algorithm", "thermoacoustic tomography", "planar geometry"]} -{"id": "1665", "title": "How airlines and airports recover from schedule perturbations: a survey", "abstract": "The explosive growth in air traffic as well as the widespread adoption of Operations Research techniques in airline scheduling has given rise to tight flight schedules at major airports. An undesirable consequence of this is that a minor incident such as a delay in the arrival of a small number of flights can result in a chain reaction of events involving several flights and airports, causing disruption throughout the system. This paper reviews recent literature in the area of recovery from schedule disruptions. First we review how disturbances at a given airport could be handled, including the effects of runways and fixes. Then we study the papers on recovery from airline schedule perturbations, which involve adjustments in flight schedules, aircraft, and crew. The mathematical programming techniques used in ground holding are covered in some detail. 
We conclude the review with suggestions on how singular perturbation theory could play a role in analyzing disruptions to such highly sensitive schedules as those in the civil aviation industry", "keyphrases": ["air traffic management", "schedule perturbation", "operations research techniques", "airline scheduling", "tight flight schedules", "airports", "schedule disruptions", "recovery", "disturbance handling", "runways", "flight schedule adjustments", "aircraft adjustments", "crew adjustments", "mathematical programming techniques", "ground holding", "singular perturbation theory", "civil aviation industry"]} -{"id": "1620", "title": "Rapid Cauer filter design employing new filter model", "abstract": "The exact three-dimensional (3D) design of a coaxial Cauer filter employing a new filter model, a 3D field simulator and a circuit simulator, is demonstrated. Only a few iterations between the field simulator and the circuit simulator are necessary to meet a given specification", "keyphrases": ["Cauer filter", "filter design", "filter model", "3D design", "coaxial filter", "field simulator", "circuit simulator", "iterations", "bandpass filters"]} -{"id": "1599", "title": "Evaluating the best main battle tank using fuzzy decision theory with linguistic criteria evaluation", "abstract": "In this paper, experts' opinions are described in linguistic terms which can be expressed in trapezoidal (or triangular) fuzzy numbers. To make the consensus of the experts consistent, we utilize the fuzzy Delphi method to adjust the fuzzy rating of every expert to achieve the consensus condition. For the aggregate of many experts' opinions, we take the operation of fuzzy numbers to get the mean of fuzzy rating, x/sub ij/ and the mean of weight, w/sub .j/. In multi-alternatives and multi-attributes cases, the fuzzy decision matrix X=[x/sub ij/]/sub m*n/ is constructed by means of the fuzzy rating, x/sub ij/. 
Then, we can derive the aggregate fuzzy numbers by multiplying the fuzzy decision matrix with the corresponding fuzzy attribute weights. The final results become a problem of ranking fuzzy numbers. We also propose an easy procedure of using fuzzy numbers to rank aggregate fuzzy numbers A/sub i/. In this way, we can obtain the best selection for evaluating the system. For practical application, we propose an algorithm for evaluating the best main battle tank by fuzzy decision theory and comparing it with other methods", "keyphrases": ["battle tank evaluation", "fuzzy group decision making", "fuzzy decision theory", "linguistic criteria evaluation", "multiple criteria problems", "group decision making", "subjective-objective backgrounds", "trapezoidal fuzzy numbers", "triangular fuzzy numbers", "fuzzy Delphi method", "fuzzy rating", "consensus condition", "fuzzy number ranking", "fuzzy decision matrix", "aggregate fuzzy numbers", "fuzzy attribute weights"]} -{"id": "189", "title": "Identification of linear parameter varying models", "abstract": "We consider identification of a certain class of discrete-time nonlinear systems known as linear parameter varying system. We assume that inputs, outputs and the scheduling parameters are directly measured, and a form of the functional dependence of the system coefficients on the parameters is known. We show how this identification problem can be reduced to a linear regression, and provide compact formulae for the corresponding least mean square and recursive least-squares algorithms. We derive conditions on persistency of excitation in terms of the inputs and scheduling parameter trajectories when the functional dependence is of polynomial type. These conditions have a natural polynomial interpolation interpretation, and do not require the scheduling parameter trajectories to vary slowly. 
This method is illustrated with a simulation example using two different parameter trajectories", "keyphrases": ["linear parameter varying models", "identification", "discrete-time nonlinear systems", "scheduling parameters", "functional dependence", "system coefficients", "linear regression", "least mean square algorithms", "recursive least-squares algorithms", "persistency of excitation conditions", "scheduling parameter trajectories", "polynomial interpolation interpretation", "parameter trajectories", "time-varying systems"]} -{"id": "1778", "title": "HeLIN pilot mentoring scheme", "abstract": "The health care libraries unit coordinates, facilitates, and promotes continuing personal development for all staff in the Health Libraries and Information Network (HeLIN) of the Oxford Deanery (UK). It supports the development of a culture of lifelong learning and recognizes that CPD should help deliver organizational objectives, as well as enabling all staff to expand and fulfill their potential. A major emphasis for 2000 was to investigate ways of improving support for individual learning within the workplace. The group identified a need to build on existing informal support networks in order to provide additional learning opportunities and decided to investigate the feasibility of piloting a mentoring scheme. The objectives of the pilot were to increase understanding and knowledge of mentoring as a tool for CPD; to investigate existing mentoring schemes and their applicability for HeLIN; to develop a pilot mentoring scheme for HeLIN incorporating a program for accreditation of mentors; and to evaluate the scheme and disseminate the results. In order to identify current practice in this area, a literature review was carried out, and colleagues with an interest in or existing knowledge of mentoring schemes were contacted where possible. 
In the absence of clearly defined appraisal tools, all abstracts were read, and articles that met the following criteria were obtained and distributed to the group for review", "keyphrases": ["HeLIN pilot mentoring scheme", "health care libraries unit", "continuing personal development", "staff", "Health Libraries and Information Network", "lifelong learning", "informal support networks", "accreditation", "literature review", "midcareer librarians"]} -{"id": "1853", "title": "CherylAnn Silberer: all about process [accounting technologist]", "abstract": "Silberer's company, CompLete, is making a specialty of workflow process analysis", "keyphrases": ["CompLete", "workflow process analysis", "accounting technologist"]} -{"id": "1816", "title": "Hamiltonian modelling and nonlinear disturbance attenuation control of TCSC for improving power system stability", "abstract": "To tackle the obstacle of applying passivity-based control (PBC) to power systems, an affine non-linear system widely existing in power systems is formulated as a standard Hamiltonian system using a pre-feedback method. The port controlled Hamiltonian with dissipation (PCHD) model of a thyristor controlled serial compensator (TCSC) is then established corresponding with a revised Hamiltonian function. Furthermore, employing the modified Hamiltonian function directly as the storage function, a non-linear adaptive L/sub 2/ gain control method is proposed to solve the problem of L/sub 2/ gain disturbance attenuation for this Hamiltonian system with parametric perturbations. 
Finally, simulation results are presented to verify the validity of the proposed controller", "keyphrases": ["Hamiltonian modelling", "thyristor controlled serial compensator", "nonlinear disturbance attenuation control", "power system stability", "passivity-based control", "affine nonlinear system", "pre-feedback method", "port controlled Hamiltonian with dissipation model", "Hamiltonian function", "storage function", "nonlinear adaptive L/sub 2/ gain control method", "parametric perturbations"]} -{"id": "1484", "title": "Portfolio optimization and the random magnet problem", "abstract": "Diversification of an investment into independently fluctuating assets reduces its risk. In reality, movements of assets are mutually correlated and therefore knowledge of cross-correlations among asset price movements are of great importance. Our results support the possibility that the problem of finding an investment in stocks which exposes invested funds to a minimum level of risk is analogous to the problem of finding the magnetization of a random magnet. The interactions for this \"random magnet problem\" are given by the cross-correlation matrix C of stock returns. We find that random matrix theory allows us to make an estimate for C which outperforms the standard estimate in terms of constructing an investment which carries a minimum level of risk", "keyphrases": ["portfolio optimization", "fluctuating assets", "cross-correlations", "price movements", "investment", "stocks", "invested funds", "magnetization", "cross-correlation matrix", "minimum risk level", "spin glasses", "random magnet problem"]} -{"id": "1479", "title": "Agreeing with automated diagnostic aids: a study of users' concurrence strategies", "abstract": "Automated diagnostic aids that are less than perfectly reliable often produce unwarranted levels of disuse by operators. 
In the present study, users' tendencies to either agree or disagree with automated diagnostic aids were examined under conditions in which: (1) the aids were less than perfectly reliable but aided-diagnosis was still more accurate than unaided diagnosis; and (2) the system was completely opaque, affording users no additional information upon which to base a diagnosis. The results revealed that some users adopted a strategy of always agreeing with the aids, thereby maximizing the number of correct diagnoses made over several trials. Other users, however, adopted a probability-matching strategy in which agreement and disagreement rates matched the rate of correct and incorrect diagnoses of the aids. The probability-matching strategy, therefore, resulted in diagnostic accuracy scores that were lower than was maximally possible. Users who adopted the maximization strategy had higher self-ratings of problem-solving and decision-making skills, were more accurate in estimating aid reliabilities, and were more confident in their diagnosis on trials in which they agreed with the aids. The potential applications of these findings include the design of interface and training solutions that facilitate the adoption of the most effective concurrence strategies by users of automated diagnostic aids", "keyphrases": ["automated diagnostic aids", "user concurrence strategy", "probability-matching", "disagreement rates", "maximization", "problem-solving", "reliability", "complex systems", "fault diagnosis"]} -{"id": "1785", "title": "The effect of a male-oriented computer gaming culture on careers in the computer industry", "abstract": "If careers in the computer industry were viewed, it would be evident that there is a conspicuous gender gap between the number of male and female employees. The same gap can be observed at the college level where males are dominating females as to those who pursue and obtain a degree in computer science. 
The question that this research paper intends to answer is: why are males so dominant when it comes to computer related matters? The author has traced this question back to the computer game. Computer games are a fun medium and provide the means for an individual to become computer literate through the engagement of spatial learning and cognitive processing abilities. Since such games are marketed almost exclusively to males, females have a distinct disadvantage. Males are more computer literate through the playing of computer games, and are provided with an easy lead-in to more advanced utilization of computers such as programming. Females tend to be turned off due to the male stereotypes and marketing associated with games and thus begins the gender gap", "keyphrases": ["careers", "computer industry", "gender gap", "computer science degree", "computer games", "computer literacy", "female employees", "spatial learning", "cognitive processing", "male stereotypes", "marketing"]} -{"id": "1893", "title": "Closed-loop model set validation under a stochastic framework", "abstract": "Deals with probabilistic model set validation. It is assumed that the dynamics of a multi-input multi-output (MIMO) plant is described by a model set with unstructured uncertainties, and identification experiments are performed in closed loop. A necessary and sufficient condition has been derived for the consistency of the model set with both the stabilizing controller and closed-loop frequency domain experimental data (FDED). In this condition, only the Euclidean norm of a complex vector is involved, and this complex vector depends linearly on both the disturbances and the measurement errors. Based on this condition, an analytic formula has been derived for the sample unfalsified probability (SUP) of the model set. Some of the asymptotic statistical properties of the SUP have also been briefly discussed. 
A numerical example is included to illustrate the efficiency of the suggested method in model set quality evaluation", "keyphrases": ["closed-loop model set validation", "stochastic framework", "probabilistic model set validation", "multi-input multi-output plant", "MIMO plant", "unstructured uncertainties", "necessary and sufficient condition", "stabilizing controller", "closed-loop frequency domain experimental data", "Euclidean norm", "complex vector", "asymptotic statistical properties", "robust control", "unstructured uncertainty"]} -{"id": "1700", "title": "Computation of unmeasured third-generation VCT views from measured views", "abstract": "We compute unmeasured cone-beam projections from projections measured by a third-generation helical volumetric computed tomography system by solving a characteristic problem for an ultrahyperbolic differential equation [John (1938)]. By working in the Fourier domain, we convert the second-order PDE into a family of first-order ordinary differential equations. A simple first-order integration is used to solve the ODES", "keyphrases": ["unmeasured third-generation VCT views computation", "measured views", "cone-beam projections", "characteristic problem solution", "ultrahyperbolic differential equation", "Fourier domain", "first-order ordinary differential equations", "simple first-order integration", "medical diagnostic imaging", "range conditions", "third-generation helical volumetric computed tomography system"]} -{"id": "1745", "title": "Approximate relaxed descent method for optimal control problems", "abstract": "We consider an optimal control problem for systems governed by ordinary differential equations with control constraints. Since no convexity assumptions are made on the data, the problem is reformulated in relaxed form. The relaxed state equation is discretized by the implicit trapezoidal scheme and the relaxed controls are approximated by piecewise constant relaxed controls. 
We then propose a combined descent and discretization method that generates sequences of discrete relaxed controls and progressively refines the discretization. Since here the adjoint of the discrete state equation is not defined, we use, at each iteration, an approximate derivative of the cost functional defined by discretizing the continuous adjoint equation and the integral involved by appropriate trapezoidal schemes. It is proved that accumulation points of sequences constructed by this method satisfy the strong relaxed necessary conditions for optimality for the continuous problem. Finally, the computed relaxed controls can be easily approximated by piecewise constant classical controls", "keyphrases": ["approximate relaxed descent method", "optimal control problems", "ordinary differential equations", "relaxed state equation discretization", "implicit trapezoidal scheme", "piecewise constant relaxed controls", "relaxed control approximation", "discrete relaxed control sequences", "discretization refinement", "discrete state equation", "cost functional approximate derivative", "trapezoidal schemes"]} -{"id": "1658", "title": "Chaos theory as a framework for studying information systems", "abstract": "This paper introduces chaos theory as a means of studying information systems. It argues that chaos theory, combined with new techniques for discovering patterns in complex quantitative and qualitative evidence, offers a potentially more substantive approach to understand the nature of information systems in a variety of contexts. The paper introduces chaos theory concepts by way of an illustrative research design", "keyphrases": ["chaos theory", "information systems", "pattern discovery", "complex quantitative evidence", "qualitative evidence"]} -{"id": "149", "title": "Extending Kamp's theorem to model time granularity", "abstract": "In this paper, a generalization of Kamp's theorem relative to the functional completeness of the until operator is proved. 
Such a generalization consists in showing the functional completeness of more expressive temporal operators with respect to the extension of the first-order theory of linear orders MFO[<] with an extra binary relational symbol. The result is motivated by the search of a modal language capable of expressing properties and operators suitable to model time granularity in omega -layered temporal structures", "keyphrases": ["Kamp's theorem", "functional completeness", "until operator", "temporal operators", "first-order theory", "linear orders", "binary relational symbol", "omega -layered temporal structures", "model time granularity"]} -{"id": "1559", "title": "A comparison theorem for the iterative method with the preconditioner (I + S/sub max/)", "abstract": "A.D. Gunawardena et al. (1991) have reported the modified Gauss-Seidel method with a preconditioner (I + S). In this article, we propose to use a preconditioner (I + S/sub max/) instead of (I + S). Here, S/sub max/ is constructed by only the largest element at each row of the upper triangular part of A. By using the lemma established by M. Neumann and R.J. Plemmons (1987), we get the comparison theorem for the proposed method. Simple numerical examples are also given", "keyphrases": ["iterative method", "preconditioner", "modified Gauss-Seidel method", "comparison theorem"]} -{"id": "1781", "title": "Making it to the major leagues: career movement between library and archival professions and from small college to large university libraries", "abstract": "Issues of career movement and change are examined between library and archival fields and from small colleges to large universities. Issues examined include professional education and training, initial career-planning and placement, continuing education, scouting and mentoring, job market conditions, work experience and personal skills, professional involvement, and professional association self-interest. This examination leads to five observations: 1. 
It is easier, in terms of career transitions, for a librarian to become an archivist than it is for an archivist to become a librarian; 2. The progression from a small college venue to a large research university is very manageable with the proper planning and experience; 3. At least three of the career elements-professional education, career-planning, and professional association self-interest-in their best moments provide a foundation that enables a future consideration of change between institutional types and professional areas and in their worst moments conspire against the midcareer professional in terms of change; 4. The elements of scouting, continuing education, work experience, and professional involvement offer the greatest assistance in career transitions; 5. The job market is the wildcard that either stymies or stimulates occupational development", "keyphrases": ["career movement", "library profession", "archival profession", "small college library", "large university libraries", "professional education", "training", "continuing education", "job market", "work experience", "personal skills", "librarian", "occupational development", "midcareer"]} -{"id": "1739", "title": "Application of normal possibility decision rule to silence", "abstract": "The paper presents the way of combining two decision problems concerning a single (or a common) dimension, so that an effective fuzzy decision rule can be obtained. Normality of the possibility distribution is assumed, leading to possibility of fusing the respective functions related to the two decision problems and their characteristics (decisions, states of nature, utility functions, etc.). 
The approach proposed can be applied in cases when the statement of the problem requires making more refined distinctions rather than considering simply a bi-criterion or bi-utility two-decision problem", "keyphrases": ["normal possibility decision rule", "silence", "conflicting objectives", "conflicting utilities", "cool head", "warm heart", "decision problems", "two-dimensional fuzzy events"]} -{"id": "1812", "title": "Computing the frequency response of systems affinely depending on uncertain parameters", "abstract": "The computation of the frequency response of systems depending affinely on uncertain parameters can be reduced to that of all its one-dimensional edge plants while the image of such an edge plant at a fixed frequency is an arc or a line segment in the complex plane. Based on this conclusion, four computational formulas of the maximal and minimal (maxi-mini) magnitudes and phases of an edge plant at a fixed frequency are given. The formulas, besides sharing a simpler form of expression, concretely display how the extrema of the frequency response of the edge plant relate to the typical characteristics of the arc and line segment such as the centre, radius and tangent points of the arc, the distance from the origin to the line segment etc. The direct application of the results is to compute the Bode-, Nichols- and Nyquist-plot collections of the systems which are needed in robustness analysis and design", "keyphrases": ["frequency response", "uncertain parameters", "affine systems", "one-dimensional edge plants", "arc", "line segment", "Bode-plot", "Nichols-plot", "Nyquist-plot", "robustness analysis", "robustness design", "frequency-domain design methods"]} -{"id": "1480", "title": "Formal verification of human-automation interaction", "abstract": "This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. 
It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training materials (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces", "keyphrases": ["formal verification", "human-automation interaction", "man-machine interaction", "automated control systems", "user interface", "autopilot", "commercial aircraft"]} -{"id": "1661", "title": "The road to perpetual progress [retail inventory management]", "abstract": "With annual revenues increasing 17.0% to 20.0% consistently over the last three years and more than 2,500 new stores opened from 1998 through 2001, Dollar General is on the fast track. However, the road to riches could have easily become the road to ruin had the retailer not exerted control over its inventory management", "keyphrases": ["Dollar General", "retailer", "inventory management"]} -{"id": "1624", "title": "Genetic algorithm for input/output selection in MIMO systems based on controllability and observability indices", "abstract": "A time domain optimisation algorithm using a genetic algorithm in conjunction with a linear search scheme has been developed to find the smallest or near-smallest subset of inputs and outputs to control a multi-input-multi-output system. 
Experimental results have shown that this proposed algorithm has a very fast convergence rate and high computation efficiency", "keyphrases": ["genetic algorithm", "input/output selection", "MIMO systems", "controllability indices", "observability indices", "time domain optimisation algorithm", "linear search scheme", "near-smallest subset", "smallest subset", "multi-input-multi-output system", "very fast convergence", "high computation efficiency", "multivariable control systems"]} -{"id": "170", "title": "The impact of the product mix on the value of flexibility", "abstract": "Product-mix flexibility is one of the major types of manufacturing flexibility, referring to the ability to produce a broad range of products or variants with presumed low changeover costs. The value of such a capability is important to establish for an industrial firm in order to ensure that the flexibility provided will be at the right level and used profitably rather than in excess of market requirements and consequently costly. We use option-pricing theory to analyse the impact of various product-mix issues on the value of flexibility. The real options model we use incorporates multiple products, capacity constraints as well as set-up costs. The issues treated here include the number of products, demand variability, correlation between products, and the relative demand distribution within the product mix. Thus, we are interested in the nature of the input data to analyse its effect on the value of flexibility. We also check the impact at different capacity levels. The results suggest that the value of flexibility (i) increases with an increasing number of products, (ii) decreases with increasing volatility of product demand, (iii) decreases the more positively correlated the demand is, and (iv) reduces for marginal capacity with increasing levels of capacity. Of these, the impact of positively correlated demand seems to be a major issue. 
However, the joint impact of the number of products and demand correlation showed some non-intuitive results", "keyphrases": ["product-mix flexibility", "flexible manufacturing", "manufacturing flexibility", "low changeover costs", "industrial firm", "option-pricing theory", "real options model", "multiple products", "capacity constraints", "set-up costs", "demand variability", "product correlation", "relative demand distribution", "product demand volatility", "marginal capacity", "positively correlated demand", "demand correlation", "capital budgeting"]} -{"id": "1560", "title": "Determinantal solutions of solvable chaotic systems", "abstract": "It is shown that two solvable chaotic systems, the arithmetic-harmonic mean (ARM) algorithm and the Ulam-von Neumann (UvN) map, have determinantal solutions. An additional formula for certain determinants and Riccati difference equations play a key role in both cases. Two infinite hierarchies of solvable chaotic systems are presented which have determinantal solutions", "keyphrases": ["determinantal solutions", "arithmetic-harmonic mean algorithm", "solvable chaotic systems", "Ulam-von Neumann map", "determinants", "Riccati difference equations", "Chebyshev polynomial"]} -{"id": "1525", "title": "Dependence graphs: dependence within and between groups", "abstract": "This paper applies the two-party dependence theory (Castelfranchi, Cesta and Miceli, 1992, in Y. Demazeau and E. Werner (Eds.) Decentralized AI-3, Elsevier, North Holland) to modelling multiagent and group dependence. These have theoretical potentialities for the study of emerging groups and collective structures, and more generally for understanding social and organisational complexity, and practical utility for both social-organisational and agent systems purposes. 
In the paper, the dependence theory is extended to describe multiagent links, with a special reference to group and collective phenomena, and is proposed as a framework for the study of emerging social structures, such as groups and collectives. In order to do so, we propose to extend the notion of dependence networks (applied to a single agent) to dependence graphs (applied to an agency). In its present version, the dependence theory is argued to provide (a) a theoretical instrument for the study of social complexity, and (b) a computational system for managing the negotiation process in competitive contexts and for monitoring complexity in organisational and other cooperative contexts", "keyphrases": ["dependence graphs", "group dependence", "two-party dependence theory", "multiagent dependence", "emerging groups", "collective structures", "multiagent systems", "organisational complexity", "social complexity", "agent systems", "dependence networks"]} -{"id": "1518", "title": "Explicit matrix representation for NURBS curves and surfaces", "abstract": "The matrix forms for curves and surfaces were largely promoted in CAD/CAM. In this paper we have presented two matrix representation formulations for arbitrary degree NURBS curves and surfaces explicitly other than recursively. The two approaches are derived from the computation of divided difference and the Marsden identity respectively. The explicit coefficient matrix of B-spline with equally spaced knot and Bezier curves and surfaces can be obtained by these formulae. The coefficient formulae and the coefficient matrix formulae developed in this paper express non-uniform B-spline functions of arbitrary degree in explicit polynomial and matrix forms. 
They are useful for the evaluation and the conversion of NURBS curves and surfaces, in CAD/CAM systems", "keyphrases": ["explicit matrix representation", "NURBS curves", "NURBS surfaces", "CAD/CAM", "matrix representation formulations", "divided difference", "Marsden identity", "explicit coefficient matrix", "B-spline", "equally spaced knot", "Bezier curves", "Bezier surfaces", "coefficient formulae", "coefficient matrix formulae", "nonuniform B-spline functions", "explicit polynomial forms", "explicit matrix forms"]} -{"id": "1619", "title": "Rate allocation for video transmission over lossy correlated networks", "abstract": "A novel rate allocation algorithm for video transmission over lossy networks subject to bursty packet losses is presented. A Gilbert-Elliot model is used at the encoder to drive the selection of coding parameters. Experimental results using the H.26L test model show a significant performance improvement with respect to the assumption of independent packet losses", "keyphrases": ["rate allocation algorithm", "video transmission", "lossy correlated networks", "bursty packet losses", "Gilbert-Elliot model", "coding parameters", "H.26L test model", "video coding"]} -{"id": "1704", "title": "Statistical analysis of nonlinearly reconstructed near-infrared tomographic images. I. Theory and simulations", "abstract": "Near-infrared (NIR) diffuse tomography is an emerging method for imaging the interior of tissues to quantify concentrations of hemoglobin and exogenous chromophores noninvasively in vivo. It often exploits an optical diffusion model-based image reconstruction algorithm to estimate spatial property values from measurements of the light flux at the surface of the tissue. In this study, mean-squared error (MSE) over the image is used to evaluate methods for regularizing the ill-posed inverse image reconstruction problem in NIR tomography. 
Estimates of image bias and image standard deviation were calculated based upon 100 repeated reconstructions of a test image with randomly distributed noise added to the light flux measurements. It was observed that the bias error dominates at high regularization parameter values while variance dominates as the algorithm is allowed to approach the optimal solution. This optimum does not necessarily correspond to the minimum projection error solution, but typically requires further iteration with a decreasing regularization parameter to reach the lowest image error. Increasing measurement noise causes a need to constrain the minimum regularization parameter to higher values in order to achieve a minimum in the overall image MSE", "keyphrases": ["medical diagnostic imaging", "hemoglobin", "oxygen saturation", "photon migration", "optical diffusion model-based image reconstruction algorithm", "decreasing regularization parameter", "lowest image error", "minimum regularization parameter constraint", "bias error", "optimal solution", "light flux", "mean-squared error", "ill-posed inverse image reconstruction problem regularization", "spatial property values estimation", "test image", "randomly distributed noise", "O/sub 2/"]} -{"id": "1741", "title": "The top cycle and uncovered solutions for weak tournaments", "abstract": "We study axiomatic properties of the top cycle and uncovered solutions for weak tournaments. Subsequently, we establish its connection with the rational choice theory", "keyphrases": ["top cycle", "uncovered solutions", "weak tournaments", "axiomatic properties", "rational choice theory"]} -{"id": "1897", "title": "User-appropriate tyre-modelling for vehicle dynamics in standard and limit situations", "abstract": "When modelling vehicles for the vehicle dynamic simulation, special attention must be paid to the modelling of tyre forces and -torques, according to their dominant influence on the results. 
This task is not only about sufficiently exact representation of the effective forces but also about user-friendly and practically relevant applicability, especially when the experimental tyre-input-data is incomplete or missing. This text firstly describes the basics of the vehicle dynamic tyre model, conceived to be a physically based, semi-empirical model for application in connection with multi-body-systems (MBS). On the basis of tyres for a passenger car and a heavy truck, the simulated steady state tyre characteristics are shown together and compared with the underlying experimental values. The possibility to link the tyre model TMeasy to any MBS-program is described, as far as it supports the 'Standard Tyre Interface'. As an example, the simulated and experimental data of a heavy truck doing a standardized driving manoeuvre are compared", "keyphrases": ["tyre modelling", "vehicle dynamics", "standard situations", "limit situations", "tyre torques", "semi-empirical model", "multi-body-systems", "passenger car", "heavy truck", "simulated steady state tyre characteristics", "TMeasy", "Standard Tyre Interface", "standardized driving manoeuvre"]} -{"id": "1916", "title": "Changes in the entropy and the Tsallis difference information during spontaneous decay and self-organization of nonextensive systems", "abstract": "A theoretical-information description of self-organization processes during stimulated transitions between stationary states of open nonextensive systems is presented. S/sub q/- and I/sub q/-theorems on changes of the entropy and Tsallis difference information measures in the process of evolution in the space of control parameters are proved. 
The entropy and the Tsallis difference information are derived and their new extreme properties are discussed", "keyphrases": ["entropy", "Tsallis difference information", "spontaneous decay", "self-organization", "nonextensive systems", "stimulated transitions", "information measures", "control parameters", "nonextensive statistical mechanics"]} -{"id": "1584", "title": "Content all clear [workflow & content management]", "abstract": "Graeme Muir of SchlumbergerSema cuts through the confusion between content, document and records management", "keyphrases": ["SchlumbergerSema", "content management", "document management", "records management"]} -{"id": "1678", "title": "Parallel interior point schemes for solving multistage convex programming", "abstract": "The predictor-corrector interior-point path-following algorithm is promising in solving multistage convex programming problems. Among many other general good features of this algorithm, especially attractive is that the algorithm allows the possibility to parallelise the major computations. The dynamic structure of the multistage problems specifies a block-tridiagonal system at each Newton step of the algorithm. A wrap-around permutation is then used to implement the parallel computation for this step", "keyphrases": ["parallel interior point schemes", "multistage convex programming", "predictor-corrector interior-point path-following algorithm", "dynamic structure", "block-tridiagonal system", "Newton step", "wrap-around permutation", "parallel computation"]} -{"id": "1685", "title": "Use of web technologies in construction project management: what are the critical success/failure factors?", "abstract": "A concept of how the World Wide Web (WWW) and its associated technologies can be used to manage construction projects has been recognized by practitioners in the construction industry for quite sometime. This concept is often referred to as a Web-Based Project Management System (WPMS). 
It promises to enhance construction project documentation and control, and to revolutionize the way construction project teams process and transmit project information. WPMS is an electronic project-management system conducted through the Internet. The system provides a centralized, commonly accessible, reliable means of transmitting and storing project information. Project information is stored on the server and a standard Web browser is used as the gateway to exchange this information, eliminating geographic and hardware platform boundaries", "keyphrases": ["Web-Based Project Management System", "construction industry", "project documentation", "project control", "success", "implementation", "Web browser"]} -{"id": "169", "title": "MRP in a job shop environment using a resource constrained project scheduling model", "abstract": "One of the most difficult tasks in a job shop manufacturing environment is to balance schedule and capacity on an ongoing basis. MRP systems are commonly used for scheduling, although their inability to deal with capacity constraints adequately is a severe drawback. In this study, we show that material requirements planning can be done more effectively in a job shop environment using a resource constrained project scheduling model. The proposed model augments MRP models by incorporating capacity constraints and using variable lead time lengths. The efficacy of this approach is tested on MRP systems by comparing the inventory carrying costs and resource allocation of the solutions obtained by the proposed model to those obtained by using a traditional MRP model. 
In general, it is concluded that the proposed model provides improved schedules with considerable reductions in inventory carrying costs", "keyphrases": ["job shop environment", "MRP", "resource constrained project scheduling model", "material requirements planning", "scheduling", "capacity constraints", "variable lead time lengths", "inventory carrying costs", "resource allocation", "project management"]} -{"id": "1798", "title": "Robustness evaluation of a minimal RBF neural network for nonlinear-data-storage-channel equalisation", "abstract": "The authors present a performance-robustness evaluation of the recently developed minimal resource allocation network (MRAN) for equalisation in highly nonlinear magnetic recording channels in disc storage systems. Unlike communication systems, equalisation of signals in these channels is a difficult problem, as they are corrupted by data-dependent noise and highly nonlinear distortions. Nair and Moon (1997) have proposed a maximum signal to distortion ratio (MSDR) equaliser for data storage channels, which uses a specially designed neural network, where all the parameters of the neural network are determined theoretically, based on the exact knowledge of the channel model parameters. In the present paper, the performance of the MSDR equaliser is compared with that of the MRAN equaliser using a magnetic recording channel model, under conditions that include variations in partial erasure, jitter, width and noise power, as well as model mismatch. 
Results from the study indicate that the less complex MRAN equaliser gives consistently better performance robustness than the MSDR equaliser in terms of signal to distortion ratios (SDRs)", "keyphrases": ["robustness evaluation", "minimal resource allocation network", "highly nonlinear magnetic recording channels", "disc storage systems", "nonlinear-data-storage-channel equalisation", "data-dependent noise", "highly nonlinear distortions", "maximum signal to distortion ratio equaliser", "RBF neural network", "MRAN equaliser", "MSDR equaliser", "digital magnetic recording", "jitter noise"]} -{"id": "1464", "title": "LR parsing for conjunctive grammars", "abstract": "The generalized LR parsing algorithm for context-free grammars, introduced by Tomita in 1986, is a polynomial-time implementation of nondeterministic LR parsing that uses graph-structured stack to represent the contents of the nondeterministic parser's pushdown for all possible branches of computation at a single computation step. It has been specifically developed as a solution for practical parsing tasks arising in computational linguistics, and indeed has proved itself to be very suitable for natural language processing. Conjunctive grammars extend context-free grammars by allowing the use of an explicit intersection operation within grammar rules. This paper develops a new LR-style parsing algorithm for these grammars, which is based on the very same idea of a graph-structured pushdown, where the simultaneous existence of several paths in the graph is used to perform the mentioned intersection operation. The underlying finite automata are treated in the most general way: instead of showing the algorithm's correctness for some particular way of constructing automata, the paper defines a wide class of automata usable with a given grammar, which includes not only the traditional LR(k) automata, but also, for instance, a trivial automaton with a single reachable state. 
A modification of the SLR(k) table construction method that makes use of specific properties of conjunctive grammars is provided as one possible way of making finite automata to use with the algorithm", "keyphrases": ["conjunctive grammars", "generalized LR parsing algorithm", "graph-structured stack", "nondeterministic parser pushdown", "computation", "computational linguistics", "natural language processing", "context-free grammars", "explicit intersection operation", "grammar rules", "finite automata", "trivial automaton", "single reachable state", "Boolean closure", "deterministic context-free languages"]} -{"id": "1499", "title": "A digital-driving system for smart vehicles", "abstract": "In the wake of the computer and information technology revolutions, vehicles are undergoing dramatic changes in their capabilities and how they interact with drivers. Although some vehicles can decide to either generate warnings for the human driver or control the vehicle autonomously, they must usually make these decisions in real time with only incomplete information. So, human drivers must still maintain control over the vehicle. I sketch a digital driving behavior model. By simulating and analyzing driver behavior during different maneuvers such as lane changing, lane following, and traffic avoidance, researchers participating in the Beijing Institute of Technology's digital-driving project will be able to examine the possible correlations or causal relations between the smart vehicle, IVISs, the intelligent road-traffic-information network, and the driver. 
We aim to successfully demonstrate that a digital-driving system can provide a direction for developing human-centered smart vehicles", "keyphrases": ["digital driving system", "human-centered smart vehicles", "in-vehicle information systems", "intelligence", "intelligent driver-vehicle interface", "ecological driver-vehicle interface", "vehicle control", "interactive communication", "intelligent road traffic information network", "intelligent transportation systems", "maneuvers", "traffic avoidance", "lane following", "lane changing"]} -{"id": "1765", "title": "On bandlimited scaling function", "abstract": "This paper discusses band-limited scaling function, especially the single interval band case and three interval band cases. Their relationship to oversampling property and weakly translation invariance are also studied. At the end, we propose an open problem", "keyphrases": ["bandlimited scaling function", "interval band case", "oversampling property", "weakly translation invariance"]} -{"id": "1873", "title": "A phytography of WALDMEISTER", "abstract": "The architecture of the WALDMEISTER prover for unit equational deduction is based on a strict separation of active and passive facts. After an inspection of the system's proof procedure, the representation of each of the central data structures is outlined, namely indexing for the active facts, compression for the passive facts, successor sets for the hypotheses, and minimal recording of inference steps for the proof object. In order to cope with large search spaces, specialized redundancy criteria are employed, and the empirically gained control knowledge is integrated to ease the use of the system. 
The paper concludes with a quantitative comparison of the WALDMEISTER versions over the years, and a view of the future prospects", "keyphrases": ["WALDMEISTER", "theorem prover", "unit equational deduction", "passive facts", "active facts", "data structures", "indexing", "hypotheses", "phytography", "CADE ATP System Competition", "inference", "large search spaces", "redundancy", "future prospects"]} -{"id": "1836", "title": "Parcel boundary identification with computer-assisted boundary overlay process for Taiwan", "abstract": "The study investigates the design of a process for parcel boundary identification with cadastral map overlay using the principle of least squares. The objective of this research is to provide an objective tool for boundary identification survey. The proposed process includes an adjustment model, a weighting scheme, and other related operations. A numerical example is included", "keyphrases": ["parcel boundary identification", "computer assisted boundary overlay process", "Taiwan", "cadastral map overlay", "objective tool", "boundary identification survey", "adjustment model", "weighting scheme", "Gauss-Marker model", "geographic information system", "weighted least squares adjustment"]} -{"id": "1758", "title": "Hilbert modular threefolds of arithmetic genus one", "abstract": "D. Weisser (1981) proved that there are exactly four Galois cubic number fields with Hilbert modular threefolds of arithmetic genus one. In this paper, we extend Weisser's work to cover all cubic number fields. Our main result is that there are exactly 33 fields with Hilbert modular threefolds of arithmetic genus one. 
These fields are enumerated explicitly", "keyphrases": ["Hilbert modular threefolds", "arithmetic genus one", "Galois cubic number fields"]} -{"id": "154", "title": "Verifying concurrent systems with symbolic execution", "abstract": "Current techniques for interactively proving temporal properties of concurrent systems translate transition systems into temporal formulas by introducing program counter variables. Proofs are not intuitive, because control flow is not explicitly considered. For sequential programs symbolic execution is a very intuitive, interactive proof strategy. In this paper we adopt this technique for parallel programs. Properties are formulated in interval temporal logic. An implementation in the interactive theorem prover KIV has shown that this technique offers a high degree of automation and allows simple, local invariants", "keyphrases": ["concurrent systems verification", "symbolic execution", "temporal properties", "concurrent systems", "transition systems", "temporal formulas", "program counter variables", "sequential programs", "parallel programs", "interactive theorem prover KIV", "local invariants"]} -{"id": "1544", "title": "Driving the NKK Smartswitch.2. Graphics and text", "abstract": "Whether your message is one of workplace safety or world peace, the long nights of brooding over ways to tell the world are over. Part 1 described the basic interface to drive the Smartswitch. Part 2 adds the bells and whistles to allow both text and messages to be placed anywhere on the screen. It considers character generation, graphic generation and the user interface", "keyphrases": ["NKK Smartswitch", "computer graphics", "text", "messages", "character generation", "graphic generation", "user interface"]} -{"id": "1501", "title": "Computational challenges in cell simulation: a software engineering approach", "abstract": "Molecular biology's advent in the 20th century has exponentially increased our knowledge about the inner workings of life. 
We have dozens of completed genomes and an array of high-throughput methods to characterize gene encodings and gene product operation. The question now is how we will assemble the various pieces. In other words, given sufficient information about a living cell's molecular components, can we predict its behavior? We introduce the major classes of cellular processes relevant to modeling, discuss software engineering's role in cell simulation, and identify cell simulation requirements. Our E-Cell project aims to develop the theories, techniques, and software platforms necessary for whole-cell-scale modeling, simulation, and analysis. Since the project's launch in 1996, we have built a variety of cell models, and we are currently developing new models that vary with respect to species, target subsystem, and overall scale", "keyphrases": ["cell simulation", "software engineering", "object-oriented design", "molecular biology", "E-Cell project", "whole-cell-scale modeling"]} -{"id": "1645", "title": "Effects of the transition to a client-centred team organization in administrative surveying work", "abstract": "A new work organization was introduced in administrative surveying work in Sweden during 1998. The new work organization implied a transition to a client-centred team-based organization and required a change in competence from specialist to generalist knowledge as well as a transition to a new information technology, implying a greater integration within the company. The aim of this study was to follow the surveyors for two years from the start of the transition and investigate how perceived consequences of the transition, job, organizational factors, well-being and effectiveness measures changed between 1998 and 2000. The Teamwork Profile and QPS Nordic questionnaire were used. The 205 surveyors who participated in all three study phases constituted the study group. 
The result showed that surveyors who perceived that they were working as generalists rated the improvements in job and organizational factors significantly higher than those who perceived that they were not yet generalists. Improvements were noted in 2000 in quality of service to clients, time available to handle a case and effectiveness of teamwork. In a transfer to a team-based work organization, group cohesion and continuous improvement practices - for example, learning by doing, mentoring and guided delegation - were important to improve the social effectiveness of group work", "keyphrases": ["client-centred team organization", "administrative surveying work", "information technology", "company", "job", "organizational factors", "effectiveness measures", "Teamwork Profile", "QPS Nordic questionnaire", "social effectiveness", "public administrative sector"]} -{"id": "1600", "title": "The development and evaluation of a fuzzy logic expert system for renal transplantation assignment: Is this a useful tool?", "abstract": "Allocating donor kidneys to patients is a complex, multicriteria decision-making problem which involves not only medical, but also ethical and political issues. In this paper, a fuzzy logic expert system approach was proposed as an innovative way to deal with the vagueness and complexity faced by medical doctors in kidney allocation decision making. A pilot fuzzy logic expert system for kidney allocation was developed and evaluated in comparison with two existing allocation algorithms: a priority sorting system used by multiple organ retrieval and exchange (MORE) in Canada and a point scoring system used by united network for organ sharing (UNOS) in the US. 
Our simulated experiment based on real data indicated that the fuzzy logic system can represent the expert's thinking well in handling complex tradeoffs, and overall, the fuzzy logic derived recommendations were more acceptable to the expert than those from the MORE and UNOS algorithms", "keyphrases": ["renal transplantation assignment", "fuzzy logic expert system", "donor kidneys", "multicriteria decision-making problem", "kidney allocation decision making", "priority sorting system", "multiple organ retrieval exchange", "point scoring systems", "united network for organ sharing", "simulated experiment", "complex tradeoff handling"]} -{"id": "1871", "title": "Strong and weak points of the MUSCADET theorem prover-examples from CASC-JC", "abstract": "MUSCADET is a knowledge-based theorem prover based on natural deduction. It has participated in CADE Automated theorem proving System Competitions. The results show its complementarity with regard to resolution-based provers. This paper presents some of its crucial methods and gives some examples of MUSCADET proofs from the last competition (CASC-JC in IJCAR 2001)", "keyphrases": ["MUSCADET", "CASC-JC", "knowledge-based theorem prover", "natural deduction", "CADE Automated theorem proving System Competitions", "resolution-based provers"]} -{"id": "1834", "title": "A formal model of correctness in a cadastre", "abstract": "A key issue for cadastral systems is the maintenance of their correctness. Correctness is defined to be the proper correspondence between the valid legal situation and the content of the cadastre. This correspondence is generally difficult to achieve, since the cadastre is not a complete representation of all aspects influencing the legal situation in reality. The goal of the paper is to develop a formal model comprising representations of the cadastre and of reality that allows the simulation and investigation of cases where this correspondence is potentially violated. 
For this purpose the model consists of two parts, the first part represents the valid legal situation and the second part represents the cadastre. This makes it feasible to mark the differences between reality and the cadastre. The marking together with the two parts of the model facilitate the discussion of issues in \"real-world\" cadastral systems where incorrectness occurs. In order to develop a formal model, the paper uses the transfer of ownership of a parcel between two persons as minimal case study. The foundation for the formalization is a modern version of the situation calculus. The focus moves from the analysis of the cadastre to the preparation of a conceptual and a formalized model and the implementation of a prototype", "keyphrases": ["formal correctness model", "cadastre", "cadastral systems", "correctness maintenance", "legal situation", "formal model", "transfer of ownership", "minimal case study", "situation calculus", "formalized model"]} -{"id": "1647", "title": "Examining children's reading performance and preference for different computer-displayed text", "abstract": "This study investigated how common online text affects reading performance of elementary school-age children by examining the actual and perceived readability of four computer-displayed typefaces at 12- and 14-point sizes. Twenty-seven children, ages 9 to 11, were asked to read eight children's passages and identify erroneous/substituted words while reading. Comic Sans MS, Arial and Times New Roman typefaces, regardless of size, were found to be more readable (as measured by a reading efficiency score) than Courier New. No differences in reading speed were found for any of the typeface combinations. In general, the 14-point size and the examined sans serif typefaces were perceived as being the easiest to read, fastest, most attractive, and most desirable for school-related material. 
In addition, participants significantly preferred Comic Sans MS and 14-point Arial to 12-point Courier. Recommendations for appropriate typeface combinations for children reading on computers are discussed", "keyphrases": ["child reading performance", "computer-displayed text", "online text", "elementary school-age children", "computer-displayed typefaces", "fonts", "user interface", "human factors", "educational computing"]} -{"id": "1602", "title": "An optimization approach to plan for reusable software components", "abstract": "It is well acknowledged in software engineering that there is a great potential for accomplishing significant productivity improvements through the implementation of a successful software reuse program. On the other hand, such gains are attainable only by instituting detailed action plans at both the organizational and program level. Given this need, the paucity of research papers related to planning, and in particular, optimized planning is surprising. This research, which is aimed at this gap, brings out an application of optimization for the planning of reusable software components (SCs). We present a model that selects a set of SCs that must be built, in order to lower development and adaptation costs. We also provide implications to project management based on simulation, an approach that has been adopted by other cost models in the software engineering literature. Such a prescriptive model does not exist in the literature", "keyphrases": ["software engineering", "productivity improvements", "software reuse program", "optimization", "action plans", "optimized planning", "reusable software components", "adaptation costs", "development costs", "project management", "simulation"]} -{"id": "1929", "title": "Optimal time of switching between portfolios of securities", "abstract": "Optimal times of switching between several portfolios of securities are found for the purpose of profit maximization. 
Two methods of their determination are considered. The cases with three and n portfolios are studied in detail", "keyphrases": ["optimal time", "portfolios of securities", "profit maximization"]} -{"id": "156", "title": "Using extended logic programming for alarm-correlation in cellular phone networks", "abstract": "Alarm correlation is a necessity in large mobile phone networks, where the alarm bursts resulting from severe failures would otherwise overload the network operators. We describe how to realize alarm-correlation in cellular phone networks using extended logic programming. To this end, we describe an algorithm and system solving the problem, a model of a mobile phone network application, and a detailed solution for a specific scenario", "keyphrases": ["extended logic programming", "alarm-correlation", "cellular phone networks", "large mobile phone networks", "network operators", "fault diagnosis"]} -{"id": "1546", "title": "Necessary conditions of optimality for impulsive systems on Banach spaces", "abstract": "We present necessary conditions of optimality for optimal control problems arising in systems governed by impulsive evolution equations on Banach spaces. Basic notations and terminologies are first presented and necessary conditions of optimality are presented. Special cases are discussed and we present an application to the classical linear quadratic regulator problem", "keyphrases": ["linear quadratic regulator", "optimality", "impulsive systems", "optimal control", "impulsive evolution equations", "Banach spaces", "necessary conditions"]} -{"id": "1503", "title": "Neural networks for web content filtering", "abstract": "With the proliferation of harmful Internet content such as pornography, violence, and hate messages, effective content-filtering systems are essential. Many Web-filtering systems are commercially available, and potential users can download trial versions from the Internet. 
However, the techniques these systems use are insufficiently accurate and do not adapt well to the ever-changing Web. To solve this problem, we propose using artificial neural networks to classify Web pages during content filtering. We focus on blocking pornography because it is among the most prolific and harmful Web content. However, our general framework is adaptable for filtering other objectionable Web material", "keyphrases": ["artificial neural networks", "Intelligent Classification Engine", "learning capabilities", "pornographic/nonpornographic Web page differentiation", "Web content filtering", "violence", "Web page classification", "harmful Web content"]} -{"id": "1687", "title": "Cleared for take-off [Hummingbird Enterprise]", "abstract": "A recent Gartner report identifies Hummingbird in the first wave of vendors as an early example of convergence in the 'smart enterprise suite' market. We spoke to Hummingbird's Marketing Director for Northern Europe", "keyphrases": ["smart enterprise suite", "Hummingbird Enterprise", "information content", "knowledge content", "collaboration"]} -{"id": "1914", "title": "Vacuum-compatible vibration isolation stack for an interferometric gravitational wave detector TAMA300", "abstract": "Interferometric gravitational wave detectors require a large degree of vibration isolation. For this purpose, a multilayer stack constructed of rubber and metal blocks is suitable, because it provides isolation in all degrees of freedom at once. In TAMA300, a 300 m interferometer in Japan, long-term dimensional stability and compatibility with an ultrahigh vacuum environment of about 10/sup -6/ Pa are also required. To keep the interferometer at its operating point despite ground strain and thermal drift of the isolation system, a thermal actuator was introduced. To prevent the high outgassing rate of the rubber from spoiling the vacuum, the rubber blocks were enclosed by gas-tight bellows. 
Using these techniques, we have successfully developed a three-layer stack which has a vibration isolation ratio of more than 10/sup 3/ at 300 Hz with control of drift and enough vacuum compatibility", "keyphrases": ["vibration isolation stack", "TAMA300 interferometer", "interferometric gravitational wave detectors", "rubber blocks", "multilayer stack", "metal blocks", "long-term dimensional stability", "ultrahigh vacuum environment", "operating point", "ground strain", "thermal drift", "thermal actuator", "gas-tight bellows", "rubber outgassing", "vacuum compatibility", "300 m", "10/sup -6/ Pa", "300 Hz"]} -{"id": "1767", "title": "Bivariate fractal interpolation functions on rectangular domains", "abstract": "Non-tensor product bivariate fractal interpolation functions defined on gridded rectangular domains are constructed. Linear spaces consisting of these functions are introduced. The relevant Lagrange interpolation problem is discussed. A negative result about the existence of affine fractal interpolation functions defined on such domains is obtained", "keyphrases": ["bivariate fractal interpolation functions", "rectangular domains", "gridded rectangular domains", "linear spaces", "Lagrange interpolation problem", "affine fractal interpolation functions"]} -{"id": "1809", "title": "Approach to adaptive neural net-based H/sub infinity / control design", "abstract": "An approach is investigated for the adaptive neural net-based H/sub infinity / control design of a class of nonlinear uncertain systems. In the proposed framework, two multilayer feedforward neural networks are constructed as an alternative to approximate the nonlinear system. The neural networks are piecewisely interpolated to generate a linear differential inclusion model by which a linear state feedback H/sub infinity / control law can be applied. An adaptive weight adjustment mechanism for the multilayer feedforward neural networks is developed to ensure H/sub infinity / regulation performance. 
It is shown that finding the control gain matrices can be transformed into a standard linear matrix inequality problem and solved via a developed recurrent neural network", "keyphrases": ["adaptive neural net-based H/sub infinity / control design", "nonlinear uncertain systems", "multilayer feedforward neural networks", "piecewise interpolation", "linear differential inclusion model", "linear state feedback", "control gain matrices", "linear matrix inequality problem", "recurrent neural network", "LMI"]} -{"id": "1466", "title": "Feldkamp-type image reconstruction from equiangular data", "abstract": "The cone-beam approach for image reconstruction attracts increasing attention in various applications, especially medical imaging. Previously, the traditional practical cone-beam reconstruction method, the Feldkamp algorithm, was generalized into the case of spiral/helical scanning loci with equispatial cone-beam projection data. In this paper, we formulated the generalized Feldkamp algorithm in the case of equiangular cone-beam projection data, and performed numerical simulation to evaluate the image quality. 
Because medical multi-slice/cone-beam CT scanners typically use equiangular projection data, our new formula may be useful in this area as a framework for further refinement and a benchmark for comparison", "keyphrases": ["Feldkamp-type image reconstruction", "equiangular data", "cone-beam approach", "medical imaging", "practical cone-beam reconstruction method", "spiral/helical scanning loci", "equispatial cone-beam projection data", "generalized Feldkamp algorithm", "equiangular cone-beam projection data", "numerical simulation", "image quality", "medical multi-slice/cone-beam CT scanners"]} -{"id": "1895", "title": "An algorithm combining neural networks with fundamental parameters", "abstract": "An algorithm combining neural networks with the fundamental parameters equations (NNFP) is proposed for making corrections for non-linear matrix effects in x-ray fluorescence analysis. In the algorithm, neural networks were applied to relate the concentrations of components to both the measured intensities and the relative theoretical intensities calculated by the fundamental parameter equations. The NNFP algorithm is compared with the classical theoretical correction models, including the fundamental parameters approach, the Lachance-Traill model, a hyperbolic function model and the COLA algorithm. For an alloy system with 15 measured elements, in most cases, the prediction errors of the NNFP algorithm are lower than those of the fundamental parameters approach, the Lachance-Traill model, the hyperbolic function model and the COLA algorithm separately. If there are serious matrix effects, such as matrix effects among Cr, Fe and Ni, the NNFP algorithm generally decreased predictive errors as compared with the classical models, except for the case of Cr by the fundamental parameters approach. 
The main reason why the NNFP algorithm generally has better predictive ability than the classical theoretical correction models might be that neural networks can better calibrate the non-linear matrix effects in a complex multivariate system", "keyphrases": ["algorithm", "neural networks", "fundamental parameters", "fundamental parameters equations", "nonlinear matrix effects", "x-ray fluorescence analysis", "intensities", "NNFP algorithm", "theoretical correction models", "Lachance-Traill model", "hyperbolic function model", "COLA algorithm", "alloy system", "Cr", "Fe", "Ni", "complex multivariate system"]} -{"id": "1868", "title": "Estimation of an N-L-N Hammerstein-Wiener model", "abstract": "Estimation of a single-input single-output block-oriented model is studied. The model consists of a linear block embedded between two static nonlinear gains. Hence, it is called an N-L-N Hammerstein-Wiener model. First, the model structure is motivated and the disturbance model is discussed. The paper then concentrates on parameter estimation. A relaxation iteration scheme is proposed by making use of a model structure in which the error is bilinear-in-parameters. This leads to a simple algorithm which minimizes the original loss function. The convergence and consistency of the algorithm are studied. In order to reduce the variance error, the obtained linear model is further reduced using frequency weighted model reduction. 
A simulation study is used to illustrate the method", "keyphrases": ["N-L-N Hammerstein-Wiener model", "single-input single-output block-oriented model", "linear block", "static nonlinear gains", "model structure", "disturbance model", "parameter estimation", "relaxation iteration scheme", "bilinear-in-parameters error", "convergence", "consistency", "variance error", "frequency weighted model reduction", "nonlinear process"]} -{"id": "1706", "title": "Quantitative analysis of reconstructed 3-D coronary arterial tree and intracoronary devices", "abstract": "Traditional quantitative coronary angiography is performed on two-dimensional (2-D) projection views. These views are chosen by the angiographer to minimize vessel overlap and foreshortening. With 2-D projection views that are acquired in this nonstandardized fashion, however, there is no way to know or estimate how much error occurs in the QCA process. Furthermore, coronary arteries possess a curvilinear shape and undergo a cyclical deformation due to their attachment to the myocardium. Therefore, it is necessary to obtain three-dimensional (3-D) information to best describe and quantify the dynamic curvilinear nature of the human coronary artery. 
Using a patient-specific 3-D coronary reconstruction algorithm and routine angiographic images, a new technique is proposed to describe: (1) the curvilinear nature of 3-D coronary arteries and intracoronary devices; (2) the magnitude of the arterial deformation caused by intracoronary devices and due to heart motion; and (3) optimal view(s) with respect to the desired \"pathway\" for delivering intracoronary devices", "keyphrases": ["medical diagnostic imaging", "cyclical deformation", "myocardium", "dynamic curvilinear nature quantification", "patient-specific 3-D coronary reconstruction algorithm", "routine angiographic images", "arterial deformation magnitude", "intracoronary devices delivery pathway", "human coronary artery"]} -{"id": "1743", "title": "Adaptive stabilization of undamped flexible structures", "abstract": "In the paper non-identifier-based adaptive stabilization of undamped flexible structures is considered in the case of collocated input and output operators. The systems have poles and zeros on the imaginary axis. In the case where velocity feedback is available, the adaptive stabilizer is constructed by an adaptive PD-controller (proportional plus derivative controller). In the case where only position feedback is available, the adaptive stabilizer is constructed by an adaptive P-controller for the augmented system which consists of the controlled system and a parallel compensator. 
Numerical examples are given to illustrate the effectiveness of the proposed controllers", "keyphrases": ["adaptive stabilization", "undamped flexible structures", "poles and zeros", "imaginary axis", "velocity feedback", "adaptive PD-controller", "proportional plus derivative controller", "position feedback", "adaptive P-controller", "augmented system", "parallel compensator"]} -{"id": "1855", "title": "Distribution software: ROI is king", "abstract": "Middle-market accounting software vendors are taking to the open road, by way of souped-up distribution suites that can track product as it wends its way from warehouse floor to customer site. Integration provides efficiencies, and cost savings", "keyphrases": ["accounting software", "warehouse management", "distribution"]} -{"id": "1810", "title": "Input-output based pole-placement controller for a class of time-delay systems", "abstract": "A controller structure valid for SISO plants involving both internal and external point delays is presented. The control signal is based only on the input and output plant signals. The controller allows finite or infinite spectrum assignment. The most important feature of the proposed controller is that it only involves the use of a class of point-delayed signals. Thus the controller synthesis involves less computational cost than former methods. 
Since the plant control input is generated by filtering the input and output plant signals, this controller structure is potentially applicable to the adaptive case of unknown plant parameters", "keyphrases": ["I/O-based pole-placement controller", "input-output based pole-placement controller", "time-delay systems", "SISO plants", "internal point delays", "external point delays", "finite spectrum assignment", "infinite spectrum assignment", "point-delayed signals", "controller synthesis", "computational cost", "filtering"]} -{"id": "1482", "title": "A parareal in time procedure for the control of partial differential equations", "abstract": "We have proposed in a previous note a time discretization for a partial differential evolution equation that allows for parallel implementations. This scheme is here reinterpreted as a preconditioning procedure on an algebraic setting of the time discretization. This allows for extending the parallel methodology to the problem of optimal control for partial differential equations. We report a first numerical implementation that reveals considerable interest", "keyphrases": ["time procedure", "partial differential equation control", "evolution equation", "preconditioning procedure", "Hilbert space", "algebraic setting", "time discretization", "optimal control"]} -{"id": "1783", "title": "Becoming a chief librarian: an analysis of transition stages in academic library leadership", "abstract": "The author explores how the four-part model of transition cycles identified by Nicholson and West (1988) applies to becoming a chief librarian of an academic library. The four stages - preparation, encounter, adjustment, and stabilization - are considered from the micro-, mezzo-, and macrolevels of the organization, as well as for their psychological and social impact on the new job incumbent. 
An instrument for assessment of transitional success which could be administered in the adjustment or stabilization stage is considered", "keyphrases": ["chief librarian", "transition stages", "academic library leadership", "organization", "psychological impact", "social impact", "job", "transition cycles model"]} -{"id": "172", "title": "A VMEbus interface for multi-detector trigger and control system", "abstract": "MUSE (MUltiplicity SElector) is the trigger and control system of CHIMERA, a 4 pi charged particle detector. Initialization of MUSE can be performed via VMEbus. This paper describes the design of VMEbus interface and functional module in MUSE, and briefly discusses an application of MUSE", "keyphrases": ["VMEbus interface", "MUSE", "CHIMERA", "trigger system", "control system"]} -{"id": "1562", "title": "Solution of a class of two-dimensional integral equations", "abstract": "The two-dimensional integral equation 1/ pi integral integral /sub D/( phi (r, theta )/R/sup 2/)dS=f(r/sub 0/, theta /sub 0/) defined on a circular disk D: r/sub 0/or=5, n is not a multiple of 3 and (h, n)=1, where h is the class number of the filed Q( square root (-q)), then the diophantine equation x/sup 2/+q/sup 2k+1/=y/sup n/ has exactly two families of solutions (q, n, k, x, y)", "keyphrases": ["diophantine equation", "odd prime", "odd integer", "Lucas sequence", "primitive divisors"]} -{"id": "1713", "title": "A uniform framework for regulating service access and information release on the Web", "abstract": "The widespread use of Internet-based services is increasing the amount of information (such as user profiles) that clients are required to disclose. This information demand is necessary for regulating access to services, and functionally convenient (e.g., to support service customization), but it has raised privacy-related concerns which, if not addressed, may affect the users disposition to use network services. 
At the same time, servers need to regulate service access without entirely disclosing the details of their access control policy. There is therefore a pressing need for privacy-aware techniques to regulate access to services open to the network. We propose an approach for regulating service access and information disclosure on the Web. The approach consists of a uniform formal framework to formulate - and reason about - both service access and information disclosure constraints. It also provides a means for parties to communicate their requirements while ensuring that no private information is disclosed and that the communicated requirements are correct with respect to the constraints", "keyphrases": ["service access regulation", "information release", "WWW", "Internet", "user profiles", "information demand", "client server systems", "access control policy", "privacy-aware techniques", "network services", "information disclosure", "uniform formal framework", "reasoning"]} -{"id": "1925", "title": "On the accuracy of polynomial interpolation in Hilbert space with disturbed nodal values of the operator", "abstract": "The interpolation accuracy of polynomial operators in a Hilbert space with a measure is estimated when nodal values of these operators are given approximately", "keyphrases": ["polynomial interpolation", "Hilbert space", "disturbed nodal values", "polynomial operators"]} -{"id": "1754", "title": "Coordination [crisis management]", "abstract": "Communications during a crisis, both internal and external, set the tone during response and carry a message through recovery. 
The authors describe how to set up a system for information coordination to make sure the right people get the right message, and the organization stays in control", "keyphrases": ["crisis management", "communications process", "information coordination"]} -{"id": "1711", "title": "Developing a CD-ROM as a teaching and learning tool in food and beverage management: a case study in hospitality education", "abstract": "Food and beverage management is the traditional core of hospitality education but, in its laboratory manifestation, has come under increasing pressure in recent years. It is an area that, arguably, presents the greatest challenges in adaptation to contemporary learning technologies but, at the same time, stands to benefit most from the potential of the Web. This paper addresses the design and development of a CD-ROM learning resource for food and beverage. It is a learning resource which is designed to integrate with rather than to replace existing conventional classroom and laboratory learning methods and, thus, compensate for the decline in the resource base faced in food and beverage education in recent years. The paper includes illustrative material drawn from the CD-ROM which demonstrates its use in teaching and learning", "keyphrases": ["food and beverage management", "hospitality education", "CD-ROM", "learning tool", "teaching tool"]} -{"id": "1882", "title": "Bandwidth vs. gains design of H/sub infinity / tracking controllers for current-fed induction motors", "abstract": "Describes a systematic procedure for designing speed and rotor flux norm tracking H/sub infinity /. controllers with unknown load torque disturbances for current-fed induction motors. A new effective design tool is developed to allow selection of the control gains so as to adjust the disturbances' rejection capability of the controllers in the face of the bandwidth requirements of the closed-loop system. 
Application of the proposed design procedure is demonstrated in a case study, and the results of numerical simulations illustrate the satisfactory performance achievable even in the presence of rotor resistance uncertainty", "keyphrases": ["H/sub infinity / tracking controllers", "current-fed induction motors", "speed controllers", "rotor flux norm controllers", "unknown load torque disturbances", "design tool", "disturbances rejection capability", "bandwidth requirements", "closed-loop system", "feedback linearization", "observers"]} -{"id": "1548", "title": "A second order characteristic finite element scheme for convection-diffusion problems", "abstract": "A new characteristic finite element scheme is presented for convection-diffusion problems. It is of second order accuracy in the time increment, symmetric, and unconditionally stable. Optimal error estimates are proved in the framework of L/sup 2/-theory. Numerical results are presented for two examples, which show the advantage of the scheme", "keyphrases": ["second order characteristic finite element scheme", "convection-diffusion problems", "second order accuracy", "optimal error estimates", "L/sup 2/ -theory"]} -{"id": "158", "title": "Neural and neuro-fuzzy integration in a knowledge-based system for air quality prediction", "abstract": "We propose a unified approach for integrating implicit and explicit knowledge in neurosymbolic systems as a combination of neural and neuro-fuzzy modules. In the developed hybrid system, a training data set is used for building neuro-fuzzy modules, and represents implicit domain knowledge. The explicit domain knowledge, on the other hand, is represented by fuzzy rules, which are directly mapped into equivalent neural structures. The aim of this approach is to improve the abilities of modular neural structures, which are based on incomplete learning data sets, since the knowledge acquired from human experts is taken into account for adapting the general neural architecture. 
Three methods to combine the explicit and implicit knowledge modules are proposed. The techniques used to extract fuzzy rules from neural implicit knowledge modules are described. These techniques improve the structure and the behavior of the entire system. The proposed methodology has been applied in the field of air quality prediction with very encouraging results. These experiments show that the method is worth further investigation", "keyphrases": ["neuro-fuzzy integration", "knowledge-based system", "air quality prediction", "neurosymbolic systems", "hybrid system", "training data set", "implicit domain knowledge representation", "fuzzy rules", "incomplete learning", "neural architecture", "experiments", "air pollution"]} -{"id": "1649", "title": "Office essentials [stationery suppliers]", "abstract": "Make purchasing stationery a relatively simple task through effective planning and management of stock, and identifying the right supplier", "keyphrases": ["stationery suppliers", "purchasing", "planning", "management of stock"]} -{"id": "1927", "title": "Optimal strategies for a semi-Markovian inventory system", "abstract": "Control for a semi-Markovian inventory system is considered. Under general assumptions on system functioning, conditions for existence of an optimal nonrandomized Markovian strategy are found. It is shown that under some additional assumptions on storing conditions for the inventory, the optimal strategy has a threshold (s, S)-frame", "keyphrases": ["optimal strategies", "semi-Markovian inventory system", "system functioning", "optimal nonrandomized Markovian strategy", "optimal strategy"]} -{"id": "1631", "title": "Recovering lost efficiency of exponentiation algorithms on smart cards", "abstract": "At the RSA cryptosystem implementation stage, a major security concern is resistance against so-called side-channel attacks. 
Solutions are known but they increase the overall complexity by a non-negligible factor (typically, a protected RSA exponentiation is 133% slower). For the first time, protected solutions are proposed that do not penalise the running time of an exponentiation", "keyphrases": ["smart cards", "exponentiation algorithms", "RSA cryptosystem implementation stage", "security", "side-channel attack resistance", "public-key encryption"]} -{"id": "1674", "title": "A column generation approach to delivery planning over time with inhomogeneous service providers and service interval constraints", "abstract": "We consider a problem of delivery planning over multiple time periods. Deliveries must be made to customers having nominated demand in each time period. Demand must be met in each time period by use of some combination of inhomogeneous service providers. Each service provider has a different delivery capacity, different cost of delivery to each customer, a different utilisation requirement, and different rules governing the spread of deliveries in time. The problem is to plan deliveries so as to minimise overall costs, subject to demand being met and service rules obeyed. A natural integer programming model was found to be intractable, except on problems with loose demand constraints, with gaps between best lower bound and best feasible solution of up to 35.1%, with an average of 15.4% over the test data set. In all but the problem with loosest demand constraints, Cplex 6.5 applied to this formulation failed to find the optimal solution before running out of memory. 
However, a column generation approach improved the lower bound by between 0.6% and 21.9%, with an average of 9.9%, and in all cases found the optimal solution at the root node, without requiring branching", "keyphrases": ["column generation approach", "delivery planning over time", "inhomogeneous service providers", "service interval constraints", "delivery capacity", "lower bound", "transportation"]} -{"id": "1588", "title": "Contentment management", "abstract": "Andersen's William Yarker and Richard Young outline the route to a successful content management strategy", "keyphrases": ["Andersen Consulting", "content management strategy"]} -{"id": "1530", "title": "Uniform supersaturated design and its construction", "abstract": "Supersaturated designs are factorial designs in which the number of main effects is greater than the number of experimental runs. In this paper, a discrete discrepancy is proposed as a measure of uniformity for supersaturated designs, and a lower bound of this discrepancy is obtained as a benchmark of design uniformity. A construction method for uniform supersaturated designs via resolvable balanced incomplete block designs is also presented along with the investigation of properties of the resulting designs. The construction method shows a strong link between these two different kinds of designs", "keyphrases": ["uniform supersaturated design", "factorial designs", "experimental runs", "discrete discrepancy", "resolvable balanced incomplete block designs"]} -{"id": "165", "title": "Monitoring the news online", "abstract": "The author looks at how we can focus on what we want, finding small stories in vast oceans of news. There is no one tool that will scan every news resource available and give alerts on new available materials. Every one has a slightly different focus. Some are paid sources, while many are free. 
If used wisely, an excellent news monitoring system for a large number of topics can be set up for surprisingly little cost", "keyphrases": ["news monitoring", "online news", "Internet"]} -{"id": "1468", "title": "Developing Web-enhanced learning for information fluency-a liberal arts college's perspective", "abstract": "Learning is likely to take a new form in the twenty-first century, and a transformation is already in process. Under the framework of information fluency, efforts are being made at Rollins College to develop a Web-enhanced course that encompasses information literacy, basic computer literacy, and critical thinking skills. Computer-based education can be successful when librarians use technology effectively to enhance their integrated library teaching. In an online learning environment, students choose a time for learning that best suits their needs and motivational levels. They can learn at their own pace, take a nonlinear approach to the subject, and maintain constant communication with instructors and other students. The quality of a technology-facilitated course can be upheld if the educational objectives and methods for achieving those objectives are carefully planned and explored", "keyphrases": ["Web-enhanced learning", "information fluency", "liberal arts college", "information literacy", "computer literacy", "critical thinking skills", "computer-based education", "librarians", "integrated library teaching", "online learning"]} -{"id": "1794", "title": "Well-posed anisotropic diffusion for image denoising", "abstract": "A nonlinear iterative smoothing filter based on a second-order partial differential equation is introduced. It smooths out the image according to an anisotropic diffusion process. The approach is based on a smooth approximation of the total variation (TV) functional which overcomes the non-differentiability of the TV functional at the origin. 
In particular, the authors perform linear smoothing over smooth areas but selective smoothing over candidate edges. By relating the smoothing parameter to the time step, they arrive at a CFL condition which guarantees the causality of the discrete scheme. This allows the adoption of higher time discretisation steps, while ensuring the absence of artefacts deriving from the non-smooth behaviour of the TV functional at the origin. Moreover, it is shown that the proposed approach avoids the typical staircase effects in smooth areas which occur in the standard time-marching TV scheme", "keyphrases": ["image denoising", "well-posed anisotropic diffusion", "nonlinear iterative smoothing filter", "second-order partial differential equation", "total variation functional", "linear smoothing", "selective smoothing", "CFL condition", "discrete scheme", "causality", "higher time discretisation steps", "image restoration problem", "random Gaussian noise"]} -{"id": "1769", "title": "Transformation rules and strategies for functional-logic programs", "abstract": "This paper abstracts the contents of a PhD dissertation entitled 'Transformation Rules and Strategies for Functional-Logic Programs' which has been recently defended. These techniques are based on fold/unfold transformations and they can be used to optimize integrated (functional-logic) programs for a wide class of applications. Experimental results show that typical examples in the field of artificial intelligence are successfully enhanced by our transformation system SYNTH. 
The thesis presents the first approach of these methods for declarative languages that integrate the best features from functional and logic programming", "keyphrases": ["program transformation rules", "functional-logic programs", "logic programming", "functional programming", "fold-unfold transformations", "experimental results", "artificial intelligence", "SYNTH", "declarative languages"]} -{"id": "1807", "title": "Regional flux target with minimum energy", "abstract": "An extension of a gradient controllability problem to the case where the target subregion is a part of the boundary of a parabolic system domain is discussed. A definition and some properties adapted to this case are presented. The focus is on the characterisation of the control achieving a regional boundary gradient target with minimum energy. An approach is developed that leads to a numerical algorithm for the computation of optimal control. Numerical illustrations show the efficiency of the approach and lead to conjectures", "keyphrases": ["regional flux target", "minimum energy", "gradient controllability problem", "target subregion", "parabolic system domain boundary", "regional boundary gradient target", "numerical algorithm", "optimal control"]} -{"id": "1495", "title": "Laptops zip to 2 GHz-plus", "abstract": "Intel's Pentium 4-M processor has reached the coveted 2-GHz mark, and speed-hungry mobile users will be tempted to buy a laptop with the chip. However, while our exclusive tests found 2-GHz P4-M notebooks among the fastest units we've tested, the new models failed to make dramatic gains compared with those based on Intel's 1.8-GHz mobile chip. 
Since 2-GHz notebooks carry a hefty price premium, buyers seeking both good performance and a good price might prefer a 1.8-GHz unit instead", "keyphrases": ["Intel Pentium 4-M processor", "mobile", "laptop", "notebooks", "2 GHz"]} -{"id": "1842", "title": "The role of B2B engines in B2B integration architectures", "abstract": "Semantic B2B integration architectures must enable enterprises to communicate standards-based B2B events like purchase orders with any potential trading partner. This requires not only back end application integration capabilities to integrate with e.g. enterprise resource planning (ERP) systems as the company-internal source and destination of B2B events, but also a capability to implement every necessary B2B protocol like electronic data interchange (EDI), RosettaNet as well as more generic capabilities like Web services (WS). This paper shows the placement and functionality of B2B engines in semantic B2B integration architectures that implement a generic framework for modeling and executing any B2B protocol. A detailed discussion shows how a B2B engine can provide the necessary abstractions to implement any standard-based B2B protocol or any trading partner specific specialization", "keyphrases": ["B2B engines", "semantic B2B integration architectures", "standards-based B2B event communication", "purchase orders", "trading partner", "ERP systems", "EDI", "RosettaNet", "Web services", "modeling"]} -{"id": "1514", "title": "Universal parametrization in constructing smoothly-connected B-spline surfaces", "abstract": "In this paper, we explore the feasibility of universal parametrization in generating B-spline surfaces, which was proposed recently in the literature (Lim, 1999). We present an interesting property of the new parametrization that it guarantees G/sup 0/ continuity on B-spline surfaces when several independently constructed patches are put together without imposing any constraints. 
Also, a simple blending method of patchwork is proposed to construct C/sup n-1/ surfaces, where overlapping control nets are utilized. It takes into account the semi-localness property of universal parametrization. It effectively helps us construct very natural looking B-spline surfaces while keeping the deviation from given data points very low. Experimental results are shown with several sets of surface data points", "keyphrases": ["universal parametrization", "smoothly-connected B-spline surface generation", "G/sup 0/ continuity", "patches", "patchwork blending method", "C/sup n-1/ surfaces", "overlapping control nets", "semi-localness property", "surface data points"]} -{"id": "1551", "title": "The numerical solution of an evolution problem of second order in time on a closed smooth boundary", "abstract": "We consider an initial value problem for the second-order differential equation with a Dirichlet-to-Neumann operator coefficient. For the numerical solution we carry out semi-discretization by the Laguerre transformation with respect to the time variable. Then an infinite system of the stationary operator equations is obtained. By potential theory, the operator equations are reduced to boundary integral equations of the second kind with logarithmic or hypersingular kernels. The full discretization is realized by Nystrom's method which is based on the trigonometric quadrature rules. Numerical tests confirm the ability of the method to solve these types of nonstationary problems", "keyphrases": ["initial value problem", "second-order differential equation", "evolution problem", "closed smooth boundary", "Laguerre transformation", "hypersingular kernels", "boundary integral equations", "stationary operator equations"]} -{"id": "1615", "title": "Laguerre approximation of fractional systems", "abstract": "Systems characterised by fractional power poles can be called fractional systems. 
Here, Laguerre orthogonal polynomials are employed to approximate fractional systems by minimum phase, reduced order, rational transfer functions. Both the time- and frequency-domain analyses demonstrate the accuracy of the approximation", "keyphrases": ["Laguerre approximation", "fractional systems", "fractional power poles", "orthogonal polynomials", "minimum phase", "reduced order", "robust controllers", "closed-loop system", "rational transfer functions", "frequency-domain analysis", "time-domain analysis"]} -{"id": "1650", "title": "Low to mid-speed copiers [buyer's guide]", "abstract": "The low to mid-speed copier market is being transformed by the almost universal adoption of digital solutions. The days of the analogue copier are numbered as the remaining vendors plan to withdraw from this sector by 2005. Reflecting the growing market for digital, vendors are reducing prices, making a digital solution much more affordable. The battle for the copier market is intense, and the popularity of the multifunctional device is going to transform the office equipment market. As total cost of ownership becomes increasingly important and as budgets are squeezed, the most cost-effective solutions are those that will survive this shake-down", "keyphrases": ["low to mid-speed copier market", "total cost of ownership"]} -{"id": "1823", "title": "Single-phase shunt active power filter with harmonic detection", "abstract": "An advanced active power filter for the compensation of instantaneous harmonic current components in nonlinear current loads is presented. A signal processing technique using an adaptive neural network algorithm is applied for the detection of harmonic components generated by nonlinear current loads and it can efficiently determine the instantaneous harmonic components in real time. 
The validity of this active filtering processing system to compensate current harmonics is substantiated by simulation results", "keyphrases": ["single-phase shunt active power filter", "harmonic detection", "instantaneous harmonic current components compensation", "nonlinear current loads", "signal processing technique", "adaptive neural network algorithm", "instantaneous harmonic components", "simulation"]} -{"id": "1866", "title": "Tracking with sensor failures", "abstract": "Studies the reliability with sensor failures of the asymptotic tracking problem for linear time invariant systems using the factorization approach. The plant is two-output and the compensator is two-degree-of-freedom. Necessary and sufficient conditions are presented for the general problem and a simple solution is given for problems with stable plants", "keyphrases": ["sensor failures", "reliability", "asymptotic tracking problem", "linear time invariant systems", "factorization approach", "two-output plant", "two-degree-of-freedom compensator", "necessary and sufficient conditions"]} -{"id": "1708", "title": "A study of hospitality and tourism information technology education and industrial applications", "abstract": "The purpose of this study was to examine the subject relevance of information technology (IT) in hospitality and tourism management programs with skills deployed in the workplace. This study aimed at investigating graduates' transition from education to employment, and to determine how well they appear to be equipped to meet the needs of the hospitality and tourism industry. One hundred and seventeen graduates responded to a mail survey. These graduates rated the importance of IT skills in the workplace, the level of IT teaching in hotel and tourism management programs, and the self-competence level in IT. 
This study concluded that a gap exists between the IT skills required at work and those acquired at university", "keyphrases": ["hospitality and tourism management programs", "education", "employment", "hospitality industry", "tourism industry", "mail survey", "graduates", "IT skills", "university", "IT teaching"]} -{"id": "1471", "title": "E-commerce-resources for doing business on the Internet", "abstract": "There are many different types of e-commerce depending upon who or what is selling and who or what is buying. In addition, e-commerce is more than an exchange of funds and goods or services, it encompasses an entire infrastructure of services, computer hardware and software products, technologies, and communications formats. The paper discusses e-commerce terminology, types and information resources, including books and Web sites", "keyphrases": ["business", "Internet", "e-commerce", "terminology", "information resources", "books", "Web sites"]} -{"id": "1770", "title": "New developments in inductive learning", "abstract": "Any intelligent system, whether natural or artificial, must have three characteristics: knowledge, reasoning, and learning. Artificial intelligence (AI) studies these three aspects in artificial systems. Briefly, we could say that knowledge refers to the system's world model, and reasoning to the manipulation of this knowledge. Learning is slightly more complex; the system interacts with the world and as a consequence it builds onto and modifies its knowledge. This process of self-building and self-modifying is known as learning. This thesis is set within the field of artificial intelligence and focuses on learning. 
More specifically, it deals with the inductive learning of decision trees", "keyphrases": ["inductive learning", "new developments", "intelligent system", "knowledge", "reasoning", "artificial intelligence", "decision trees"]} -{"id": "1735", "title": "Mid-market accounting systems", "abstract": "Welcome to our fourth annual survey of accounting systems and enterprise resource planning (ERP) systems. Last September, we concentrated on financial and distribution systems for medium-sized businesses (mid market) and included 22 products in our charts. This year, we extended the products to include manufacturing and added 34 products to the list", "keyphrases": ["mid-market accounting systems", "survey", "enterprise resource planning", "manufacturing"]} -{"id": "181", "title": "Electromagnetics computations using the MPI parallel implementation of the steepest descent fast multipole method (SDFMM)", "abstract": "The computational solution of large-scale linear systems of equations necessitates the use of fast algorithms but is also greatly enhanced by employing parallelization techniques. The objective of this work is to demonstrate the speedup achieved by the MPI (message passing interface) parallel implementation of the steepest descent fast multipole method (SDFMM). Although this algorithm has already been optimized to take advantage of the structure of the physics of scattering problems, there is still the opportunity to speed up the calculation by dividing tasks into components using multiple processors and solve them in parallel. The SDFMM has three bottlenecks ordered as (1) filling the sparse impedance matrix associated with the near-field method of moments interactions (MoM), (2) the matrix vector multiplications associated with this sparse matrix and (3) the far field interactions associated with the fast multipole method. 
The parallel implementation task is accomplished using a thirty-one node Intel Pentium Beowulf cluster and is also validated on a 4-processor Alpha workstation. The Beowulf cluster consists of thirty-one nodes of 350 MHz Intel Pentium IIs with 256 MB of RAM and one node of a 4*450 MHz Intel Pentium II Xeon shared memory processor with 2 GB of RAM with all nodes connected to a 100 BaseTX Ethernet network. The Alpha workstation has a maximum of four 667 MHz processors. Our numerical results show significant linear speedup in filling the sparse impedance matrix. Using the 32 processors on the Beowulf cluster leads to a 7.2 overall speedup while a 2.5 overall speedup is gained using the 4 processors on the Alpha workstation", "keyphrases": ["electromagnetics computations", "MPI parallel implementation", "steepest descent fast multipole method", "large-scale linear systems", "fast algorithms", "message passing interface", "physics", "multiple processors", "sparse impedance matrix", "near-field MoM", "method of moments", "scattering problems", "matrix vector multiplications", "Intel Pentium Beowulf cluster", "4-processor Alpha workstation", "Intel Pentium II", "RAM", "Xeon shared memory processor", "100 BaseTX Ethernet network", "scattered electric field", "scattered magnetic field", "350 MHz", "256 MByte", "450 MHz", "2 GByte", "667 MHz"]}
DEC developed a completely new generation of BLISSs for the VAX, PDP-10 and PDP-11, which became widely used at DEC during the 1970s and 1980s. With the creation of the Alpha architecture in the early 1990s, BLISS was extended again, in both 32- and 64-bit flavors. BLISS support for the Intel IA-32 architecture was introduced in 1995 and IA-64 support is now in progress. BLISS has a number of unusual characteristics: it is typeless, requires use of an explicit contents of operator (written as a period or 'dot'), takes an algorithmic approach to data structure definition, has no goto, is an expression language, and has an unusually rich compile-time language. This paper reviews the evolution and use of BLISS over its three decade lifetime. Emphasis is on how the language evolved to facilitate portable programming while retaining its initial highly machine-specific character. Finally, the success of its characteristics are assessed", "keyphrases": ["BLISS programming language", "machine-oriented language", "portable programming", "system implementation language", "data structure definition", "compile-time language"]} -{"id": "1591", "title": "Quadratic interpolation on spheres", "abstract": "Riemannian quadratics are C/sup 1/ curves on Riemannian manifolds, obtained by performing the quadratic recursive deCastlejeau algorithm in a Riemannian setting. They are of interest for interpolation problems in Riemannian manifolds, such as trajectory-planning for rigid body motion. 
Some interpolation properties of Riemannian quadratics are analysed when the ambient manifold is a sphere or projective space, with the usual Riemannian metrics", "keyphrases": ["quadratic interpolation", "Riemannian manifolds", "trajectory-planning", "rigid body motion", "ambient manifold", "corner-cutting", "parallel translation", "approximation theory"]} -{"id": "1628", "title": "Quasi-Newton algorithm for adaptive minor component extraction", "abstract": "An adaptive quasi-Newton algorithm is first developed to extract a single minor component corresponding to the smallest eigenvalue of a stationary sample covariance matrix. A deflation technique instead of the commonly used inflation method is then applied to extract the higher-order minor components. The algorithm enjoys the advantage of having a simpler computational complexity and a highly modular and parallel structure for efficient implementation. Simulation results are given to demonstrate the effectiveness of the proposed algorithm for extracting multiple minor components adaptively", "keyphrases": ["quasi-Newton algorithm", "adaptive minor component extraction", "eigenvalue", "stationary sample covariance matrix", "deflation technique", "higher-order minor components", "computational complexity", "modular structure", "parallel structure", "simulation results", "adaptive estimation", "DOA estimation", "ROOT-MUSIC estimator"]} -{"id": "1529", "title": "Quantized-State Systems: A DEVS-approach for continuous system simulation", "abstract": "A new class of dynamical systems, Quantized State Systems or QSS, is introduced in this paper. QSS are continuous time systems where the input trajectories are piecewise constant functions and the state variable trajectories - being themselves piecewise linear functions - are converted into piecewise constant functions via a quantization function equipped with hysteresis. 
It is shown that QSS can be exactly represented and simulated by a discrete event model, within the framework of the DEVS-approach. Further, it is shown that QSS can be used to approximate continuous systems, thus allowing their discrete-event simulation in opposition to the classical discrete-time simulation. It is also shown that in an approximating QSS, some stability properties of the original system are conserved and the solutions of the QSS go to the solutions of the original system when the quantization goes to zero", "keyphrases": ["dynamical systems", "Quantized State Systems", "continuous time systems", "piecewise constant functions", "discrete event model", "discrete-event simulation"]} -{"id": "1827", "title": "Gossip is synteny: Incomplete gossip and the syntenic distance between genomes", "abstract": "The syntenic distance between two genomes is given by the minimum number of fusions, fissions, and translocations required to transform one into the other, ignoring the order of genes within chromosomes. Computing this distance is NP-hard. In the present work, we give a tight connection between syntenic distance and the incomplete gossip problem, a novel generalization of the classical gossip problem. In this problem, there are n gossipers, each with a unique piece of initial information; they communicate by phone calls in which the two participants exchange all their information. The goal is to minimize the total number of phone calls necessary to inform each gossiper of his set of relevant gossip which he desires to learn. As an application of the connection between syntenic distance and incomplete gossip, we derive an O(2/sup O(n log n)/) algorithm to exactly compute the syntenic distance between two genomes with at most n chromosomes each. 
Our algorithm requires O(n/sup 2/+2/sup O(d log d)/) time when this distance is d, improving the O(n/sup 2/+2(O(d//sup 2/))) running time of the best previous exact algorithm", "keyphrases": ["syntenic distance", "genomes", "NP-hard", "incomplete gossip problem", "comparative genomics", "running time", "chromosomes"]} -{"id": "1862", "title": "Global comparison of stages of growth based on critical success factors", "abstract": "With increasing globalization of business, the management of IT in international organizations is faced with the complex task of dealing with the difference between local and international IT needs. This study evaluates, and compares, the level of IT maturity and the critical success factors (CSFs) in selected geographic regions, namely, Norway, Australia/New Zealand, North America, Europe, Asia/Pacific, and India. The results show that significant differences in the IT management needs in these geographic regions exist, and that the IT management operating in these regions must balance the multiple critical success factors for achieving an optimal local-global mix for business success", "keyphrases": ["business globalization", "IT management", "international IT needs", "local IT needs", "IT maturity", "critical success factors", "Norway", "Australia", "New Zealand", "North America", "Europe", "Asia/Pacific", "India", "optimal local-global mix", "business success"]} -{"id": "1749", "title": "Advanced aerostatic stability analysis of cable-stayed bridges using finite-element method", "abstract": "Based on the concept of limit point instability, an advanced nonlinear finite-element method that can be used to analyze the aerostatic stability of cable-stayed bridges is proposed. Both geometric nonlinearity and three components of wind loads are considered in this method. The example bridge is the second Santou Bay cable-stayed bridge with a main span length of 518 m built in China. 
Aerostatic stability of the example bridge is investigated using linear and proposed methods. The effect of pitch moment coefficient on the aerostatic stability of the bridge has been studied. The results show that the aerostatic instability analyses of cable-stayed bridges based on the linear method considerably overestimate the wind-resisting capacity of cable-stayed bridges. The proposed method is highly accurate and efficient. Pitch moment coefficient has a major effect on the aerostatic stability of cable-stayed bridges. Finally, the aerostatic failure mechanism of cable-stayed bridges is explained by tracing the aerostatic instability path", "keyphrases": ["limit point instability", "advanced nonlinear finite element method", "advanced aerostatic stability analysis", "cable-stayed bridges", "geometric nonlinearity", "wind loads", "Santou Bay cable-stayed bridge", "China", "pitch moment coefficient", "aerostatic failure mechanism"]} -{"id": "1611", "title": "Data mining business intelligence for competitive advantage", "abstract": "Organizations have lately realized that just processing transactions and/or information faster and more efficiently no longer provides them with a competitive advantage vis-a-vis their competitors for achieving business excellence. Information technology (IT) tools that are oriented towards knowledge processing can provide the edge that organizations need to survive and thrive in the current era of fierce competition. Enterprises are no longer satisfied with business information system(s); they require business intelligence system(s). The increasing competitive pressures and the desire to leverage information technology techniques have led many organizations to explore the benefits of new emerging technology, data warehousing and data mining. 
The paper discusses data warehouses and data mining tools and applications", "keyphrases": ["business intelligence", "competitive advantage", "organizations", "information technology", "knowledge processing", "business information system", "data warehouses", "data mining"]} -{"id": "1654", "title": "Numerical validation of solutions of complementarity problems: the nonlinear case", "abstract": "This paper proposes a validation method for solutions of nonlinear complementarity problems. The validation procedure performs a computational test. If the result of the test is positive, then it is guaranteed that a given multi-dimensional interval either includes a solution or excludes all solutions of the nonlinear complementarity problem", "keyphrases": ["numerical validation", "computational test", "nonlinear complementarity problem", "optimization"]} -{"id": "17", "title": "Fault diagnosis and fault tolerant control of linear stochastic systems with unknown inputs", "abstract": "This paper presents an integrated robust fault detection and isolation (FDI) and fault tolerant control (FTC) scheme for a fault in actuators or sensors of linear stochastic systems subjected to unknown inputs (disturbances). As usual in this kind of works, it is assumed that single fault occurs at a time and the fault treated is of random bias type. The FDI module is constructed using banks of robust two-stage Kalman filters, which simultaneously estimate the state and the fault bias, and generate residual sets decoupled from unknown disturbances. All elements of residual sets are evaluated by using a hypothesis statistical test, and the fault is declared according to the prepared decision logic. The FTC module is activated based on the fault indicator, and additive compensation signal is computed using the fault bias estimate and combined to the nominal control law for compensating the fault's effect on the system. 
Simulation results for the simplified longitudinal flight control system with parameter variations, process and measurement noises demonstrate the effectiveness of the approach proposed", "keyphrases": ["fault detection", "fault isolation", "fault tolerant control", "linear systems", "stochastic systems", "two-stage Kalman filters", "state estimation", "longitudinal flight control system", "robust control", "discrete-time system"]} -{"id": "1510", "title": "Estimation of the gradient of the solution of an adjoint diffusion equation by the Monte Carlo method", "abstract": "For the case of isotropic diffusion we consider the representation of the weighted concentration of trajectories and its space derivatives in the form of integrals (with some weights) of the solution to the corresponding boundary value problem and its directional derivative of a convective velocity. If the convective velocity at the domain boundary is degenerate and some other additional conditions are imposed this representation allows us to construct an efficient 'random walk by spheres and balls' algorithm. When these conditions are violated, transition to modelling the diffusion trajectories by the Euler scheme is realized, and the directional derivative of velocity is estimated by the dependent testing method, using the parallel modelling of two closely-spaced diffusion trajectories. We succeeded in justifying this method by statistically equivalent transition to modelling a single trajectory after the first step in the Euler scheme, using a suitable weight. This weight also admits direct differentiation with respect to the initial coordinate along a given direction. 
The resulting weight algorithm for calculating concentration derivatives is especially efficient if the initial point is in the subdomain in which the coefficients of the diffusion equation are constant", "keyphrases": ["isotropic diffusion", "weighted trajectory concentration", "space derivatives", "integrals", "boundary value problem", "directional derivative", "convective velocity", "domain boundary", "gradient estimation", "adjoint diffusion equation", "Monte Carlo method", "random walk by spheres and balls algorithm", "diffusion trajectories", "Euler scheme", "dependent testing method", "parallel modelling", "closely-spaced diffusion trajectories", "statistically equivalent transition", "weight", "direct differentiation", "initial coordinate", "concentration derivatives"]} -{"id": "1555", "title": "A note on multi-index polynomials of Dickson type and their applications in quantum optics", "abstract": "We discuss the properties of a new family of multi-index Lucas type polynomials, which are often encountered in problems of intracavity photon statistics. We develop an approach based on the integral representation method and show that this class of polynomials can be derived from recently introduced multi-index Hermite like polynomials", "keyphrases": ["Lucas type polynomials", "multi-index polynomials", "quantum optics", "intracavity photon statistics", "integral representation", "generating functions"]} -{"id": "1694", "title": "Product development: using a 3D computer model to optimize the stability of the Rocket TM powered wheelchair", "abstract": "A three-dimensional (3D) lumped-parameter model of a powered wheelchair was created to aid the development of the Rocket prototype wheelchair and to help explore the effect of innovative design features on its stability. The model was developed using simulation software, specifically Working Model 3D. 
The accuracy of the model was determined by comparing both its static stability angles and dynamic behavior as it passed down a 4.8-cm (1.9\") road curb at a heading of 45 degrees with the performance of the actual wheelchair. The model's predictions of the static stability angles in the forward, rearward, and lateral directions were within 9.3, 7.1, and 3.8% of the measured values, respectively. The average absolute error in the predicted position of the wheelchair as it moved down the curb was 2.2 cm/m (0.9\" per 3'3\") traveled. The accuracy was limited by the inability to model soft bodies, the inherent difficulties in modeling a statically indeterminate system, and the computing time. Nevertheless, it was found to be useful in investigating the effect of eight design alterations on the lateral stability of the wheelchair. Stability was quantified by determining the static lateral stability angles and the maximum height of a road curb over which the wheelchair could successfully drive on a diagonal heading. The model predicted that the stability was more dependent on the configuration of the suspension system than on the dimensions and weight distribution of the wheelchair. 
Furthermore, for the situations and design alterations studied, predicted improvements in static stability were not correlated with improvements in dynamic stability", "keyphrases": ["3D computer model", "product development", "innovative design features", "suspension system configuration", "dynamic stability improvements", "average absolute error", "predicted position", "soft bodies modeling", "statically indeterminate system", "computing time", "design alterations effect", "diagonal heading", "weight distribution", "Rocket TM powered wheelchair", "4.8 cm"]} -{"id": "1568", "title": "Natural language from artificial life", "abstract": "This article aims to show that linguistics, in particular the study of the lexico-syntactic aspects of language, provides fertile ground for artificial life modeling. A survey of the models that have been developed over the last decade and a half is presented to demonstrate that ALife techniques have a lot to offer an explanatory theory of language. It is argued that this is because much of the structure of language is determined by the interaction of three complex adaptive systems: learning, culture, and biological evolution. Computational simulation, informed by theoretical linguistics, is an appropriate response to the challenge of explaining real linguistic data in terms of the processes that underpin human language", "keyphrases": ["natural language", "linguistics", "lexico-syntactic aspects", "ALife", "adaptive systems", "learning", "culture", "biological evolution", "computational simulation", "artificial life"]} -{"id": "178", "title": "A parallelized indexing method for large-scale case-based reasoning", "abstract": "Case-based reasoning (CBR) is a problem solving methodology commonly seen in artificial intelligence. It can correctly take advantage of the situations and methods in former cases to find out suitable solutions for new problems. CBR must accurately retrieve similar prior cases for getting a good performance. 
In the past, many researchers proposed useful technologies to handle this problem. However, the performance of retrieving similar cases may be greatly influenced by the number of cases. In this paper, the performance issue of large-scale CBR is discussed and a parallelized indexing architecture is then proposed for efficiently retrieving similar cases in large-scale CBR. Several algorithms for implementing the proposed architecture are also described. Some experiments are made and the results show the efficiency of the proposed method", "keyphrases": ["parallelized indexing method", "large-scale case-based reasoning", "problem solving methodology", "artificial intelligence", "bitwise indexing", "similar prior case retrieval", "performance", "experiments"]} -{"id": "185", "title": "Property testers for dense Constraint Satisfaction programs on finite domains", "abstract": "Many NP-hard languages can be \"decided\" in subexponential time if the definition of \"decide\" is relaxed only slightly. Rubinfeld and Sudan introduced the notion of property testers, probabilistic algorithms that can decide, with high probability, if a function has a certain property or if it is far from any function having this property. Goldreich, Goldwasser, and Ron constructed property testers with constant query complexity for dense instances of a large class of graph problems. Since many graph problems can be viewed as special cases of the Constraint Satisfaction Problem on Boolean domains, it is natural to try to construct property testers for more general cases of the Constraint Satisfaction Problem. 
In this paper, we give explicit constructions of property testers using a constant number of queries for dense instances of Constraint Satisfaction Problems where the constraints have constant arity and the variables assume values in some domain of finite size", "keyphrases": ["NP-hard languages", "property testers", "probabilistic algorithms", "constant query complexity", "constraint satisfaction", "dense instances", "randomized sampling", "subexponential time", "graph problems", "Constraint Satisfaction Problem"]} -{"id": "1907", "title": "Multiple comparison methods for means", "abstract": "Multiple comparison methods (MCMs) are used to investigate differences between pairs of population means or, more generally, between subsets of population means using sample data. Although several such methods are commonly available in statistical software packages, users may be poorly informed about the appropriate method(s) to use and/or the correct way to interpret the results. This paper classifies the MCMs and presents the important methods for each class. Both simulated and real data are used to compare the methods, and emphasis is placed on a correct application and interpretation. We include suggestions for choosing the best method. Mathematica programs developed by the authors are used to compare MCMs. By taking the advantage of Mathematica's notebook structure, all interested student can use these programs to explore the subject more deeply", "keyphrases": ["multiple comparison procedures", "population means", "error rate", "single-step procedures", "step-down procedures", "sales management", "pack-age design"]} -{"id": "1595", "title": "Convergence of finite element approximations and multilevel linearization for Ginzburg-Landau model of d-wave superconductors", "abstract": "In this paper, we consider the finite element approximations of a recently proposed Ginzburg-Landau-type model for d-wave superconductors. 
In contrast to the conventional Ginzburg-Landau model, the scalar complex valued order-parameter is replaced by a multicomponent complex order-parameter and the free energy is modified according to the d-wave pairing symmetry. Convergence and optimal error estimates and some super-convergent estimates for the derivatives are derived. Furthermore, we propose a multilevel linearization procedure to solve the nonlinear systems. It is proved that the optimal error estimates and super-convergence for the derivatives are preserved by the multi-level linearization algorithm", "keyphrases": ["Ginzburg-Landau model", "d-wave", "superconductivity", "finite element method", "nonlinear systems", "error estimation", "two-grid method", "free energy", "multilevel linearization"]}
In the conventional consultations, all three people were located at the main hospital. In the teleconsultations, the doctor was located in a hospital 6 km away from the MATS and used a videoconferencing link connected at 384 kbit/s. There were 30 patients in the conventional group and 30 in the telemedical group. The presenting problems were similar in the two groups. The mean duration of teleconsultations was 951 s and the mean duration of face-to-face consultations was 247 s. In doctor-nurse communication there was a higher rate of turn taking in teleconsultations than in face-to-face consultations; there were also more interruptions, more words and more `backchannels' (e.g. `mhm', `uh-huh') per teleconsultation. In doctor-patient communication there was a higher rate of turn taking, more words, more interruptions and more backchannels per teleconsultation. In patient-nurse communication there was relatively little difference between the two modes of consulting the doctor. Telemedicine appeared to empower the patient to ask more questions of the doctor. It also seemed that the doctor took greater care in a teleconsultation to achieve coordination of beliefs with the patient than in a face-to-face consultation", "keyphrases": ["social presence", "telemedicine", "doctor", "emergency nurse practitioners", "patients", "minor accident and treatment service", "teleconsultations", "videoconferencing link", "face-to-face consultations", "doctor-nurse communication", "interruptions", "backchannels", "words", "turn taking", "patient-nurse communication", "belief coordination", "384 kbit/s", "951 s", "247 s"]}
One example of a work journal which I kept in 1998 is considered. It touches on several issues of potential interest to midlife career librarians including the challenge of technology, returning to work at midlife after raising a family, further education, professional writing, and job exchange", "keyphrases": ["work decisions", "work challenges", "job satisfaction", "self-renewal", "work journal", "change", "midlife career librarians", "technology", "further education", "professional writing", "job exchange"]} -{"id": "1731", "title": "Hit the road, Jack", "abstract": "Going freelance offers the potential of higher earnings, variety and independence - but also removes the benefits of permanent employment and can mean long distance travel and periods out of work. The author looks at the benefits and drawbacks - and how to get started as an IT contractor", "keyphrases": ["IT contractor", "freelance working"]} -{"id": "1789", "title": "Dousing terrorist funding: mission possible? [banks]", "abstract": "The government is tightening its grip on terrorist money flows. But as the banking industry continues to expand its Patriot Act compliance activities, it is with the realization that a great deal of work remains to be done before the American financial system can become truly airtight. Identification instruments, especially drivers licenses, represent a significant weak spot", "keyphrases": ["banking", "Patriot Act", "terrorist funding", "identification"]} -{"id": "1475", "title": "Relation between glare and driving performance", "abstract": "The present study investigated the effects of discomfort glare on driving behavior. Participants (old and young; US and Europeans) were exposed to a simulated low- beam light source mounted on the hood of an instrumented vehicle. Participants drove at night in actual traffic along a track consisting of urban, rural, and highway stretches. 
The results show that the relatively low glare source caused a significant drop in detecting simulated pedestrians along the roadside and made participants drive significantly slower on dark and winding roads. Older participants showed the largest drop in pedestrian detection performance and reduced their driving speed the most. The results indicate that the de Boer rating scale, the most commonly used rating scale for discomfort glare, is practically useless as a predictor of driving performance. Furthermore, the maximum US headlamp intensity (1380 cd per headlamp) appears to be an acceptable upper limit", "keyphrases": ["glare", "driving performance", "discomfort glare", "simulated low-beam light source", "road traffic", "urban road", "rural road", "highway", "deBoer rating scale"]} -{"id": "1608", "title": "A geometric process equivalent model for a multistate degenerative system", "abstract": "In this paper, a monotone process model for a one-component degenerative system with k+1 states (k failure states and one working state) is studied. We show that this model is equivalent to a geometric process (GP) model for a two-state one component system such that both systems have the same long-run average cost per unit time and the same optimal policy. Furthermore, an explicit expression for the determination of an optimal policy is derived", "keyphrases": ["multistate degenerative system", "geometric process equivalent model", "monotone process model", "one-component degenerative system", "failure states", "working state", "two-state one component system", "long-run average cost", "optimal policy", "replacement policy", "renewal reward process"]} -{"id": "1923", "title": "Predictive control of a high temperature-short time pasteurisation process", "abstract": "Modifications on the dynamic matrix control (DMC) algorithm are presented to deal with transfer functions with varying parameters in order to control a high temperature-short time pasteurisation process. 
To control processes with first order with pure time delay models whose parameters present an exogenous variable dependence, a new method of free response calculation, using multiple model information, is developed. Two methods, to cope with those nonlinear models that allow a generalised Hammerstein model description, are proposed. The proposed methods have been tested, both in simulation and in real cases, in comparison with PID and DMC classic controllers, showing important improvements on reference tracking and disturbance rejection", "keyphrases": ["high temperature-short time pasteurisation process", "predictive control", "dynamic matrix control algorithm", "transfer functions", "first order processes", "time delay models", "exogenous variable dependence", "free response calculation", "multiple model information", "nonlinear models", "generalised Hammerstein model description", "reference tracking", "disturbance rejection"]} -{"id": "1509", "title": "Mathematical modelling of the work of the system of wells in a layer with the exponential law of permeability variation and the mobile liquid interface", "abstract": "We construct and study a two-dimensional model of the work of the system of wells in a layer with the mobile boundary between liquids of various viscosity. We use a 'plunger' displacement model of liquids. The boundaries of the filtration region of these liquids are modelled by curves of the Lyapunov class. Unlike familiar work, we solve two-dimensional problems in an inhomogeneous layer when the mobile boundary and the boundaries of the filtration region are modelled by curves of the Lyapunov class. 
We show the practical convergence of the numerical solution of the problems studied", "keyphrases": ["2D model", "work", "well system", "mathematical modelling", "exponential law", "permeability variation", "mobile liquid interface", "mobile boundary", "viscosity", "plunger displacement model", "filtration region boundaries", "Lyapunov class curves", "inhomogeneous layer", "convergence", "numerical solution"]} -{"id": "1886", "title": "Non-asymptotic confidence ellipsoids for the least-squares estimate", "abstract": "We consider the finite sample properties of least-squares system identification, and derive non-asymptotic confidence ellipsoids for the estimate. The shape of the confidence ellipsoids is similar to the shape of the ellipsoids derived using asymptotic theory, but unlike asymptotic theory, they are valid for a finite number of data points. The probability that the estimate belongs to a certain ellipsoid has a natural dependence on the volume of the ellipsoid, the data generating mechanism, the model order and the number of data points available", "keyphrases": ["nonasymptotic confidence ellipsoids", "least-squares estimate", "finite sample properties", "least-squares system identification", "probability", "data generating mechanism", "model order", "data points"]} -{"id": "1750", "title": "A dynamic method for weighted linear least squares problems", "abstract": "A new method for solving the weighted linear least squares problems with full rank is proposed. Based on the theory of Liapunov's stability, the method associates a dynamic system with a weighted linear least squares problem, whose solution we are interested in and integrates the former numerically by an A-stable numerical method. 
The numerical tests suggest that the new method is more than competitive with current conventional techniques based on the normal equations", "keyphrases": ["dynamic method", "weighted linear least squares problems", "Lyapunov stability", "A-stable numerical method"]} -{"id": "1715", "title": "Information-processing and computing systems at thermal power stations in China", "abstract": "The development and commissioning of information-processing and computing systems (IPCSs) at four power units, each of 500 MW capacity at the thermal power stations Tszisyan' and Imin' in China, are considered. The functional structure and the characteristics of the functions of the IPCSs are presented as is information on the technology of development and experience in adjustments. Ways of using the experience gained in creating a comprehensive functional firmware system are shown", "keyphrases": ["China", "thermal power stations", "information-processing systems", "computing systems", "commissioning", "development", "functional structure", "functions characteristics", "firmware system", "500 MW"]} -{"id": "1728", "title": "A characterization of generalized Pareto distributions by progressive censoring schemes and goodness-of-fit tests", "abstract": "In this paper we generalize a characterization property of generalized Pareto distributions, which is known for ordinary order statistics, to arbitrary schemes of progressive type-II censored order statistics. 
Various goodness-of-fit tests for generalized Pareto distributions based on progressively censored data statistics are discussed", "keyphrases": ["generalized Pareto distributions", "progressive censoring schemes", "goodness-of-fit tests", "progressive type-II censored order statistics", "ordinary order statistics"]} -{"id": "1803", "title": "Linear complexity of polyphase power residue sequences", "abstract": "The well known family of binary Legendre or quadratic residue sequences can be generalised to the multiple-valued case by employing a polyphase representation. These p-phase sequences, with p prime, also have prime length L, and can be constructed from the index sequence of length L or, equivalently, from the cosets of pth power residues and non-residues modulo-L. The linear complexity of these polyphase sequences is derived and shown to fall into four classes depending on the value assigned to b/sub 0/, the initial digit of the sequence, and on whether p belongs to the set of pth power residues or not. The characteristic polynomials of the linear feedback shift registers that generate these sequences are also derived", "keyphrases": ["linear complexity", "polyphase power residue sequences", "binary Legendre sequences", "quadratic residue sequences", "multiple-valued case", "p-phase sequences", "polynomials", "linear feedback shift registers", "cryptographic applications", "key stream ciphers", "binary sequences"]} -{"id": "1491", "title": "Evaluation of videoconferenced grand rounds", "abstract": "We evaluated various aspects of grand rounds videoconferenced from a tertiary care hospital to a regional hospital in Nova Scotia. During a five-month study period, 29 rounds were broadcast (19 in medicine and 10 in cardiology). The total recorded attendance at the remote site was 103, comprising 70 specialists, nine family physicians and 24 other health-care professionals. We received 55 evaluations, a response rate of 53%. 
On a five-point Likert scale (on which higher scores indicated better quality), mean ratings by remote-site participants of the technical quality of the videoconference were 3.0-3.5, with the lowest ratings being for ability to hear the discussion (3.0) and to see visual aids (3.1). Mean ratings for content, presentation, discussion and educational value were 3.8 or higher. Of the 49 physicians who presented the rounds, we received evaluations from 41, a response rate of 84%. The presenters rated all aspects of the videoconference and interaction with remote sites at 3.8 or lower. The lowest ratings were for ability to see the remote sites (3.0) and the usefulness of the discussion (3.4). We received 278 evaluations from participants at the presenting site, an estimated response rate of about 55%. The results indicated no adverse opinions of the effect of videoconferencing (mean scores 3.1-3.3). The estimated costs of videoconferencing one grand round to one site and four sites were C$723 and C$1515, respectively. The study confirmed that videoconferenced rounds can provide satisfactory continuing medical education to community specialists, which is an especially important consideration as maintenance of certification becomes mandatory", "keyphrases": ["videoconferenced grand rounds", "tertiary care hospital", "regional hospital", "telemedicine", "cardiology", "health-care professionals", "five-point Likert scale", "remote sites", "continuing medical education", "certification"]} -{"id": "1846", "title": "Semantic B2B integration: issues in ontology-based approaches", "abstract": "Solving queries to support e-commerce transactions can involve retrieving and integrating information from multiple information resources. Often, users don't care which resources are used to answer their query. In such situations, the ideal solution would be to hide from the user the details of the resources involved in solving a particular query. 
An example would be providing seamless access to a set of heterogeneous electronic product catalogues. There are many problems that must be addressed before such a solution can be provided. In this paper, we discuss a number of these problems, indicate how we have addressed these and go on to describe the proof-of-concept demonstration system we have developed", "keyphrases": ["e-commerce transactions", "queries", "information integration", "information retrieval", "multiple information resources", "heterogeneous electronic product catalogues", "ontology-based approaches", "semantic B2B integration"]} -{"id": "1790", "title": "Copyright of electronic publishing", "abstract": "With the spreading of the Internet and the wide use of computers, electronic publishing is becoming an indispensable measure to gain knowledge and skills. Meanwhile, copyright is facing much more infringement than ever in this electronic environment. So, it is a key factor to effectively protect copyright of electronic publishing to foster the new publication fields. The paper analyzes the importance of copyright, the main causes for copyright infringement in electronic publishing, and presents viewpoints on the definition and application of fair use of a copyrighted work and thinking of some means to combat breach of copyright", "keyphrases": ["electronic publishing copyright", "Internet", "copyright infringement", "electronic environment", "copyright protection", "fair use", "copyrighted work"]} -{"id": "1534", "title": "Generic simulation approach for multi-axis machining. Part 1: modeling methodology", "abstract": "This paper presents a new methodology for analytically simulating multi-axis machining of complex sculptured surfaces. A generalized approach is developed for representing an arbitrary cutting edge design, and the local surface topology of a complex sculptured surface. A NURBS curve is used to represent the cutting edge profile. 
This approach offers the advantages of representing any arbitrary cutting edge design in a generic way, as well as providing standardized techniques for manipulating the location and orientation of the cutting edge. The local surface topology of the part is defined as those surfaces generated by previous tool paths in the vicinity of the current tool position. The local surface topology of the part is represented without using a computationally expensive CAD system. A systematic prediction technique is then developed to determine the instantaneous tool/part interaction during machining. The methodology employed here determines the cutting edge in-cut segments by determining the intersection between the NURBS curve representation of the cutting edge and the defined local surface topology. These in-cut segments are then utilized for predicting instantaneous chip load, static and dynamic cutting forces, and tool deflection. Part 1 of this paper details the modeling methodology and demonstrates the capabilities of the simulation for machining a complex surface", "keyphrases": ["multiple axis machining", "generic modeling", "tool path specification", "complex surface machining", "complex sculptured surfaces", "systematic prediction", "cutting edge profile", "surface topology", "NURBS curve"]} -{"id": "1571", "title": "The simulated emergence of distributed environmental control in evolving microcosms", "abstract": "This work continues investigation into Gaia theory (Lovelock, The ages of Gaia, Oxford University Press, 1995) from an artificial life perspective (Downing, Proceedings of the 7th International Conference on Artificial Life, p. 90-99, MIT Press, 2000), with the aim of assessing the general compatibility of emergent distributed environmental control with conventional natural selection. 
Our earlier system, GUILD (Downing and Zvirinsky, Artificial Life, 5, p.291-318, 1999), displayed emergent regulation of the chemical environment by a population of metabolizing agents, but the chemical model underlying those results was trivial, essentially admitting all possible reactions at a single energy cost. The new model, METAMIC, utilizes abstract chemistries that are both (a) constrained to a small set of legal reactions, and (b) grounded in basic fundamental relationships between energy, entropy, and biomass synthesis/breakdown. To explore the general phenomena of emergent homeostasis, we generate 100 different chemistries and use each as the basis for several METAMIC runs, as part of a Gaia hunt. This search discovers 20 chemistries that support microbial populations capable of regulating a physical environmental factor within their growth-optimal range, despite the extra metabolic cost. Case studies from the Gaia hunt illustrate a few simple mechanisms by which real biota might exploit the underlying chemistry to achieve some control over their physical environment. Although these results shed little light on the question of Gaia on Earth, they support the possibility of emergent environmental control at the microcosmic level", "keyphrases": ["simulated emergence", "evolving microcosms", "natural selection", "GUILD system", "metabolizing agents", "chemical model", "METAMIC model", "emergent homeostasis", "Gaia hunt", "genetic algorithms", "artificial chemistry", "artificial metabolisms", "Gaia theory", "artificial life", "emergent distributed environmental control"]} -{"id": "161", "title": "Electronic books: reports of their death have been exaggerated", "abstract": "E-books will survive, but not in the consumer market - at least not until reading devices become much cheaper and much better in quality (which is not likely to happen soon). 
Library Journal's review of major events of the year 2001 noted that two requirements for the success of E-books were development of a sustainable business model and development of better reading devices. The E-book revolution has therefore become more of an evolution. We can look forward to further developments and advances in the future", "keyphrases": ["electronic books", "E-books", "Library Journal"]} -{"id": "1635", "title": "Simple...But complex", "abstract": "FlexPro 5.0, from Weisang and Co., is one of those products which aim to serve an often ignored range of data users: those who, in FlexPro's words, are interested in documenting, analysing and archiving data in the simplest way possible. The online help system is clearly designed to promote the product in this market segment, with a very clear introduction from first principles and a hands-on tutorial, and the live project to which it was applied was selected with this in mind", "keyphrases": ["FlexPro 5.0", "data archiving", "data analysis", "data documentation", "online help system", "hands-on tutorial"]} -{"id": "1670", "title": "An integrated optimization model for train crew management", "abstract": "Train crew management involves the development of a duty timetable for each of the drivers (crew) to cover a given train timetable in a rail transport organization. This duty timetable is spread over a certain period, known as the roster planning horizon. Train crew management may arise either from the planning stage, when the total number of crew and crew distributions are to be determined, or from the operating stage when the number of crew at each depot is known as input data. In this paper, we are interested in train crew management in the planning stage. In the literature, train crew management is decomposed into two stages: crew scheduling and crew rostering which are solved sequentially. We propose an integrated optimization model to solve both crew scheduling and crew rostering. 
The model enables us to generate either cyclic rosters or non-cyclic rosters. Numerical experiments are carried out over data sets arising from a practical application", "keyphrases": ["integrated optimization model", "train crew management", "duty timetable", "rail transport organization", "roster planning horizon", "crew scheduling", "crew rostering", "cyclic rosters", "noncyclic rosters", "integer programming"]} -{"id": "1792", "title": "Database technology in digital libraries", "abstract": "Database technology advancements have provided many opportunities for libraries. These advancements can bring the world closer together through information accessibility. Digital library projects have been established worldwide to, ultimately, fulfil the needs of end users through more efficiency and convenience. Resource sharing will continue to be the trend for libraries. Changes often create issues which need to be addressed. Issues relating to database technology and digital libraries are reviewed. Some of the major challenges in digital libraries and managerial issues are identified as well", "keyphrases": ["database technology", "digital libraries", "information accessibility", "digital library projects", "end users", "resource sharing", "managerial issues", "data quality", "interoperability", "metadata", "user interface", "query processing"]} -{"id": "1801", "title": "Least load dispatching algorithm for parallel Web server nodes", "abstract": "A least load dispatching algorithm for distributing requests to parallel Web server nodes is described. In this algorithm, the load offered to a node by a request is estimated based on the expected transfer time of the corresponding reply through the Internet. This loading information is then used by the algorithm to identify the least load node of the Web site. By using this algorithm, each request will always be sent for service at the earliest possible time. 
Performance comparison using NASA and ClarkNet access logs between the proposed algorithm and commonly used dispatching algorithms is performed. The results show that the proposed algorithm gives 10% higher throughput than that of the commonly used random and round-robin dispatching algorithms", "keyphrases": ["least load dispatching algorithm", "parallel Web server nodes", "Internet", "transfer time", "NASA access logs", "ClarkNet access logs", "throughput", "round-robin dispatching algorithms", "random dispatching algorithms", "World Wide Web server"]} -{"id": "1493", "title": "Research into telehealth applications in speech-language pathology", "abstract": "A literature review was conducted to investigate the extent to which telehealth has been researched within the domain of speech-language pathology and the outcomes of this research. A total of 13 studies were identified. Three early studies demonstrated that telehealth was feasible, although there was no discussion of the cost-effectiveness of this process in terms of patient outcomes. The majority of the subsequent studies indicated positive or encouraging outcomes resulting from telehealth. However, there were a number of shortcomings in the research, including a lack of cost-benefit information, failure to evaluate the technology itself, an absence of studies of the educational and informational aspects of telehealth in relation to speech-language pathology, and the use of telehealth in a limited range of communication disorders. 
Future research into the application of telehealth to speech-language pathology services must adopt a scientific approach, and have a well defined development and evaluation framework that addresses the effectiveness of the technique, patient outcomes and satisfaction, and the cost-benefit relationship", "keyphrases": ["telehealth applications", "speech-language pathology", "literature review", "telemedicine", "cost-effectiveness", "patient outcomes", "cost-benefit analysis", "communication disorders", "patient satisfaction"]} -{"id": "1844", "title": "A multi-agent system infrastructure for software component marketplace: an ontological perspective", "abstract": "In this paper, we introduce a multi-agent system architecture and an implemented prototype for a software component marketplace. We emphasize the ontological perspective by discussing ontology modeling for the component marketplace, UML extensions for ontology modeling, and the idea of ontology transfer which makes the multi-agent system adapt itself to dynamically changing ontologies", "keyphrases": ["multi-agent system architecture", "software component marketplace", "ontology modeling", "UML extensions", "ontology transfer", "dynamically changing ontologies", "adaptation"]} -{"id": "1637", "title": "What's best practice for open access?", "abstract": "The business of publishing journals is in transition. Nobody knows exactly how it will work in the future, but everybody knows that the electronic publishing revolution will ensure it won't work as it does now. This knowledge has provoked a growing sense of nervous anticipation among those concerned, some edgy and threatened by potential changes to their business, others excited by the prospect of change and opportunity. 
The paper discusses the open publishing model for dissemination of research", "keyphrases": ["open access", "journal publishing", "electronic publishing", "business", "open publishing model", "research dissemination"]} -{"id": "1672", "title": "Two issues in setting call centre staffing levels", "abstract": "Motivated by a problem facing the Police Communication Centre in Auckland, New Zealand, we consider the setting of staffing levels in a call centre with priority customers. The choice of staffing level over any particular time period (e.g., Monday from 8 am-9 am) relies on accurate arrival rate information. The usual method for identifying the arrival rate based on historical data can, in some cases, lead to considerable errors in performance estimates for a given staffing level. We explain why, identify three potential causes of the difficulty, and describe a method for detecting and addressing such a problem", "keyphrases": ["call centre staffing levels", "police communication centre", "Auckland", "New Zealand", "priority customers", "arrival rate information", "performance estimates", "forecast error", "nonstationarity", "conditional Poisson process"]} -{"id": "1536", "title": "Connection management for QoS service on the Web", "abstract": "The current Web service model treats all requests equivalently, both while being processed by servers and while being transmitted over the network. For some uses, such as multiple priority schemes, different levels of service are desirable. We propose application-level TCP connection management mechanisms for Web servers to provide two different levels of Web service, high and low service, by setting different time-outs for inactive TCP connections. We evaluated the performance of the mechanism under heavy and light loading conditions on the Web server. Our experiments show that, though heavy traffic saturates the network, high level class performance is improved by as much as 25-28%. 
Therefore, this mechanism can effectively provide QoS guaranteed services even in the absence of operating system and network supports", "keyphrases": ["connection management", "Web service model", "Internet", "TCP connections", "time-outs", "quality of service", "telecommunication traffic", "client server system", "Web transaction"]} -{"id": "163", "title": "Boolean operators and the naive end-user: moving to AND", "abstract": "Since so few end-users make use of Boolean searching, it is obvious that any effective solution needs to take this reality into account. The most important aspect of a technical solution should be that it does not require any effort on the part of users. What is clearly needed is for search engine designers and programmers to take account of the information-seeking behavior of Internet users. Users must be able to enter a series of words at random and have those words automatically treated as a carefully constructed Boolean AND search statement", "keyphrases": ["Boolean operators", "AND operator", "Boolean searching", "search engine design", "information-seeking behavior", "Internet"]} -{"id": "1921", "title": "An ACL for a dynamic system of agents", "abstract": "In this article we present the design of an ACL for a dynamic system of agents. The ACL includes a set of conversation performatives extended with operations to register, create, and terminate agents. The main design goal at the agent-level is to provide only knowledge-level primitives that are well integrated with the dynamic nature of the system. This goal has been achieved by defining an anonymous interaction protocol which enables agents to request and supply knowledge without considering symbol-level issues concerning management of agent names, routing, and agent reachability. 
This anonymous interaction protocol exploits a distributed facilitator schema which is hidden at the agent-level and provides mechanisms for registering capabilities of agents and delivering requests according to the competence of agents. We present a formal specification of the ACL and of the underlying architecture, exploiting an algebra of actors, and illustrate it with the help of a graphical notation. This approach provides the basis for discussing dynamic primitives in ACL and for studying properties of dynamic multi agent systems, for example concerning the behavior of agents and the correctness of their conversation policies", "keyphrases": ["ACL", "dynamic system of agents", "system of agents", "agents", "Agent Communication Languages", "dynamic system", "distributed facilitator", "actors", "anonymous interaction protocol"]} -{"id": "1879", "title": "On the distribution of Lachlan nonsplitting bases", "abstract": "We say that a computably enumerable (c.e.) degree b is a Lachlan nonsplitting base (LNB), if there is a computably enumerable degree a such that a>b, and for any c.e. degrees w, vor=2, and the function max({x/sub 1/,...,x/sub n/} intersection A) is partial recursive, it is easily seen that A is recursive. In this paper, we weaken this hypothesis in various ways (and similarly for \"min\" in place of \"max\") and investigate what effect this has on the complexity of A. We discover a sharp contrast between retraceable and co-retraceable sets, and we characterize sets which are the union of a recursive set and a co-r.e., retraceable set. Most of our proofs are noneffective. 
Several open questions are raised", "keyphrases": ["min limiters", "max limiters", "complexity", "retraceable sets", "recursive set"]} -{"id": "1716", "title": "The vibration reliability of poppet and contoured actuator valves", "abstract": "The problem of selecting the shape of the actuator valve (the final control valve) itself is discussed; the solution to this problem will permit appreciable dynamic loads to be eliminated from the moving elements of the steam distribution system of steam turbines under all operating conditions", "keyphrases": ["actuator valve shape selection", "contoured actuator valves", "poppet actuator valves", "dynamic loads elimination", "moving elements", "steam distribution system", "steam turbines", "vibration reliability"]} -{"id": "1753", "title": "Risk theory with a nonlinear dividend barrier", "abstract": "In the framework of classical risk theory we investigate a surplus process in the presence of a nonlinear dividend barrier and derive equations for two characteristics of such a process, the probability of survival and the expected sum of discounted dividend payments. Number-theoretic solution techniques are developed for approximating these quantities and numerical illustrations are given for exponential claim sizes and a parabolic dividend barrier", "keyphrases": ["risk theory", "nonlinear dividend barrier", "surplus process", "probability of survival", "discounted dividend payments", "number-theoretic solution", "numerical illustrations", "exponential claim sizes", "parabolic dividend barrier"]} -{"id": "1885", "title": "Analysis of nonlinear time-delay systems using modules over non-commutative rings", "abstract": "The theory of non-commutative rings is introduced to provide a basis for the study of nonlinear control systems with time delays. The left Ore ring of non-commutative polynomials defined over the field of a meromorphic function is suggested as the framework for such a study. 
This approach is then generalized to a broader class of nonlinear systems with delays that are called generalized Roesser systems. Finally, the theory is applied to analyze nonlinear time-delay systems. A weak observability is defined and characterized, generalizing the well-known linear result. Properties of closed submodules are then developed to obtain a result on the accessibility of such systems", "keyphrases": ["nonlinear time-delay systems", "modules", "noncommutative rings", "nonlinear control systems", "left Ore ring", "noncommutative polynomials", "meromorphic function", "generalized Roesser systems", "weak observability"]} -{"id": "1920", "title": "To commit or not to commit: modeling agent conversations for action", "abstract": "Conversations are sequences of messages exchanged among interacting agents. For conversations to be meaningful, agents ought to follow commonly known specifications limiting the types of messages that can be exchanged at any point in the conversation. These specifications are usually implemented using conversation policies (which are rules of inference) or conversation protocols (which are predefined conversation templates). In this article we present a semantic model for specifying conversations using conversation policies. This model is based on the principles that the negotiation and uptake of shared social commitments entail the adoption of obligations to action, which indicate the actions that agents have agreed to perform. In the same way, obligations are retracted based on the negotiation to discharge their corresponding shared social commitments. Based on these principles, conversations are specified as interaction specifications that model the ideal sequencing of agent participations negotiating the execution of actions in a joint activity. 
These specifications not only specify the adoption and discharge of shared commitments and obligations during an activity, but also indicate the commitments and obligations that are required (as preconditions) or that outlive a joint activity (as postconditions). We model the Contract Net Protocol as an example of the specification of conversations in a joint activity", "keyphrases": ["interacting agents", "specifications", "rules of inference", "conversation protocols", "autonomous agents", "social commitments", "speech acts", "software agents", "conversation templates"]} -{"id": "1673", "title": "Mission planning for regional surveillance", "abstract": "The regional surveillance problem discussed involves formulating a flight route for an aircraft to scan a given geographical region. Aerial surveillance is conducted using a synthetic aperture radar device mounted on the aircraft to compose a complete, high-resolution image of the region. Two models for determining an optimised flight route are described, the first employing integer programming and the second, genetic algorithms. A comparison of the solution optimality in terms of the total distance travelled, and model efficiency of the two techniques in terms of their required CPU times, is made in order to identify the conditions under which it is appropriate to apply each model", "keyphrases": ["mission planning", "regional surveillance", "flight route", "geographical region scanning", "aerial surveillance", "synthetic aperture radar device", "high-resolution image", "optimised flight route", "integer programming", "genetic algorithms", "solution optimality", "total distance travelled"]} -{"id": "1636", "title": "SPARC ignites scholarly publishing", "abstract": "During the past several years, initiatives which bring together librarians, researchers, university administrators and independent publishers have re-invigorated the scholarly publishing marketplace. 
These initiatives take advantage of electronic technology and show great potential for restoring science to scientists. The author outlines SPARC (the Scholarly Publishing and Academic Resources Coalition), an initiative to make scientific journals more accessible", "keyphrases": ["electronic publishing", "initiative", "scientific journal access", "SPARC", "Scholarly Publishing and Academic Resources Coalition"]} -{"id": "1572", "title": "Ant colony optimization and stochastic gradient descent", "abstract": "We study the relationship between the two techniques known as ant colony optimization (ACO) and stochastic gradient descent. More precisely, we show that some empirical ACO algorithms approximate stochastic gradient descent in the space of pheromones, and we propose an implementation of stochastic gradient descent that belongs to the family of ACO algorithms. We then use this insight to explore the mutual contributions of the two techniques", "keyphrases": ["ant colony optimization", "stochastic gradient descent", "empirical ACO algorithms", "pheromones", "combinatorial optimization", "heuristic", "reinforcement learning", "social insects", "swarm intelligence", "artificial life", "local search algorithms"]} -{"id": "1537", "title": "Technology on social issues of videoconferencing on the Internet: a survey", "abstract": "Constant advances in audio/video compression, the development of the multicast protocol as well as fast improvement in computing devices (e.g. higher speed, larger memory) have set forth the opportunity to have resource demanding videoconferencing (VC) sessions on the Internet. Multicast is supported by the multicast backbone (Mbone), which is a special portion of the Internet where this protocol is being deployed. Mbone VC tools are steadily emerging and the user population is growing fast. VC is a fascinating application that has the potential to greatly impact the way we remotely communicate and work. 
Yet, the adoption of VC is not as fast as one could have predicted. Hence, it is important to examine the factors that affect a widespread adoption of VC. This paper examines the enabling technology and the social issues. It discusses the achievements and identifies the future challenges. It suggests an integration of many emerging multimedia tools into VC in order to enhance its versatility for more effectiveness", "keyphrases": ["videoconferencing", "Internet", "multicast protocol", "multicast backbone", "Mbone", "multimedia", "social issues", "data compression"]} -{"id": "162", "title": "International news sites in English", "abstract": "Web access to news sites all over the world allows us the opportunity to have an electronic news stand readily available and stocked with a variety of foreign (to us) news sites. A large number of currently available foreign sites are English-language publications or English language versions of non-North American sites. These sites are quite varied in terms of quality, coverage, and style. Finding them can present a challenge. Using them effectively requires critical-thinking skills that are a part of media awareness or digital literacy", "keyphrases": ["Web access", "international news sites", "English-language publications", "non North American sites", "critical-thinking skills", "media awareness", "digital literacy"]} -{"id": "1793", "title": "The paradigm of viral communication", "abstract": "The IIW Institute of Information Management (www.IIW.de) is dealing with commercial applications of digital technologies, such as the Internet, digital printing, and many more. A study which has been carried out by the institute, identifies viral messages as a new paradigm of communication, mostly found in the area of Direct Marketing, and - who wonders - mainly within the USA. 
Viral messages underlie certain principles: (1) prospects and customers of the idea are offered a technology platform providing a possibility to send a message to a majority of persons; (2) there is an emotional or pecuniary incentive to participate. Ideally, niches of needs and market vacua are filled with funny ideas; (3) also, the recipients are facing emotional or pecuniary incentives to contact a majority of further recipients - this induces a snowball effect and the message is spread virally; and (4) the customer is activated as an \"ambassador\" of the piece of information, for instance promoting a product or a company. It is evident that there has been a long lasting history of what we call \"word-of-mouth\" ever since, however bundles of digital technologies empower the viral communication paradigm", "keyphrases": ["viral communication paradigm", "commercial applications", "viral messages", "e-mails", "Internet", "direct marketing", "business", "computer virus"]} -{"id": "1845", "title": "Business data management for business-to-business electronic commerce", "abstract": "Business-to-business electronic commerce (B2B EC) opens up new possibilities for trade. For example, new business partners from around the globe can be found, their offers can be compared, even complex negotiations can be conducted electronically, and a contract can be drawn up and fulfilled via an electronic marketplace. However, sophisticated data management is required to provide such facilities. In this paper, the results of a multi-national project on creating a business-to-business electronic marketplace for small and medium-sized enterprises are presented. Tools for information discovery, protocol-based negotiations, and monitored contract enactment are provided and based on a business data repository. The repository integrates heterogeneous business data with business communication. 
Specific problems such as multilingual nature, data ownership, and traceability of contracts and related negotiations are addressed and it is shown that the present approach provides efficient business data management for B2B EC", "keyphrases": ["business-to-business electronic commerce", "business data management", "electronic marketplace", "small and medium-sized enterprises", "multi-national project", "information discovery", "protocol-based negotiations", "monitored contract enactment", "business data repository", "heterogeneous business data", "business communication", "data ownership", "multilingual system", "traceability"]} -{"id": "1800", "title": "Multi-output regression using a locally regularised orthogonal least-squares algorithm", "abstract": "The paper considers data modelling using multi-output regression models. A locally regularised orthogonal least-squares (LROLS) algorithm is proposed for constructing sparse multi-output regression models that generalise well. By associating each regressor in the regression model with an individual regularisation parameter, the ability of the multi-output orthogonal least-squares (OLS) model selection to produce a parsimonious model with a good generalisation performance is greatly enhanced", "keyphrases": ["multi-output regression models", "locally regularised orthogonal least-squares algorithm", "data modelling", "sparse multi-output regression models", "parsimonious model", "nonlinear system modelling", "LROLS algorithm"]} -{"id": "1492", "title": "A systematic review of the efficacy of telemedicine for making diagnostic and management decisions", "abstract": "We conducted a systematic review of the literature to evaluate the efficacy of telemedicine for making diagnostic and management decisions in three classes of application: office/hospital-based, store-and-forward, and home-based telemedicine. 
We searched the MEDLINE, EMBASE, CINAHL and HealthSTAR databases and printed resources, and interviewed investigators in the field. We excluded studies where the service did not historically require face-to-face encounters (e.g. radiology or pathology diagnosis). A total of 58 articles met the inclusion criteria. The articles were summarized and graded for the quality and direction of the evidence. There were very few high-quality studies. The strongest evidence for the efficacy of telemedicine for diagnostic and management decisions came from the specialties of psychiatry and dermatology. There was also reasonable evidence that general medical history and physical examinations performed via telemedicine had relatively good sensitivity and specificity. Other specialties in which some evidence for efficacy existed were cardiology and certain areas of ophthalmology. Despite the widespread use of telemedicine in most major medical specialties, there is strong evidence in only a few of them that the diagnostic and management decisions provided by telemedicine are comparable to face-to-face care", "keyphrases": ["telemedicine", "medical diagnosis", "management decision making", "literature review", "MEDLINE", "EMBASE", "CINAHL", "HealthSTAR", "psychiatry", "dermatology", "cardiology", "ophthalmology"]} -{"id": "1556", "title": "Regularity of some 'incomplete' Pal-type interpolation problems", "abstract": "In this paper the regularity of nine Pal-type interpolation problems is proved. In the literature interpolation on the zeros of the pair W/sub n//sup ( alpha )/(z) = (z + alpha )/sup n/ + (1 + alpha z)/sup n/, v/sub n//sup ( alpha )/(z) = (z + alpha )/sup n/ - (1 + alpha z)/sup n/ with 0 < alpha < 1 has been studied. 
Here the nodes form a subset of these sets of zeros", "keyphrases": ["Pal-type interpolation problems", "zeros"]} -{"id": "1513", "title": "Solution of the reconstruction problem of a source function in the coagulation-fragmentation equation", "abstract": "We study the problem of reconstructing a source function in the kinetic coagulation-fragmentation equation. The study is based on optimal control methods, the solvability theory of operator equations, and the use of iteration algorithms", "keyphrases": ["source function reconstruction", "kinetic coagulation-fragmentation equation", "optimal control methods", "solvability", "operator equations", "iteration algorithms"]} -{"id": "1657", "title": "Breaking the myths of rewards: an exploratory study of attitudes about knowledge sharing", "abstract": "Many CEO and managers understand the importance of knowledge sharing among their employees and are eager to introduce the knowledge management paradigm in their organizations. However little is known about the determinants of the individual's knowledge sharing behavior. The purpose of this study is to develop an understanding of the factors affecting the individual's knowledge sharing behavior in the organizational context. The research model includes various constructs based on social exchange theory, self-efficacy, and theory of reasoned action. Research results from the field survey of 467 employees of four large, public organizations show that expected associations and contribution are the major determinants of the individual's attitude toward knowledge sharing. Expected rewards, believed by many to be the most important motivating factor for knowledge sharing, are not significantly related to the attitude toward knowledge sharing. 
As expected, positive attitude toward knowledge sharing is found to lead to positive intention to share knowledge and, finally, to actual knowledge sharing behaviors", "keyphrases": ["knowledge sharing", "knowledge management", "social exchange theory", "self-efficacy", "theory of reasoned action", "public organizations", "rewards", "strategic management"]} -{"id": "1861", "title": "Technology in distance education: a global perspective to alternative delivery mechanisms", "abstract": "Technology is providing a positive impact on delivery mechanisms employed in distance education at the university level. Some institutions are incorporating distance education as a way to extend the classroom. Other institutions are investigating new delivery mechanisms, which support a revised perspective on education. These latter institutions are revising their processes for interacting with students, and taking a more \"learner centered\" approach to the delivery of education. This article discusses the impact of technology on the delivery mechanisms employed in distance education. A framework is proposed here, which presents a description of alternative modes of generic delivery mechanisms. It is suggested that those institutions, which adopt a delivery mechanism employing an asynchronous mode, can gain the most benefit from technology. This approach seems to represent the only truly innovative use of technology in distance education. 
The approach creates a student-oriented environment while maintaining high levels of interaction, both of which are factors that contribute to student satisfaction with their overall educational experience", "keyphrases": ["distance education", "educational technology", "university education", "learner centered approach", "student satisfaction", "global perspective", "asynchronous mode"]} -{"id": "1824", "title": "Parallel operation of capacity-limited three-phase four-wire active power filters", "abstract": "Three-phase four-wire active power filters (APFs) are presented that can be paralleled to enlarge the system capacity and reliability. The APF employs the PWM four-leg voltage-source inverter. A decoupling control approach for the leg connected to the neutral line is proposed such that the switching of all legs has no interaction. Functions of the proposed APF include compensation of reactive power, harmonic current, unbalanced power and zero-sequence current of the load. The objective is to achieve unity power factor, balanced line current and zero neutral-line current. Compensation of all components is capacity-limited, co-operating with the cascaded load current sensing scheme. Multiple APFs can be paralleled to share the load power without requiring any control interconnection. In addition to providing the theoretic bases and detailed design of the APFs, two 6 kVA APFs are implemented. 
The effectiveness of the proposed method is validated with experimental results", "keyphrases": ["capacity-limited three-phase four-wire active power filters", "parallel operation", "PWM four-leg voltage-source inverter", "decoupling control approach", "leg switching", "control design", "reactive power compensation", "harmonic current compensation", "unbalanced power compensation", "zero-sequence load current compensation", "unity power factor", "balanced line current", "zero neutral-line current", "load power sharing", "control performance", "6 kVA"]} -{"id": "1476", "title": "The perceived utility of human and automated aids in a visual detection task", "abstract": "Although increases in the use of automation have occurred across society, research has found that human operators often underutilize (disuse) and overly rely on (misuse) automated aids (Parasuraman-Riley (1997)). Nearly 275 Cameron University students participated in 1 of 3 experiments performed to examine the effects of perceived utility (Dzindolet et al. (2001)) on automation use in a visual detection task and to compare reliance on automated aids with reliance on humans. Results revealed a bias for human operators to rely on themselves. Although self-report data indicate a bias toward automated aids over human aids, performance data revealed that participants were more likely to disuse automated aids than to disuse human aids. This discrepancy was accounted for by assuming human operators have a \"perfect automation\" schema. 
Actual or potential applications of this research include the design of future automated decision aids and training procedures for operators relying on such aids", "keyphrases": ["automated aids", "visual detection task", "human operators", "automated decision aids", "social process", "automation"]} -{"id": "1732", "title": "Community spirit", "abstract": "IT companies that contribute volunteers, resources or funding to charities and local groups not only make a real difference to their communities but also add value to their businesses. So says a new coalition of IT industry bodies formed to raise awareness of the options for community involvement, promote the business case, and publicise examples of best practice. The BCS, Intellect (formed from the merger of the Computing Services and Software Association and the Federation of the Electronics Industry) and the Worshipful Company of Information Technologists plan to run advisory seminars and provide guidelines on how companies of all sizes can transform their local communities using their specialist IT skills and resources while reaping business benefits", "keyphrases": ["IT companies", "volunteer staff", "resource contribution", "charity projects", "community projects", "staff development", "business benefits", "best practice"]} -{"id": "1777", "title": "Midlife career choices: how are they different from other career choices?", "abstract": "It was 1963 when Candy Start began working in libraries. Libraries seemed to be a refuge from change, a dependable environment devoted primarily to preservation. She was mistaken. Technological changes in every decade of her experience have affected how and where she used her MLS. Far from a static refuge, libraries have proven to be spaceships loaded with precious cargo hurtling into the unknown. The historian in the author says that perhaps libraries have always been like this. 
This paper looks at a midlife decision point and the choice that this librarian made to move from a point of lessening productivity and interest to one of increasing challenge and contribution. It is a personal narrative of midlife experience from one librarian's point of view. Since writing this article, Candy's career has followed more changes. After selling the WINGS TM system, she has taken her experiences and vision to another library vendor, Gaylord Information Systems, where she serves as a senior product strategist", "keyphrases": ["midlife career choices", "libraries", "technological changes", "productivity"]} -{"id": "1819", "title": "Structural interpretation of matched pole-zero discretisation", "abstract": "Deals with matched pole-zero discretisation, which has been used in practice for hand calculations in the digital redesign of continuous-time systems but available only in the transfer-function form. Since this form is inconvenient for characterising the time-domain properties of sampled-data loops and for computerising the design of such systems, a state-space formulation is developed. Under the new interpretation, the matched pole-zero model is shown to be structurally identical to a hold-equivalent discrete-time model, where the generalised hold takes integral part, thus unifying the most widely used discretisation approaches. An algorithm for obtaining the generalised hold function is presented. The hold-equivalent structure of the matched pole-zero model clarifies several discrete-time system properties, such as controllability and observability, and their preservation or loss with a matched pole-zero discretisation. 
With the proposed formulation, the matched pole-zero, hold-equivalent, and mapping models can now all be constructed with a single schematic model", "keyphrases": ["structural interpretation", "matched pole-zero discretisation", "continuous-time systems", "time-domain properties", "sampled-data loops", "state-space formulation", "hold-equivalent discrete-time model", "controllability", "observability", "closed-loop system", "digital simulations"]} -{"id": "186", "title": "The diameter of a long-range percolation graph", "abstract": "We consider the following long-range percolation model: an undirected graph with the node set {0, 1, . . . , N}/sup d/, has edges (x, y) selected with probability approximately= beta /||x - y||/sup s/ if ||x - y|| > 1, and with probability 1 if ||x - y|| = 1, for some parameters beta , s > 0. This model was introduced by who obtained bounds on the diameter of this graph for the one-dimensional case d = 1 and for various values of s, but left cases s = 1, 2 open. We show that, with high probability, the diameter of this graph is Theta (log N/log log N) when s = d, and, for some constants 0 < eta /sub 1/ < eta /sub 2/ < 1, it is at most N/sup eta 2/ when s = 2d, and is at least N/sup eta 1/ when d = 1, s = 2, beta < 1 or when s > 2d. We also provide a simple proof that the diameter is at most log/sup O(1)/ N with high probability, when d < s < 2d, established previously in Benjamini and Berger (2001)", "keyphrases": ["long-range percolation model", "undirected graph", "probability", "percolation", "positive probability", "networks", "random graph"]} -{"id": "1904", "title": "Component support in PLT scheme", "abstract": "PLT Scheme (DrScheme and MzScheme) supports the Component Object Model (COM) standard with two pieces of software. The first piece is MzCOM, a COM class that makes a Scheme evaluator available to COM clients. 
With MzCOM, programmers can embed Scheme code in programs written in mainstream languages such as C++ or Visual BASIC. Some applications can also be used as MzCOM clients. The other piece of component-support software is MysterX, which makes COM classes available to PLT Scheme programs. When needed, MysterX uses a programmable Web browser to display COM objects. We describe the technical issues encountered in building these two systems and sketch some applications", "keyphrases": ["PLT Scheme", "Component Object Model", "MzCOM", "reuse", "Web browser"]} -{"id": "1596", "title": "Wavelet collocation methods for a first kind boundary integral equation in acoustic scattering", "abstract": "In this paper we consider a wavelet algorithm for the piecewise constant collocation method applied to the boundary element solution of a first kind integral equation arising in acoustic scattering. The conventional stiffness matrix is transformed into the corresponding matrix with respect to wavelet bases, and it is approximated by a compressed matrix. Finally, the stiffness matrix is multiplied by diagonal preconditioners such that the resulting matrix of the system of linear equations is well conditioned and sparse. Using this matrix, the boundary integral equation can be solved effectively", "keyphrases": ["first kind integral operators", "piecewise constant collocation", "wavelet algorithm", "boundary element solution", "boundary integral equation", "wavelet transform", "computational complexity", "acoustic scattering", "stiffness matrix", "linear equations"]} -{"id": "1697", "title": "Exact frequency-domain reconstruction for thermoacoustic tomography. II. Cylindrical geometry", "abstract": "For pt. I see ibid., vol. 21, no. 7, p. 823-8 (2002). Microwave-induced thermoacoustic tomography (TAT) in a cylindrical configuration is developed to image biological tissue. Thermoacoustic signals are acquired by scanning a flat ultrasonic transducer. 
Using a new expansion of a spherical wave in cylindrical coordinates, we apply the Fourier and Hankel transforms to TAT and obtain an exact frequency-domain reconstruction method. The effect of discrete spatial sampling on image quality is analyzed. An aliasing-proof reconstruction method is proposed. Numerical and experimental results are included", "keyphrases": ["medical diagnostic imaging", "frequency-domain reconstruction", "flat ultrasonic transducer", "thermoacoustic tomography", "cylindrical geometry", "discrete spatial sampling effect", "ultrasound imaging", "spherical wave expansion", "aliasing-proof reconstruction method", "Hankel transform"]} -{"id": "1489", "title": "An eight-year study of Internet-based remote medical counselling", "abstract": "We carried out a prospective study of an Internet-based remote counselling service. A total of 15,456 Internet users visited the Web site over eight years. From these, 1500 users were randomly selected for analysis. Medical counselling had been granted to 901 of the people requesting it (60%). One hundred and sixty-four physicians formed project groups to process the requests and responded using email. The distribution of patients using the service was similar to the availability of the Internet: 78% were from the European Union, North America and Australia. Sixty-seven per cent of the patients lived in urban areas and the remainder were residents of remote rural areas with limited local medical coverage. Sixty-five per cent of the requests were about problems of internal medicine and 30% of the requests concerned surgical issues. The remaining 5% of the patients sought information about recent developments, such as molecular medicine or aviation medicine. During the project, our portal became inaccessible five times, and counselling was not possible on 44 days. There was no hacking of the Web site. 
Internet-based medical counselling is a helpful addition to conventional practice", "keyphrases": ["Internet-based remote medical counselling", "Internet users", "Web site", "email", "urban areas", "remote rural areas", "surgical issues", "telemedicine", "medical education", "portal"]} -{"id": "1730", "title": "Meeting of minds", "abstract": "Technical specialists need to think about their role in IT projects and how they communicate with end-users and other participants to ensure they contribute fully as team members. It is especially important to communicate and document trade-offs that may have to be made, including the rationale behind them, so that if requirements change, the impact and decisions can be readily communicated to the stakeholders", "keyphrases": ["technical specialists", "IT projects", "communication", "end-users"]} -{"id": "1775", "title": "Are we there yet?: facing the never-ending speed and change of technology in midlife", "abstract": "This essay is a personal reflection on entering librarianship in middle age at a time when the profession, like society in general, is experiencing rapidly accelerating change. Much of this change is due to the increased use of computers and information technologies in the library setting. These aids in the production, collection, storage, retrieval, and dissemination of the collective information, knowledge, and sometimes wisdom of the past and the contemporary world can exhilarate or burden depending on one's worldview, the organization, and the flexibility of the workplace. This writer finds herself working in a library where everyone is expected continually to explore and use new ways of working and providing library service to a campus and a wider community. 
No time is spent in reflecting on what was, but all efforts are to anticipate and prepare for what will be", "keyphrases": ["librarianship", "middle age", "changing technology", "computers", "information technologies", "dissemination", "retrieval", "storage", "collection"]} -{"id": "1788", "title": "Resolving Web user on the fly", "abstract": "Identity authentication systems and procedures are rapidly becoming central issues in the practice and study of information systems development and security. Requirements for Web transaction security (WTS) include strong authentication of a user, non-repudiation and encryption of all traffic. In this paper, we present an effective mechanism involving two different channels, which addresses the prime concerns involved in the security of electronic commerce transactions (ECT) viz. user authentication and non-repudiation. Although the product is primarily targeted to provide a fillip to transactions carried out over the Web, this product can also be effectively used for non-Internet transactions that are carried out where user authentication is required", "keyphrases": ["identity authentication systems", "information systems development", "information systems security", "Web transaction security", "nonrepudiation", "encryption", "traffic", "electronic commerce transactions"]} -{"id": "1474", "title": "Contrast sensitivity in a dynamic environment: effects of target conditions and visual impairment", "abstract": "Contrast sensitivity was determined as a function of target velocity (0 degrees -120 degrees /s) over a variety of viewing conditions. In Experiment 1, measurements of dynamic contrast sensitivity were determined for observers as a function of target velocity for letter stimuli. 
Significant main effects were found for target velocity, target size, and target duration, but significant interactions among the variables indicated especially pronounced adverse effects of increasing target velocity for small targets and brief durations. In Experiment 2, the effects of simulated cataracts were determined. Although the simulated impairment had no effect on traditional acuity scores, dynamic contrast sensitivity was markedly reduced. Results are discussed in terms of dynamic contrast sensitivity as a useful composite measure of visual functioning that may provide a better overall picture of an individual's visual functioning than does traditional static acuity, dynamic acuity, or contrast sensitivity alone. The measure of dynamic contrast sensitivity may increase understanding of the practical effects of various conditions, such as aging or disease, on the visual system, or it may allow improved prediction of individuals' performance in visually dynamic situations", "keyphrases": ["contrast sensitivity", "dynamic environment", "target conditions", "visual impairment", "dynamic contrast sensitivity", "target velocity", "target size", "target duration", "acuity scores", "aging", "disease"]} -{"id": "1695", "title": "Medical image computing at the Institute of Mathematics and Computer Science in Medicine, University Hospital Hamburg-Eppendorf", "abstract": "The author reviews the history of medical image computing at his institute, summarizes the achievements, sketches some of the difficulties encountered, and draws conclusions that might be of interest especially to people new to the field. The origin and history section provides a chronology of this work, emphasizing the milestones reached during the past three decades. 
In accordance with the author's group's focus on imaging, the paper is accompanied by many pictures, some of which, he thinks, are of historical value", "keyphrases": ["Institute of Mathematics and Computer Science in Medicine", "University Hospital Hamburg-Eppendorf", "medical image computing history", "historical value", "difficulties encountered", "medical diagnostic imaging", "work chronology"]} -{"id": "1569", "title": "An interactive self-replicator implemented in hardware", "abstract": "Self-replicating loops presented to date are essentially worlds unto themselves, inaccessible to the observer once the replication process is launched. We present the design of an interactive self-replicating loop of arbitrary size, wherein the user can physically control the loop's replication and induce its destruction. After introducing the BioWall, a reconfigurable electronic wall for bio-inspired applications, we describe the design of our novel loop and delineate its hardware implementation in the wall", "keyphrases": ["interactive self-replicator", "interactive self-replicating loop", "BioWall", "reconfigurable electronic wall", "bio-inspired applications", "hardware implementation", "self-replication", "field programmable gate array", "cellular automata", "reconfigurable computing", "artificial life"]} -{"id": "179", "title": "Document-based workflow modeling: a case-based reasoning approach", "abstract": "A workflow model is useful for business process analysis. A well-built workflow can help a company streamline its internal processes by reducing overhead. The results of workflow modeling need to be managed as information assets in a systematic fashion. Reusing these results is likely to enhance the quality of the modeling. Therefore, this paper proposes a document-based workflow modeling mechanism, which employs a case-based reasoning (CBR) technique for the effective reuse of design outputs. A repository is proposed to support this CBR process. 
A real-life case is illustrated to demonstrate the usefulness of our approach", "keyphrases": ["document-based workflow modeling", "case-based reasoning", "business process analysis", "company", "information assets", "design output reuse"]} -{"id": "184", "title": "On the expected value of the minimum assignment", "abstract": "The minimum k-assignment of an m*n matrix X is the minimum sum of k entries of X, no two of which belong to the same row or column. Coppersmith and Sorkin conjectured that if X is generated by choosing each entry independently from the exponential distribution with mean 1, then the expected value of its minimum k-assignment is given by an explicit formula, which has been proven only in a few cases. In this paper we describe our efforts to prove the Coppersmith-Sorkin conjecture by considering the more general situation where the entries x/sub ij/ of X are chosen independently from different distributions. In particular, we require that x/sub ij/ be chosen from the exponential distribution with mean 1/r/sub i/c/sub j/. We conjecture an explicit formula for the expected value of the minimum k-assignment of such X and give evidence for this formula", "keyphrases": ["minimum k-assignment", "m * n matrix", "exponential distribution", "rational function", "bipartite graph"]} -{"id": "1906", "title": "Integrated process control using an in situ sensor for etch", "abstract": "The migration to tighter geometries and more complex process sequence integration schemes requires having the ability to compensate for upstream deviations from target specifications. Doing so ensures that-downstream process sequences operate on work-in-progress that is well within control. Because point-of-use visibility of work-in-progress quality has become of paramount concern in the industry's drive to reduce scrap and improve yield, controlling trench depth has assumed greater importance. 
An integrated, interferometric based, rate monitor for etch-to-depth and spacer etch applications has been developed for controlling this parameter. This article demonstrates that the integrated rate monitor, using polarization and digital signal processing, enhances control etch-to-depth processes and can also be implemented as a predictive endpoint in a wafer manufacturing environment for dual damascene trench etch and spacer etch applications", "keyphrases": ["interferometric in situ etch sensor", "integrated process control", "polarization", "digital signal processing", "wafer manufacturing environment", "process predictive endpoint", "dual damascene trench etch", "spacer etch applications", "IC geometry", "complex process sequence integration schemes", "upstream deviation compensation", "target specifications", "downstream process sequences", "point-of-use visibility", "work-in-progress quality", "scrap reduction", "yield improvement", "trench depth control", "interferometry", "integrated etch rate monitor"]} -{"id": "1594", "title": "Training multilayer perceptrons via minimization of sum of ridge functions", "abstract": "Motivated by the problem of training multilayer perceptrons in neural networks, we consider the problem of minimizing E(x)= Sigma /sub i=1//sup n/ f/sub i/( xi /sub i/.x), where xi /sub i/ in R/sup S/, 1or= 0} are investigated, where || . ||/sub p/ is the usual vector norm in C/sup n/ resp. R/sup n/, for p epsilon [1, o infinity ]. Moreover, formulae for the first three right derivatives D/sub +//sup k/||s(t)||/sub p/, k = 1, 2,3 are determined. These formulae are applied to vibration problems by computing the best upper bounds on ||s(t)||/sub p/ in certain classes of bounds. These results cannot be obtained by the methods used so far. 
The systematic use of the differential calculus for vector norms, as done here for the first time, could lead to major advances also in other branches of mathematics and other sciences", "keyphrases": ["differential calculus", "vector functions", "mapping", "vibration problems", "vector norms"]} -{"id": "1511", "title": "Efficient algorithms for stiff elliptic problems with large parameters", "abstract": "We consider a finite element approximation and iteration algorithms for solving stiff elliptic boundary value problems with large parameters in front of a higher derivative. The convergence rate of the algorithms is independent of the spread in coefficients and a discretization parameter", "keyphrases": ["finite element approximation", "iteration algorithms", "stiff elliptic boundary value problems", "large parameters", "higher derivative", "efficient algorithms", "convergence rate"]} -{"id": "1863", "title": "Information systems project failure: a comparative study of two countries", "abstract": "Many organizations, regardless of size, engage in at least one, and often many information system projects each year. Many of these projects consume massive amounts of resources, and may cost as little as a few thousand dollars to ten, and even hundreds of millions of dollars. Needless to say, the investment of time and resources into these ventures are of significant concern to chief information officers (CIOs), executives staff members, project managers, and others in leadership positions. This paper describes the results of a survey performed between Australia and the United States regarding factors leading to IS project failure. 
The findings suggest that, among other things, end user involvement and executive management leadership are key indicators influencing IS project failure", "keyphrases": ["information systems project failure", "Australia", "United States", "end user involvement", "executive management leadership"]} -{"id": "1826", "title": "Modeling shape and topology of low-resolution density maps of biological macromolecules", "abstract": "We develop an efficient way of representing the geometry and topology of volumetric datasets of biological structures from medium to low resolution, aiming at storing and querying them in a database framework. We make use of a new vector quantization algorithm to select the points within the macromolecule that best approximate the probability density function of the original volume data. Connectivity among points is obtained with the use of the alpha shapes theory. This novel data representation has a number of interesting characteristics, such as (1) it allows us to automatically segment and quantify a number of important structural features from low-resolution maps, such as cavities and channels, opening the possibility of querying large collections of maps on the basis of these quantitative structural features; (2) it provides a compact representation in terms of size; (3) it contains a subset of three-dimensional points that optimally quantify the densities of medium resolution data; and (4) a general model of the geometry and topology of the macromolecule (as opposite to a spatially unrelated bunch of voxels) is easily obtained by the use of the alpha shapes theory", "keyphrases": ["geometry", "topology", "volumetric datasets", "biological structures", "database framework", "vector quantization algorithm", "low-resolution density maps", "biological macromolecules", "modeling", "probability density function", "data representation", "structural features", "cavities", "channels", "connectivity", "compact representation", "three-dimensional 
points", "medium resolution data", "general model", "original volume data", "alpha shapes theory"]} -{"id": "1748", "title": "On a general constitutive description for the inelastic and failure behavior of fibrous laminates. I. Lamina theory", "abstract": "It is well known that a structural design with isotropic materials can only be accomplished based on a stress failure criterion. This is, however, generally not true with laminated composites. Only when the laminate is subjected to an in-plane load, can the ultimate failure of the laminate correspond to its last-ply failure, and hence a stress failure criterion may be sufficient to detect the maximum load that can be sustained by the laminate. Even in such a case, the load shared by each lamina in the laminate cannot be correctly determined if the lamina instantaneous stiffness matrix is inaccurately provided, since the lamina is always statically indeterminate in the laminate. If, however, the laminate is subjected to a lateral load, its ultimate failure occurs before last-ply failure and use of the stress failure criterion is no longer sufficient; an additional critical deflection or curvature condition must also be employed. This necessitates development of an efficient constitutive relationship for laminated composites in order that the laminate strains/deflections up to ultimate failure can be accurately calculated. A general constitutive description for the thermomechanical response of a fibrous laminate up to ultimate failure with applications to various fibrous laminates is presented in the two papers. The constitutive relationship is obtained by combining classical lamination theory with a recently developed bridging micromechanics model, through a layer-by-layer analysis. 
This paper focuses on lamina analysis", "keyphrases": ["general constitutive description", "inelastic behavior", "failure behavior", "fibrous laminates", "lamina theory", "structural design", "isotropic materials", "stress failure criterion", "in-plane load", "instantaneous stiffness matrix", "lateral load", "last-ply failure", "critical deflection condition", "critical curvature condition", "composites", "laminate strains", "laminate deflections", "thermomechanical response", "layer-by-layer analysis", "micromechanics model", "multidirectional tape laminae", "woven fabric composites", "braided fabric composites", "knitted fabric reinforced composites", "elastoplasticity", "elastic-viscoplasticity"]} -{"id": "1570", "title": "Self-reproduction in three-dimensional reversible cellular space", "abstract": "Due to inevitable power dissipation, it is said that nano-scaled computing devices should perform their computing processes in a reversible manner. This will be a large problem in constructing three-dimensional nano-scaled functional objects. Reversible cellular automata (RCA) are used for modeling physical phenomena such as power dissipation, by studying the dissipation of garbage signals. We construct a three-dimensional self-inspective self-reproducing reversible cellular automaton by extending the two-dimensional version SR/sub 8/. It can self-reproduce various patterns in three-dimensional reversible cellular space without dissipating garbage signals", "keyphrases": ["self-reproduction", "nano-scaled computing devices", "power dissipation", "3D self-inspective self-reproducing cellular automata", "reversible cellular automata", "artificial life", "three-dimensional reversible cellular space"]} -{"id": "1535", "title": "Hot controllers", "abstract": "Over the last few years, the semiconductor industry has put much emphasis on ways to improve the accuracy of thermal mass flow controllers (TMFCs). 
Although issues involving TMFC mounting orientation and pressure effects have received much attention, little has been done to address the effect of changes in ambient temperature or process gas temperature. Scientists and engineers at Qualiflow have succeeded to solve the problem using a temperature correction algorithm for digital TMFCs. Using an in situ environmental temperature compensation technique, we calculated correction factors for the temperature effect and obtained satisfactory results with both the traditional sensor and the new, improved thin-film sensors", "keyphrases": ["semiconductor manufacturing", "process gas flow", "thermal mass flow controller", "temperature correction algorithm", "in situ environmental temperature compensation"]} -{"id": "160", "title": "Taming the paper tiger [paperwork organization]", "abstract": "Generally acknowledged as a critical problem for many information professionals, the massive flow of documents, paper trails, and information needs efficient and dependable approaches for processing and storing and finding items and information", "keyphrases": ["paperwork organization", "information professionals", "information processing", "information storage", "information retrieval"]} -{"id": "1671", "title": "Cane railway scheduling via constraint logic programming: labelling order and constraints in a real-life application", "abstract": "In Australia, cane transport is the largest unit cost in the manufacturing of raw sugar, making up around 35% of the total manufacturing costs. Producing efficient schedules for the cane railways can result in significant cost savings. The paper presents a study using constraint logic programming (CLP) to solve the cane transport scheduling problem. Tailored heuristic labelling order and constraints strategies are proposed and encouraging results of application to several test problems and one real-life case are presented. 
The preliminary results demonstrate that CLP can be used as an effective tool for solving the cane transport scheduling problem, with a potential decrease in development costs of the scheduling system. It can also be used as an efficient tool for rescheduling tasks which the existing cane transport scheduling system cannot perform well", "keyphrases": ["cane railway scheduling", "constraint logic programming", "cane transport", "raw sugar", "total manufacturing costs", "cost savings", "heuristic labelling order", "constraints strategies"]} -{"id": "1634", "title": "Maple 8 keeps everyone happy", "abstract": "The author is impressed with the upgrade to the mathematics package Maple 8, finding it genuinely useful to scientists and educators. The developments Waterloo Maple class as revolutionary include a student calculus package, and Maplets. The first provides a high-level command set for calculus exploration and plotting (removing the need to work with, say, plot primitives). The second is a package for hand-coding custom graphical user interfaces (GUIs) using elements such as check boxes, radio buttons, slider bars and pull-down menus. When called, a Maplet launches a runtime Java environment that pops up a window-analogous to a Java applet-to perform a programmed routine, if required passing the result back to the Maple worksheet", "keyphrases": ["Maple 8 mathematics package", "student calculus package", "high-level command set", "calculus exploration", "calculus plotting", "GUIs", "Maplet", "runtime Java environment"]} -{"id": "1729", "title": "Maintaining e-commerce", "abstract": "E-commerce over the Web has created a relatively new type of information system. So it is hardly surprising that little attention has been given to the maintenance of such systems-and even less to attempting to develop them with future maintenance in mind. 
But there are various ways e-commerce systems can be developed to reduce future maintenance", "keyphrases": ["e-commerce systems maintenance", "Web systems"]} -{"id": "1847", "title": "Conceptual modeling and specification generation for B2B business processes based on ebXML", "abstract": "In order to support dynamic setup of business processes among independent organizations, a formal standard schema for describing the business processes is basically required. The ebXML framework provides such a specification schema called BPSS (Business Process Specification Schema) which is available in two standalone representations: a UML version, and an XML version. The former, however, is not intended for the direct creation of business process specifications, but for defining specification elements and their relationships required for creating an ebXML-compliant business process specification. For this reason, it is very important to support conceptual modeling that is well organized and directly matched with major modeling concepts. This paper deals with how to represent and manage B2B business processes using UML-compliant diagrams. The major challenge is to organize UML diagrams in a natural way that is well suited to the business process meta-model and then to transform the diagrams into an XML version. This paper demonstrates the usefulness of conceptually modeling business processes by prototyping a business process editor tool called ebDesigner", "keyphrases": ["B2B business processes", "ebXML", "conceptual modeling", "specification generation", "formal standard schema", "Business Process Specification Schema", "UML-compliant diagrams", "meta model", "ebDesigner", "business process editor"]} -{"id": "1802", "title": "Novel TCP congestion control scheme and its performance evaluation", "abstract": "A novel self-tuning proportional and derivative (ST-PD) control based TCP congestion control scheme is proposed. 
The new scheme approaches the congestion control problem from a control-theoretical perspective and overcomes several Important limitations associated with existing TCP congestion control schemes, which are heuristic based. In the proposed scheme, a PD controller is employed to keep the buffer occupancy of the bottleneck node on the connection path at an ideal operating level, and it adjusts the TCP window accordingly. The control gains of the PD controller are tuned online by a fuzzy logic controller based on the perceived bandwidth-delay product of the TCP connection. This scheme gives ST-PD TCP several advantages over current TCP implementations. These include rapid response to bandwidth variations, insensitivity to buffer sizes, and significant improvement of TCP throughput over lossy links by decoupling congestion control and error control functions of TCP", "keyphrases": ["TCP congestion control scheme", "performance evaluation", "self-tuning proportional-derivative control", "control-theoretical perspective", "PD controller", "buffer occupancy", "bottleneck node", "connection path", "fuzzy logic controller", "bandwidth-delay product", "lossy links"]} -{"id": "1490", "title": "Client satisfaction in a feasibility study comparing face-to-face interviews with telepsychiatry", "abstract": "We carried out a pilot study comparing satisfaction levels between psychiatric patients seen face to face (FTF) and those seen via videoconference. Patients who consented were randomly assigned to one of two groups. One group received services in person (FTF from the visiting psychiatrist) while the other was seen using videoconferencing at 128 kbit/s. One psychiatrist provided all the FTF and videoconferencing assessment and follow-up visits. A total of 24 subjects were recruited. Three of the subjects (13%) did not attend their appointments and two subjects in each group were lost to follow-up. Thus there were nine in the FTF group and eight in the videoconferencing group. 
The two groups were similar in most respects. Patient satisfaction with the services was assessed using the Client Satisfaction Questionnaire (CSQ-8), completed four months after the initial consultation. The mean scores were 25.3 in the FTF group and 21.6 in the videoconferencing group. Although there was a trend in favour of the FTF service, the difference was not significant. Patient satisfaction is only one component of evaluation. The efficacy of telepsychiatry must also be measured relative to that of conventional, FTF care before policy makers can decide how extensively telepsychiatry should be implemented", "keyphrases": ["client satisfaction", "face-to-face interviews", "telepsychiatry", "psychiatric patient satisfaction", "human factors", "videoconference", "Client Satisfaction Questionnaire", "telemedicine", "128 kbit/s"]} -{"id": "1791", "title": "The pedagogy of on-line learning: a report from the University of the Highlands and Islands Millennium Institute", "abstract": "Authoritative sources concerned with computer-aided learning, resource-based learning and on-line learning and teaching are generally agreed that, in addition to subject matter expertise and technical support, the quality of the learning materials and the learning experiences of students are critically dependent on the application of pedagogically sound theories of learning and teaching and principles of course design. The University of the Highlands and Islands Project (UHIMI) is developing \"on-line learning\" on a large scale. These developments have been accompanied by a comprehensive programme of staff development. A major emphasis of the programme is concerned with ensuring that course developers and tutors are pedagogically aware. 
This paper reviews (i) what is meant by \"on-line learning\" in the UHIMI context (ii) the theories of learning and teaching and principles of course design that inform the staff development programme and (iii) a review of progress to date", "keyphrases": ["online learning", "pedagogy", "computer-aided learning", "resource-based learning", "teaching", "technical support", "educational course design", "distance education", "Internet", "University of the Highlands and Islands Project", "staff development"]} -{"id": "1887", "title": "Doubly invariant equilibria of linear discrete-time games", "abstract": "The notion of doubly invariant (DI) equilibrium is introduced. The concept extends controlled and robustly controlled invariance notions to the context of two-person dynamic games. Each player tries to keep the state in a region of state space independently of the actions of the rival player. The paper gives existence conditions, criteria and algorithms for the determination of DI equilibria of linear dynamic games in discrete time. Two examples illustrate the results. The first one is in the area of fault-tolerant controller synthesis. The second is an application to macroeconomics", "keyphrases": ["doubly invariant equilibria", "linear discrete-time games", "robustly controlled invariance", "two-person dynamic games", "state space", "existence conditions", "fault-tolerant controller synthesis", "macroeconomics"]} -{"id": "1714", "title": "Hordes: a multicast based protocol for anonymity", "abstract": "With widespread acceptance of the Internet as a public medium for communication and information retrieval, there has been rising concern that the personal privacy of users can be eroded by cooperating network entities. A technical solution to maintaining privacy is to provide anonymity. 
We present a protocol for initiator anonymity called Hordes, which uses forwarding mechanisms similar to those used in previous protocols for sending data, but is the first protocol to make use of multicast routing to anonymously receive data. We show this results in shorter transmission latencies and requires less work of the protocol participants, in terms of the messages processed. We also present a comparison of the security and anonymity of Hordes with previous protocols, using the first quantitative definition of anonymity and unlinkability. Our analysis shows that Hordes provides anonymity in a degree similar to that of Crowds and Onion Routing, but also that Hordes has numerous performance advantages", "keyphrases": ["Hordes", "protocol", "Internet", "personal privacy", "cooperating network entities", "initiator anonymity", "forwarding mechanisms", "multicast routing", "transmission latencies", "unlinkability", "Crowds", "Onion Routing", "performance"]} -{"id": "1751", "title": "An adaptive time step procedure for a parabolic problem with blow-up", "abstract": "In this paper we introduce and analyze a fully discrete approximation for a parabolic problem with a nonlinear boundary condition which implies that the solutions blow up in finite time. We use standard linear elements with mass lumping for the space variable. For the time discretization we write the problem in an equivalent form which is obtained by introducing an appropriate time re-scaling and then, we use explicit Runge-Kutta methods for this equivalent problem. In order to motivate our procedure we present it first in the case of a simple ordinary differential equation and show how the blow up time is approximated in this case. We obtain necessary and sufficient conditions for the blowup of the numerical solution and prove that the numerical blow-up time converges to the continuous one. 
We also study, for the explicit Euler approximation, the localization of blow-up points for the numerical scheme", "keyphrases": ["adaptive time step procedure", "parabolic problem", "fully discrete approximation", "nonlinear boundary condition", "standard linear elements", "Runge-Kutta methods", "explicit Euler approximation"]} -{"id": "1609", "title": "Modeling undesirable factors in efficiency evaluation", "abstract": "Data envelopment analysis (DEA) measures the relative efficiency of decision making units (DMUs) with multiple performance factors which are grouped into outputs and inputs. Once the efficient frontier is determined, inefficient DMUs can improve their performance to reach the efficient frontier by either increasing their current output levels or decreasing their current input levels. However, both desirable (good) and undesirable (bad) factors may be present. For example, if inefficiency exists in production processes where final products are manufactured with a production of wastes and pollutants, the outputs of wastes and pollutants are undesirable and should be reduced to improve the performance. Using the classification invariance property, we show that the standard DEA model can be used to improve the performance via increasing the desirable outputs and decreasing the undesirable outputs. The method can also be applied to situations when some inputs need to be increased to improve the performance. 
The linearity and convexity of DEA are preserved through our proposal", "keyphrases": ["data envelopment analysis", "decision making units", "multiple performance factors", "efficient frontier", "current output levels", "current input levels", "production processes", "final product manufacture", "wastes", "pollutants", "classification invariance property", "desirable outputs", "undesirable outputs", "linear programming", "efficiency evaluation", "undesirable factor modeling"]} -{"id": "1922", "title": "Trends in agent communication language", "abstract": "Agent technology is an exciting and important new way to create complex software systems. Agents blend many of the traditional properties of AI programs - knowledge-level reasoning, flexibility, proactiveness, goal-directedness, and so forth - with insights gained from distributed software engineering, machine learning, negotiation and teamwork theory, and the social sciences. An important part of the agent approach is the principle that agents (like humans) can function more effectively in groups that are characterized by cooperation and division of labor. Agent programs are designed to autonomously collaborate with each other in order to satisfy both their internal goals and the shared external demands generated by virtue of their participation in agent societies. This type of collaboration depends on a sophisticated system of inter-agent communication. The assumption that inter-agent communication is best handled through the explicit use of an agent communication language (ACL) underlies each of the articles in this special issue. 
In this introductory article, we will supply a brief background and introduction to the main topics in agent communication", "keyphrases": ["agent technology", "AI programs", "agent communication language", "inter-agent communication", "agent societies", "KQML", "semantics", "conversations", "distributed software engineering", "machine learning", "negotiation", "teamwork", "social sciences"]} -{"id": "1508", "title": "Rats, robots, and rescue", "abstract": "In early May, media inquiries started arriving at my office at the Center for Robot-Assisted Search and Rescue (www.crasar.org). Because I'm CRASAR's director, I thought the press was calling to follow up on the recent humanitarian award given to the center's founder, John Blitch, for successfully using small, backpackable robots at the World Trade Center disaster. Instead, I found they were asking me to comment on the \"roborats\" study in the 2 May 2002 Nature. In this study, rats with medial force brain implants underwent operant conditioning to force them into a form of guided behavior, one aspect of which was thought useful for search and rescue. The article's closing comment suggested that a guided rat could serve as both a mobile robot and a biological sensor. Although a roboticist by training, I'm committed to any technology that will help save lives while reducing the risk to rescuers. But rats?", "keyphrases": ["mobile robot", "biological sensor", "guided rat", "robot-assisted search and rescue"]} \ No newline at end of file +{"id": "1833", "title": "British Standard 7666 as a framework for geocoding land and property information the UK", "abstract": "The article examines the role of British Standard 7666 in the development of a national framework for geocoding land and property information in the United Kingdom. 
The author assesses how local authorities, and other agencies concerned with property and address datasets, are coping with the introduction of British Standard 7666, and examines the prospects and limitations of this development. British Standard 7666 has four parts, comprising specifications for street gazetteer; land and property gazetteer; addresses; and public rights of way. The organisation coordinating the introduction of British Standard 7666, Improvement and Development Agency (IDeA), is also overseeing the development and maintenance of a National Land and Property Gazetteer (NLPG) based on British Standard 7666. The introduction of the new addressing standard has mainly been prompted by Britain's effort to set up a national cadastral service to replace the obsolescent property registration system currently in place", "keyphrases": ["British Standard 7666", "geocoding", "property information", "land information", "UK", "national framework", "United Kingdom", "local authorities", "address datasets", "property datasets", "street gazetteer", "property gazetteer", "land gazetteer", "addresses", "public rights of way", "Improvement and Development Agency", "IDeA", "National Land and Property Gazetteer", "NLPG", "addressing standard", "national cadastral service", "property registration system", "land information systems"], "prmu": ["P", "P", "P", "R", "P", "P", "P", "P", "P", "R", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R"]} +{"id": "1876", "title": "The development of CASC [automated theorem proving]", "abstract": "Researchers who make theoretical advances also need some way to demonstrate that an advance really does have general, overall positive consequences for system performance. For this it is necessary to evaluate the system on a set of problems that is sufficiently large and diverse to be somehow representative of the intended application area as a whole. 
It is only a small step from system evaluation to a communal system competition. The CADE ATP System Competition (CASC) has been run annually since 1996. Any competition is difficult to design and organize in the first instance, and to then run over the years. In order to obtain the full benefits of a competition, a thoroughly organized event, with an unambiguous and motivated design, is necessary. For some issues relevant to the CASC design, inevitable constraints have emerged. For other issues there have been several choices, and decisions have had to be made. This paper describes the evolution of CASC, paying particular attention to its design, design changes, and organization", "keyphrases": ["CASC", "system performance", "system evaluation", "automated deduction", "CADE ATP System Competition", "automated theorem proving", "AI", "artificial intelligence", "classical first order logic"], "prmu": ["P", "P", "P", "M", "P", "P", "U", "U", "M"]} +{"id": "1605", "title": "A GRASP heuristic for the mixed Chinese postman problem", "abstract": "Arc routing problems (ARPs) consist of finding a traversal on a graph satisfying some conditions related to the links of the graph. In the Chinese postman problem (CPP) the aim is to find a minimum cost tour (closed walk) traversing all the links of the graph at least once. Both the Undirected CPP, where all the links are edges that can be traversed in both ways, and the Directed CPP, where all the links are arcs that must be traversed in a specified way, are known to be polynomially solvable. However, if we deal with a mixed graph (having edges and arcs), the problem turns out to be NP-hard. In this paper, we present a heuristic algorithm for this problem, the so-called Mixed CPP (MCPP), based on greedy randomized adaptive search procedure (GRASP) techniques. 
The algorithm has been tested and compared with other known and recent methods from the literature on a wide collection of randomly generated instances, with up to 200 nodes and 600 links, producing encouraging computational results. As far as we know, this is the best heuristic algorithm for the MCPP, with respect to solution quality, published up to now", "keyphrases": ["mixed Chinese postman problem", "GRASP heuristic", "arc routing problems", "graph traversal", "minimum cost tour", "closed walk", "NP-hard problem", "heuristic algorithm", "greedy randomized adaptive search procedure", "optimization problems", "metaheuristics"], "prmu": ["P", "P", "P", "R", "P", "P", "R", "P", "P", "M", "U"]} +{"id": "1640", "title": "Integration is LIMS inspiration", "abstract": "For software manufacturers, blessings come in the form of fast-moving application areas. In the case of LIMS, biotechnology is still in the driving seat, inspiring developers to maintain consistently rapid and creative levels of innovation. Current advancements are no exception. Integration and linking initiatives are still popular and much of the activity appears to be coming from a very productive minority", "keyphrases": ["software manufacturers", "LIMS", "biotechnology"], "prmu": ["P", "P", "P"]} +{"id": "151", "title": "Extending CTL with actions and real time", "abstract": "In this paper, we present the logic ATCTL, which is intended to be used for model checking models that have been specified in a lightweight version of the Unified Modelling Language (UML). Elsewhere, we have defined a formal semantics for LUML to describe the models. This paper's goal is to give a specification language for properties that fits LUML; LUML includes states, actions and real time. ATCTL extends CTL with concurrent actions and real time. It is based on earlier extensions of CTL by R. De Nicola and F. Vaandrager (ACTL) (1990) and R. Alur et aL (TCTL) (1993). 
This makes it easier to adapt existing model checkers to ATCTL. To show that we can check properties specified in ATCTL in models specified in LUML, we give a small example using the Kronos model checker", "keyphrases": ["actions", "real time logic", "logic ATCTL", "model checking models", "Unified Modelling Language", "formal semantics", "specification language", "Kronos model checker", "computation tree logic"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "M"]} +{"id": "1504", "title": "Designing human-centered distributed information systems", "abstract": "Many computer systems are designed according to engineering and technology principles and are typically difficult to learn and use. The fields of human-computer interaction, interface design, and human factors have made significant contributions to ease of use and are primarily concerned with the interfaces between systems and users, not with the structures that are often more fundamental for designing truly human-centered systems. The emerging paradigm of human-centered computing (HCC)-which has taken many forms-offers a new look at system design. HCC requires more than merely designing an artificial agent to supplement a human agent. The dynamic interactions in a distributed system composed of human and artificial agents-and the context in which the system is situated-are indispensable factors. 
While we have successfully applied our methodology in designing a prototype of a human-centered intelligent flight-surgeon console at NASA Johnson Space Center, this article presents a methodology for designing human-centered computing systems using electronic medical records (EMR) systems", "keyphrases": ["human-centered distributed information systems design", "distributed cognition", "artificial agents", "human agents", "multiple analysis levels", "human-computer interaction", "interface design", "human factors", "human-centered computing systems", "human-centered intelligent flight surgeon console", "NASA Johnson Space Center", "electronic medical records systems"], "prmu": ["R", "M", "P", "P", "U", "P", "P", "P", "P", "M", "P", "R"]} +{"id": "1541", "title": "The AT89C51/52 flash memory programmers", "abstract": "When faced with a plethora of applications to design, it's essential to have a versatile microcontroller in hand. The author describes the AT89C51/52 microcontrollers. To get you started, he'll describe his inexpensive microcontroller programmer", "keyphrases": ["AT89C51/52", "flash memory programmers", "microcontrollers", "device programmer", "microcontroller programmer"], "prmu": ["P", "P", "P", "M", "P"]} +{"id": "1680", "title": "Minimizing weighted number of early and tardy jobs with a common due window involving location penalty", "abstract": "Studies a single machine scheduling problem to minimize the weighted number of early and tardy jobs with a common due window. There are n non-preemptive and simultaneously available jobs. Each job will incur an early (tardy) penalty if it is early (tardy) with respect to the common due window under a given schedule. The window size is a given parameter but the window location is a decision variable. The objective of the problem is to find a schedule that minimizes the weighted number of early and tardy jobs and the location penalty. 
We show that the problem is NP-complete in the ordinary sense and develop a dynamic programming based pseudo-polynomial algorithm. We conduct computational experiments, the results of which show that the performance of the dynamic algorithm is very good in terms of memory requirement and CPU time. We also provide polynomial time algorithms for two special cases", "keyphrases": ["early jobs", "tardy jobs", "common due window", "single machine scheduling problem", "decision variable", "location penalty", "NP-complete problem", "dynamic programming", "pseudo-polynomial algorithm"], "prmu": ["R", "P", "P", "P", "P", "P", "R", "P", "P"]} +{"id": "1539", "title": "Comments on some recent methods for the simultaneous determination of polynomial zeros", "abstract": "In this note we give some comments on the recent results concerning a simultaneous method of the fourth-order for finding complex zeros in circular interval arithmetic. The main discussion is directed to a rediscovered iterative formula and its modification, presented recently in Sun and Kosmol, (2001). The presented comments include some critical parts of the papers Petkovic, Trickovic, Herceg, (1998) and Sun and Kosmol, (2001) which treat the same subject", "keyphrases": ["polynomial", "zeros", "complex zeros", "circular interval arithmetic", "iterative formula"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1913", "title": "A six-degree-of-freedom precision motion stage", "abstract": "This article presents the design and performance evaluation of a six-degree-of-freedom piezoelectrically actuated fine motion stage that will be used for three dimensional error compensation of a long-range translation mechanism. Development of a single element, piezoelectric linear displacement actuator capable of translations of 1.67 mu m with 900 V potential across the electrodes and under a 27.4 N axial load and 0.5 mm lateral distortion is presented. 
Finite element methods have been developed and used to evaluate resonant frequencies of the stage platform and the complete assembly with and without a platform payload. In general, an error of approximately 10.0% between the finite element results and the experimentally measured values was observed. The complete fine motion stage provided approximately +or-0.93 mu m of translation and +or-38.0 mu rad of rotation in all three planes of motion using an excitation range of 1000 V. An impulse response indicating a fundamental mode resonance at 162 Hz was measured with a 0.650 kg payload rigidly mounted to the top of the stage", "keyphrases": ["six-degree-of-freedom precision motion stage", "performance evaluation", "design", "piezoelectrically actuated fine motion stage", "3D error compensation", "long-range translation mechanism", "single element piezoelectric linear displacement actuator", "finite element methods", "resonant frequency", "stage platform", "platform payload", "impulse response", "fundamental mode resonance", "1.67 micron", "900 V", "1000 V", "0.93 to -0.93 micron", "162 Hz", "650.0 gm"], "prmu": ["P", "P", "P", "P", "M", "P", "R", "P", "P", "P", "P", "P", "P", "M", "P", "R", "M", "P", "U"]} +{"id": "191", "title": "Linear, parameter-varying control and its application to a turbofan engine", "abstract": "This paper describes application of parameter-dependent control design methods to a turbofan engine. Parameter-dependent systems are linear systems, whose state-space descriptions are known functions of time-varying parameters. The time variation of each of the parameters is not known in advance, but is assumed to be measurable in real-time. Three linear, parameter-varying (LPV) approaches to control design are discussed. The first method is based on linear fractional transformations which relies on the small gain theorem for bounds on performance and robustness. 
The other methods make use of either a single (SQLF) or parameter-dependent (PDQLF) quadratic Lyapunov function to bound the achievable level of performance. The latter two techniques are used to synthesize controllers for a high-performance turbofan engine. A LPV model of the turbofan engine is constructed from Jacobian linearizations at fixed power codes for control design. The control problem is formulated as a model matching problem in the H/sub infinity / and LPV framework. The objective is decoupled command response of the closed-loop system to pressure and rotor speed requests. The performance of linear, H/sub infinity / point designs are compared with the SQLF and PDQLF controllers. Nonlinear simulations indicate that the controller synthesized using the SQLF approach is slightly more conservative than the PDQLF controller. Nonlinear simulations with the SQLF and PDQLF controllers show very robust designs that achieve all desired performance objectives", "keyphrases": ["turbofan engine", "linear parameter-varying control", "parameter-dependent control design methods", "state-space descriptions", "time-varying parameters", "linear fractional transformations", "small gain theorem", "performance bounds", "robustness bounds", "single quadratic Lyapunov function", "parameter-dependent quadratic Lyapunov function", "Jacobian linearizations", "decoupled command response", "closed-loop system", "model matching problem", "H/sub infinity / framework", "nonlinear simulations", "very robust designs"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "R", "R", "R", "R", "P", "P", "P", "P", "R", "P", "P"]} +{"id": "1638", "title": "The chemical brotherhood", "abstract": "It has always been more difficult for chemistry to keep up in the Internet age but a new language could herald a new era for the discipline. The paper discusses CML, or chemical mark-up language. 
The eXtensible Mark-up Language provides a universal format for structured documents and data on the Web and so offers a way for scientists and others to carry a wide range of information types across the net in a transparent way. All that is needed is an XML browser", "keyphrases": ["chemistry", "Internet", "CML", "chemical mark-up language", "eXtensible Mark-up Language", "structured document format", "World Wide Web", "XML browser"], "prmu": ["P", "P", "P", "P", "P", "R", "M", "P"]} +{"id": "1760", "title": "Dihedral congruence primes and class fields of real quadratic fields", "abstract": "We show that for a real quadratic field F the dihedral congruence primes with respect to F for cusp forms of weight k and quadratic nebentypus are essentially the primes dividing expressions of the form epsilon /sub +//sup k-1/+or-1 where epsilon /sub +/ is a totally positive fundamental unit of F. This extends work of Hida. Our results allow us to identify a family of (ray) class fields of F which are generated by torsion points on modular abelian varieties", "keyphrases": ["dihedral congruence primes", "class fields", "real quadratic fields", "quadratic nebentypus", "torsion points", "modular abelian varieties", "class field theory"], "prmu": ["P", "P", "P", "P", "P", "P", "M"]} +{"id": "1725", "title": "Cutting the cord [wireless health care]", "abstract": "More and more healthcare executives are electing to cut the cord to their existing computer systems by implementing mobile technology. The allure of information anywhere, anytime is intoxicating, demonstrated by the cell phones and personal digital assistants (PDAs) that adorn today's professionals. The utility and convenience of these devices is undeniable. But what is the best strategy for implementing a mobile solution within a healthcare enterprise, be it large or small-and under what circumstances? What types of healthcare workers benefit most from mobile technology? 
And how state-of-the-art is security for wireless applications and devices? These are the questions that healthcare executives are asking-and should be asking-as they evaluate mobile solutions", "keyphrases": ["healthcare", "mobile computing", "wireless computing", "security"], "prmu": ["P", "R", "R", "P"]} +{"id": "1461", "title": "Adaptive multiresolution approach for solution of hyperbolic PDEs", "abstract": "This paper establishes an innovative and efficient multiresolution adaptive approach combined with high-resolution methods, for the numerical solution of a single or a system of partial differential equations. The proposed methodology is unconditionally bounded (even for hyperbolic equations) and dynamically adapts the grid so that higher spatial resolution is automatically allocated to domain regions where strong gradients are observed, thus possessing the two desired properties of a numerical approach: stability and accuracy. Numerical results for five test problems are presented which clearly show the robustness and cost effectiveness of the proposed method", "keyphrases": ["multiresolution adaptive approach", "high-resolution methods", "numerical solution", "hyperbolic partial differential equations", "dynamic grid adaptation", "unconditionally bounded methodology", "spatial resolution", "strong gradients", "stability", "accuracy", "robustness", "cost effectiveness"], "prmu": ["P", "P", "P", "R", "R", "R", "P", "P", "P", "P", "P", "P"]} +{"id": "1659", "title": "Mobile commerce: transforming the vision into reality", "abstract": "This editorial preface investigates current developments in mobile commerce (M-commerce) and proposes an integrated architecture that supports business and consumer needs in an optimal way to successfully implement M-commerce business processes. The key line of thought is based on the heuristic observation that customers will not want to receive M-commerce offerings to their mobile telephones. 
As a result, a pull as opposed to a push approach becomes a necessary requirement to conduct M-commerce. In addition, M-commerce has to rely on local, regional, demographic and many other variables to be truly effective. Both observations necessitate an M-commerce architecture that allows the coherent integration of enterprise-level systems as well as the aggregation of product and service offerings from many different and partially competing parties into a collaborative M-commerce platform. The key software component within this integrated architecture is an event management engine to monitor, detect, store, process and measure information about outside events that are relevant to all participants in M-commerce", "keyphrases": ["M-commerce", "mobile commerce", "integrated architecture", "consumer needs", "business needs", "mobile telephones", "pull approach", "collaborative platform", "event management engine"], "prmu": ["P", "P", "P", "P", "R", "P", "R", "R", "P"]} +{"id": "148", "title": "Axioms for branching time", "abstract": "Logics of general branching time, or historical necessity, have long been studied but important axiomatization questions remain open. Here the difficulties of finding axioms for such logics are considered and ideas for solving some of the main open problems are presented. A new, more expressive logical account is also given to support Peirce's prohibition on truth values being attached to the contingent future", "keyphrases": ["axioms", "branching time", "truth values", "temporal logic"], "prmu": ["P", "P", "P", "M"]} +{"id": "1558", "title": "Orthogonality of the Jacobi polynomials with negative integer parameters", "abstract": "It is well known that the Jacobi polynomials P/sub n//sup ( alpha , beta )/(x) are orthogonal with respect to a quasi-definite linear functional whenever alpha , beta , and alpha + beta + 1 are not negative integer numbers. 
Recently, Sobolev orthogonality for these polynomials has been obtained for alpha a negative integer and beta not a negative integer and also for the case alpha = beta negative integer numbers. In this paper, we give a Sobolev orthogonality for the Jacobi polynomials in the remainder cases", "keyphrases": ["orthogonality", "quasi-definite linear functional", "Sobolev orthogonality", "Jacobi polynomials", "negative integer parameters"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1892", "title": "Closed-loop persistent identification of linear systems with unmodeled dynamics and stochastic disturbances", "abstract": "The essential issues of time complexity and probing signal selection are studied for persistent identification of linear time-invariant systems in a closed-loop setting. By establishing both upper and lower bounds on identification accuracy as functions of the length of observation, size of unmodeled dynamics, and stochastic disturbances, we demonstrate the inherent impact of unmodeled dynamics on identification accuracy, reduction of time complexity by stochastic averaging on disturbances, and probing capability of full rank periodic signals for closed-loop persistent identification. 
These findings indicate that the mixed formulation, in which deterministic uncertainty of system dynamics is blended with random disturbances, is beneficial to reduction of identification complexity", "keyphrases": ["closed-loop persistent identification", "unmodeled dynamics", "linear time-invariant systems", "upper bounds", "lower bounds", "identification accuracy", "full rank periodic signals", "stochastic disturbances", "time complexity", "probing signal selection"], "prmu": ["P", "P", "P", "R", "P", "P", "P", "P", "P", "P"]} +{"id": "1744", "title": "Convergence of Toland's critical points for sequences of DC functions and application to the resolution of semilinear elliptic problems", "abstract": "We prove that if a sequence (f/sub n/)/sub n/ of DC functions (difference of two convex functions) converges to a DC function f in some appropriate way and if u/sub n/ is a critical point of f/sub n/, in the sense described by Toland (1978, 1979), and is such that (u/sub n/)/sub n/ converges to u, then u is a critical point of f, still in Toland's sense. We also build a new algorithm which searches for this critical point u and then apply it in order to compute the solution of a semilinear elliptic equation", "keyphrases": ["critical point convergence", "DC function sequences", "semilinear elliptic problems", "convex function difference", "semilinear elliptic equation"], "prmu": ["R", "R", "P", "R", "P"]} +{"id": "1701", "title": "Estimation of 3-D left ventricular deformation from medical images using biomechanical models", "abstract": "The quantitative estimation of regional cardiac deformation from three-dimensional (3-D) image sequences has important clinical implications for the assessment of viability in the heart wall. We present here a generic methodology for estimating soft tissue deformation which integrates image-derived information with biomechanical models, and apply it to the problem of cardiac deformation estimation. 
The method is image modality independent. The images are segmented interactively and then initial correspondence is established using a shape-tracking approach. A dense motion field is then estimated using a transversely isotropic, linear-elastic model, which accounts for the muscle fiber directions in the left ventricle. The dense motion field is in turn used to calculate the deformation of the heart wall in terms of strain in cardiac specific directions. The strains obtained using this approach in open-chest dogs before and after coronary occlusion, exhibit a high correlation with strains produced in the same animals using implanted markers. Further, they show good agreement with previously published results in the literature. This proposed method provides quantitative regional 3-D estimates of heart deformation", "keyphrases": ["3-D left ventricular deformation estimation", "medical diagnostic imaging", "biomechanical models", "regional cardiac deformation", "quantitative estimation", "transversely isotropic linear-elastic model", "cardiac specific directions", "open-chest dogs", "muscle fiber directions", "generic methodology", "interactively segmented images", "3-D image sequences", "nonrigid motion estimation", "magnetic resonance imaging", "left ventricular motion estimation"], "prmu": ["R", "M", "P", "P", "P", "R", "P", "P", "P", "P", "R", "R", "M", "M", "R"]} +{"id": "1779", "title": "Maybe it's not too late to join the circus: books for midlife career management", "abstract": "Midcareer librarians looking for career management help on the bookshelf face thousands of choices. This article reviews thirteen popular career self-help books. The reviewed books cover various aspects of career management and provide information on which might be best suited for particular goals, including career change, career tune-up, and personal and professional self-evaluation. 
The comments reflect issues of interest to midcareer professionals", "keyphrases": ["midlife career management", "librarians", "career self-help books", "career change", "professional self-evaluation", "personal self-evaluation", "libraries"], "prmu": ["P", "P", "P", "P", "P", "R", "U"]} +{"id": "1817", "title": "Nonlinear adaptive control via sliding-mode state and perturbation observer", "abstract": "The paper presents a nonlinear adaptive controller (NAC) for single-input single-output feedback linearisable nonlinear systems. A sliding-mode state and perturbation observer is designed to estimate the system states and perturbation which includes the combined effect of system nonlinearities, uncertainties and external disturbances. The NAC design does not require the details of the nonlinear system model and full system states. It possesses an adaptation capability to deal with system parameter uncertainties, unmodelled system dynamics and external disturbances. The convergence of the observer and the stability analysis of the controller/observer system are given. The proposed control scheme is applied for control of a synchronous generator, in comparison with a state-feedback linearising controller (FLC). 
Simulation study is carried out based on a single-generator infinite-bus power system to show the performance of the controller/observer system", "keyphrases": ["nonlinear adaptive control", "sliding-mode state observer", "perturbation observer", "NAC", "SISO feedback linearisable nonlinear systems", "parameter uncertainties", "unmodelled system dynamics", "external disturbances", "convergence", "synchronous generator control", "state-feedback linearising controller", "FLC", "single-generator infinite-bus power system"], "prmu": ["P", "R", "P", "P", "M", "P", "P", "P", "P", "R", "P", "P", "P"]} +{"id": "1485", "title": "Telemedicine in the management of a cervical dislocation by a mobile neurosurgeon", "abstract": "Neurosurgical teams, who are normally located in specialist centres, frequently use teleradiology to make a decision about the transfer of a patient to the nearest neurosurgical department. This decision depends on the type of pathology, the clinical status of the patient and the prognosis. If the transfer of the patient is not possible, for example because of an unstable clinical status, a mobile neurosurgical team may be used. We report a case which was dealt with in a remote French military airborne surgical unit, in the Republic of Chad. The unit, which provides health-care to the French military personnel stationed there, also provides free medical care for the local population. It conducts about 100 operations each month. The unit comprises two surgeons (an orthopaedic and a general surgeon), one anaesthetist, two anaesthetic nurses, one operating room nurse, two nurses, three paramedics and a secretary. The civilian patient presented with unstable cervical trauma. 
A mobile neurosurgeon operated on her, and used telemedicine before, during and after surgery", "keyphrases": ["cervical dislocation management", "mobile neurosurgeon", "teleradiology", "telemedicine", "remote French military airborne surgical unit", "Republic of Chad", "health care", "French military personnel", "civilian patient", "unstable cervical trauma", "surgery"], "prmu": ["R", "P", "P", "P", "P", "P", "M", "P", "P", "P", "P"]} +{"id": "1852", "title": "The design and performance evaluation of alternative XML storage strategies", "abstract": "This paper studies five strategies for storing XML documents including one that leaves documents in the file system, three that use a relational database system, and one that uses an object manager. We implement and evaluate each approach using a number of XQuery queries. A number of interesting insights are gained from these experiments and a summary of the advantages and disadvantages of the approaches is presented", "keyphrases": ["XML document storage", "file system", "relational database system", "object manager", "performance evaluation", "XQuery queries"], "prmu": ["R", "P", "P", "P", "P", "P"]} +{"id": "1478", "title": "The effects of work pace on within-participant and between-participant keying force, electromyography, and fatigue", "abstract": "A laboratory study was conducted to determine the effects of work pace on typing force, electromyographic (EMG) activity, and subjective discomfort. We found that as participants typed faster, their typing force and finger flexor and extensor EMG activity increased linearly. There was also an increase in subjective discomfort, with a sharp threshold between participants' self-selected pace and their maximum typing speed. The results suggest that participants self-select a typing pace that maximizes typing speed and minimizes discomfort. 
The fastest typists did not produce significantly more finger flexor EMG activity but did produce proportionately less finger extensor EMG activity compared with the slower typists. We hypothesize that fast typists may use different muscle recruitment patterns that allow them to be more efficient than slower typists at striking the keys. In addition, faster typists do not experience more discomfort than slow typists. These findings show that the relative pace of typing is more important than actual typing speed with regard to discomfort and muscle activity. These results suggest that typists may benefit from skill training to increase maximum typing speed. Potential applications of this research includes skill training for typists", "keyphrases": ["work pace effect", "EMG activity", "subjective discomfort", "finger flexor", "typing speed", "discomfort", "typists", "muscle recruitment patterns", "keying force", "skill training"], "prmu": ["R", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1784", "title": "CyberEthics bibliography 2002: a select list of recent works", "abstract": "Included in the 2002 annual bibliography update is a select list of recent books and conference proceedings that have been published since 2000. Also included is a select list of special issues of journals and periodicals that were recently published. For additional lists of recently published books and articles, see ibid. (June 2000, June 2001)", "keyphrases": ["CyberEthics bibliography", "2002 annual bibliography", "recent books", "conference proceedings", "special issues", "journals", "periodicals"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} +{"id": "175", "title": "Diagnostic expert system using non-monotonic reasoning", "abstract": "The objective of this work is to develop an expert system for cucumber disorder diagnosis using non-monotonic reasoning to handle the situation when the system cannot reach a conclusion. 
One reason for this situation is when the information is incomplete. Another reason is when the domain knowledge itself is incomplete. Another reason is when the information is inconsistent. This method maintains the truth of the system in case of changing a piece of information. The proposed method uses two types of non-monotonic reasoning namely: default reasoning and reasoning in the presence of inconsistent information to achieve its goal", "keyphrases": ["diagnostic expert system", "nonmonotonic reasoning", "cucumber disorder diagnosis", "incomplete information", "inconsistent information", "truth maintenance", "default reasoning", "agriculture"], "prmu": ["P", "M", "P", "R", "P", "M", "P", "U"]} +{"id": "1520", "title": "Uniform hyperbolic polynomial B-spline curves", "abstract": "This paper presents a new kind of uniform splines, called hyperbolic polynomial B-splines, generated over the space Omega =span{sinh t, cosh t, t/sup k-3/, t/sup k-3/, t/sup k-4/, ..., t 1} in which k is an arbitrary integer larger than or equal to 3. Hyperbolic polynomial B-splines share most of the properties of B-splines in polynomial space. We give subdivision formulae for this new kind of curve and then prove that they have variation diminishing properties and the control polygons of the subdivisions converge. Hyperbolic polynomial B-splines can handle freeform curves as well as remarkable curves such as the hyperbola and the catenary. The generation of tensor product surfaces using these new splines is straightforward. 
Examples of such tensor product surfaces: the saddle surface, the catenary cylinder, and a certain kind of ruled surface are given", "keyphrases": ["uniform hyperbolic polynomial B-spline curves", "arbitrary integer", "subdivision formulae", "control polygons", "subdivisions", "freeform curves", "hyperbola", "catenary", "tensor product surface generation", "saddle surface", "catenary cylinder", "ruled surface"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "P"]} +{"id": "1565", "title": "On lag windows connected with Jacobi polynomials", "abstract": "Lag windows whose corresponding spectral windows are Jacobi polynomials or sums of Jacobi polynomials are introduced. The bias and variance of their spectral density estimators are investigated and their window bandwidth and characteristic exponent are determined", "keyphrases": ["lag windows", "Jacobi polynomials", "spectral windows", "spectral density estimators", "window bandwidth", "characteristic exponent"], "prmu": ["P", "P", "P", "P", "P", "P"]} +{"id": "1699", "title": "Time-domain reconstruction for thermoacoustic tomography in a spherical geometry", "abstract": "Reconstruction-based microwave-induced thermoacoustic tomography in a spherical configuration is presented. Thermoacoustic waves from biological tissue samples excited by microwave pulses are measured by a wide-band unfocused ultrasonic transducer, which is set on a spherical surface enclosing the sample. Sufficient data are acquired from different directions to reconstruct the microwave absorption distribution. An exact reconstruction solution is derived and approximated to a modified backprojection algorithm. Experiments demonstrate that the reconstructed images agree well with the original samples. 
The spatial resolution of the system reaches 0.5 mm", "keyphrases": ["medical diagnostic imaging", "thermoacoustic tomography", "time-domain reconstruction", "modified backprojection algorithm", "exact reconstruction solution", "biological tissue samples", "wide-band unfocused ultrasonic transducer", "spherical surface enclosing sample", "reconstructed images", "system spatial resolution", "spherical geometry", "0.5 mm"], "prmu": ["M", "P", "P", "P", "P", "P", "P", "R", "P", "R", "P", "P"]} +{"id": "1621", "title": "Current-mode fully-programmable piece-wise-linear block for neuro-fuzzy applications", "abstract": "A new method to implement an arbitrary piece-wise-linear characteristic in current mode is presented. Each of the breaking points and each slope is separately controllable. As an example a block that implements an N-shaped piece-wise-linearity has been designed. The N-shaped block operates in the subthreshold region and uses only ten transistors. These characteristics make it especially suitable for large arrays of neuro-fuzzy systems where the number of transistors and power consumption per cell is an important concern. A prototype of this block has been fabricated in a 0.35 mu m CMOS technology. The functionality and programmability of this circuit has been verified through experimental results", "keyphrases": ["arbitrary piece-wise-linear characteristic", "current mode", "breaking points", "separately controllable", "N-shaped piece-wise-linearity", "VLSI", "subthreshold region", "neuro-fuzzy systems", "power consumption", "CMOS", "0.35 micron"], "prmu": ["P", "P", "P", "P", "P", "U", "P", "P", "P", "P", "M"]} +{"id": "1664", "title": "Disappointment reigns [retail IT]", "abstract": "CPFR remains at the forefront of CIOs' minds, but a number of barriers, such as secretive corporate cultures and spotty data integrity, stand between retail organizations and true supply-chain collaboration. 
CIOs remain vexed at these obstacles, as was evidenced at a roundtable discussion by retail and consumer-goods IT leaders at the Retail Systems 2002 conference, held in Chicago by the consultancy MoonWatch Media Inc., Newton Upper Falls, Mass. Other annoyances discussed by retail CIOs include poorly designed business processes and retail's poor image with the IT talent emerging from school into the job market", "keyphrases": ["retail", "MoonWatch Media", "Retail Systems 2002 conference", "CIOs", "collaborative planning forecasting and replenishment"], "prmu": ["P", "P", "P", "P", "M"]} +{"id": "1598", "title": "A decision support model for selecting product/service benefit positionings", "abstract": "The art (and science) of successful product/service positioning generally hinges on the firm's ability to select a set of attractively priced consumer benefits that are: valued by the buyer, distinctive in one or more respects, believable, deliverable, and sustainable (under actual or potential competitive abilities to imitate, neutralize, or overcome) in the target markets that the firm selects. For many years, the ubiquitous quadrant chart has been used to provide a simple graph of product/service benefits (usually called product/service attributes) described in terms of consumers' perceptions of the importance of attributes (to brand/supplier choice) and the performance of competing firms on these attributes. This paper describes a model that extends the quadrant chart concept to a decision support system that optimizes a firm's market share for a specified product/service. In particular, we describe a decision support model that utilizes relatively simple marketing research data on consumers' judged benefit importances, and supplier performances on these benefits to develop message components for specified target buyers. A case study is used to illustrate the model. 
The study deals with developing advertising message components for a relatively new entrant in the US air shipping market. We also discuss, more briefly, management reactions to application of the model to date, and areas for further research and model extension", "keyphrases": ["product/service benefit positionings", "decision support model", "attractively priced consumer benefits", "quadrant chart", "simple graph", "product/service attributes", "brand/supplier choice", "market share optimization", "marketing research data", "consumer judged benefit importances", "message components", "advertising message components", "US air shipping market", "management reactions", "advertising", "greedy heuristic", "optimal message design"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R", "P", "R", "P", "P", "P", "P", "P", "U", "M"]} +{"id": "188", "title": "Sampled-data implementation of a gain scheduled controller", "abstract": "A continuous-time gain-scheduled controller must be transformed to a corresponding discrete-time controller for sampled-data implementation. We show that certain linearization properties of a continuous-time gain scheduled controller are inherited by its sampled-data implementation. 
We also show that a similar relationship exists for multi-rate gain scheduled controllers arising in flight control applications", "keyphrases": ["gain scheduled controller", "sampled-data implementation", "continuous-time gain-scheduled controller", "discrete-time controller", "linearization properties", "multi-rate gain scheduled controllers", "flight control applications"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} +{"id": "1786", "title": "A humane tool for aiding computer science advisors, computer science students, and parents", "abstract": "Over the past few years, the computer science department faculty at Baylor has observed that some students who perform adequately during the freshman and sophomore years have substantial difficulty during the junior and senior years of study. Baylor University is an institution committed to being caring of its students. The objective for this study grew out of these two realities. There are three objectives of this research. One objective is to identify students, no later than the sophomore year, who are less likely to succeed as computer science majors. A second objective is to accomplish this identification by using data from seniors majoring in computer science. A third objective is to begin to use this information at the end of their sophomore year when meeting with a computer science faculty advisor. A regression study is conducted on the data from all students classified as seniors, majoring in computer science in May 2001, showing grades in six freshman and sophomore courses, and showing grades for at least five junior or senior level computer science courses. 
These students and their course performance data constituted the study sample", "keyphrases": ["humane tool", "computer science advisors", "computer science students", "parents", "Baylor University", "student care", "sophomore year", "computer science majors", "regression study", "course performance data"], "prmu": ["P", "P", "P", "P", "P", "R", "P", "P", "P", "P"]} +{"id": "1815", "title": "Control of integral processes with dead-time. 1. Disturbance observer-based 2 DOF control scheme", "abstract": "A disturbance observer-based control scheme (a version of 2 DOF internal model control) which is very effective in controlling integral processes with dead time is presented. The controller can be designed to reject ramp disturbances as well as step disturbances and even arbitrary disturbances. When the plant model is available only two parameters are left to tune. One is the time constant of the set-point response and the other is the time constant of the disturbance response. The latter is tuned according to the compromise between disturbance response and robustness. This control scheme has a simple, clear, easy-to-design, easy-to-implement structure and good performance. It is compared to the best results (so far) using some simulation examples", "keyphrases": ["integral processes", "dead-time", "disturbance observer-based 2 DOF control scheme", "2 DOF internal model control", "ramp disturbances rejection", "set-point response", "time constant", "disturbance response", "robustness"], "prmu": ["P", "P", "P", "P", "R", "P", "P", "P", "P"]} +{"id": "1487", "title": "Assessment of prehospital chest pain using telecardiology", "abstract": "Two hundred general practitioners were equipped with a portable electrocardiograph which could transmit a 12-lead electrocardiogram (ECG) via a telephone line. A cardiologist was available 24 h a day for an interactive teleconsultation. 
In a 13 month period there were 5073 calls to the telecardiology service and 952 subjects with chest pain were identified. The telecardiology service allowed the general practitioners to manage 700 cases (74%) themselves; further diagnostic tests were requested for 162 patients (17%) and 83 patients (9%) were sent to the hospital emergency department. In the last group a cardiological diagnosis was confirmed in 60 patients and refuted in 23. Seven patients in whom the telecardiology service failed to detect a cardiac problem were hospitalized in the subsequent 48 h. The telecardiology service showed a sensitivity of 97.4%, a specificity of 89.5% and a diagnostic accuracy of 86.9% for chest pain. Telemedicine could be a useful tool in the diagnosis of chest pain in primary care", "keyphrases": ["prehospital chest pain assessment", "telecardiology", "general practitioners", "portable electrocardiograph", "electrocardiogram transmission", "telephone line", "interactive teleconsultation", "patients", "diagnostic tests", "hospital emergency department", "sensitivity", "specificity", "diagnostic accuracy", "primary care", "13 month"], "prmu": ["R", "P", "P", "P", "M", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1850", "title": "The n-tier hub technology", "abstract": "During 2001, the Enterprise Engineering Laboratory at George Mason University was contracted by the Boeing Company to develop an eHub capability for aerospace suppliers in Taiwan. In a laboratory environment, the core technology was designed, developed, and tested, and now a large first-tier aerospace supplier in Taiwan is commercializing the technology. The project objective was to provide layered network and application services for transporting XML-based business transaction flows across multi-tier, heterogeneous data processing environments. This paper documents the business scenario, the eHub application, and the network transport mechanisms that were used to build the n-tier hub. 
In contrast to most eHubs, this solution takes the point of view of suppliers, pushing data in accordance with supplier requirements; hence, enhancing the probability of supplier adoption. The unique contribution of this project is the development of an eHub that meets the needs of small and medium enterprises (SMEs) and first-tier suppliers", "keyphrases": ["n-tier hub technology", "aerospace suppliers", "Boeing Company", "Taiwan", "XML-based business transaction flows", "multi-tier heterogeneous data processing environments", "business scenario", "network transport mechanisms", "supplier adoption", "small and medium enterprises", "first-tier suppliers"], "prmu": ["P", "P", "P", "P", "P", "R", "P", "P", "P", "P", "P"]} +{"id": "1908", "title": "Explicit solutions for transcendental equations", "abstract": "A simple method to formulate an explicit expression for the roots of any analytic transcendental function is presented. The method is based on Cauchy's integral theorem and uses only basic concepts of complex integration. A convenient method for numerically evaluating the exact expression is presented. The application of both the formulation and evaluation of the exact expression is illustrated for several classical root finding problems", "keyphrases": ["analytic functions", "transcendental equations", "Cauchy integral theorem", "complex integration", "root finding", "singularity", "polynomial", "Fourier transform"], "prmu": ["R", "P", "R", "P", "P", "U", "U", "U"]} +{"id": "1623", "title": "Transmission of real-time video over IP differentiated services", "abstract": "Multimedia applications require high bandwidth and guaranteed quality of service (QoS). The current Internet, which provides 'best effort' services, cannot meet the stringent QoS requirements for delivering MPEG videos. It is proposed that MPEG frames are transported through various service models of DiffServ. 
Performance analysis and simulation results show that the proposed approach can not only guarantee QoS but can also achieve high bandwidth utilisation", "keyphrases": ["IP differentiated services", "real-time video transmission", "multimedia applications", "quality of service", "QoS guarantees", "Internet", "MPEG video", "DiffServ", "high bandwidth utilisation"], "prmu": ["P", "R", "P", "P", "R", "P", "P", "P", "P"]} +{"id": "1666", "title": "Airline base schedule optimisation by flight network annealing", "abstract": "A system for rigorous airline base schedule optimisation is described. The architecture of the system reflects the underlying problem structure. The architecture is hierarchical consisting of a master problem for logical aircraft schedule optimisation and a sub-problem for schedule evaluation. The sub-problem is made up of a number of component sub-problems including connection generation, passenger choice modelling, passenger traffic allocation by simulation and revenue and cost determination. Schedule optimisation is carried out by means of simulated annealing of flight networks. 
The operators for the simulated annealing process are feasibility preserving and form a complete set of operators", "keyphrases": ["airline base schedule optimisation", "flight network annealing", "system architecture", "hierarchical architecture", "master problem", "logical aircraft schedule optimisation", "schedule evaluation", "connection generation", "passenger choice modelling", "passenger traffic allocation", "cost determination", "simulated annealing", "operators", "time complexity"], "prmu": ["P", "P", "R", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U"]} +{"id": "177", "title": "Turning telecommunications call details to churn prediction: a data mining approach", "abstract": "As deregulation, new technologies, and new competitors open up the mobile telecommunications industry, churn prediction and management has become of great concern to mobile service providers. A mobile service provider wishing to retain its subscribers needs to be able to predict which of them may be at-risk of changing services and will make those subscribers the focus of customer retention efforts. In response to the limitations of existing churn-prediction systems and the unavailability of customer demographics in the mobile telecommunications provider investigated, we propose, design, and experimentally evaluate a churn-prediction technique that predicts churning from subscriber contractual information and call pattern changes extracted from call details. This proposed technique is capable of identifying potential churners at the contract level for a specific prediction time-period. In addition, the proposed technique incorporates the multi-classifier class-combiner approach to address the challenge of a highly skewed class distribution between churners and non-churners. 
The empirical evaluation results suggest that the proposed call-behavior-based churn-prediction technique exhibits satisfactory predictive effectiveness when more recent call details are employed for the churn prediction model construction. Furthermore, the proposed technique is able to demonstrate satisfactory or reasonable predictive power within the one-month interval between model construction and churn prediction. Using a previous demographics-based churn-prediction system as a reference, the lift factors attained by our proposed technique appear largely satisfactory", "keyphrases": ["telecommunications call details", "mobile telecommunications industry", "mobile service providers", "deregulation", "customer retention efforts", "customer demographics", "subscriber contractual information", "call pattern changes", "multi-classifier class-combiner approach", "skewed class distribution", "lift factors", "decision tree induction"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U"]} +{"id": "1522", "title": "Waltzing through Port 80 [Web security]", "abstract": "Web services follow the trusting model of the Internet, but allow ever more powerful payloads to travel between businesses and consumers. Before you leap online, the author advises to scan the security concerns and the available fixes. He looks at how we define and store Web services and incorporate them into business processes", "keyphrases": ["Web services", "Internet", "trust", "data security", "business processes"], "prmu": ["P", "P", "P", "M", "P"]} +{"id": "1567", "title": "Asymptotic expansions for the zeros of certain special functions", "abstract": "We derive asymptotic expansions for the zeros of the cosine-integral Ci(x) and the Struve function H/sub 0/(x), and extend the available formulae for the zeros of Kelvin functions. 
Numerical evidence is provided to illustrate the accuracy of the expansions", "keyphrases": ["asymptotic expansions", "zeros", "cosine-integral", "Struve function", "Kelvin functions", "accuracy"], "prmu": ["P", "P", "P", "P", "P", "P"]} +{"id": "18", "title": "Differential and integral calculus on discrete time series data", "abstract": "It has been found that discontinuity plays a crucial role in natural evolutions (Lin 1998). In this presentation, we will generalize the idea of integration and differentiation, we developed in calculus, to the study of time series in the hope that the problem of outliers and discontinuities can be resolved more successfully than simply deleting the outliers and avoiding discontinuities from the overall data analysis. In general, appearances of outliers tend to mean existence of discontinuities, explosive growth or decline in the evolution. At the same time, our approach can be employed to partially overcome the problem of not having enough data values in any available time series. At the end, we will look at some real-life problems of prediction in order to see the power of this new approach", "keyphrases": ["natural evolutions", "integration", "differentiation", "time series", "outliers", "prediction"], "prmu": ["P", "P", "P", "P", "P", "P"]} +{"id": "1828", "title": "Exploiting structure in quantified formulas", "abstract": "We study the computational problem \"find the value of the quantified formula obtained by quantifying the variables in a sum of terms.\" The \"sum\" can be based on any commutative monoid, the \"quantifiers\" need only satisfy two simple conditions, and the variables can have any finite domain. This problem is a generalization of the problem \"given a sum-of-products of terms, find the value of the sum\" studied by R.E. Stearns and H.B. Hunt III (1996). 
A data structure called a \"structure tree\" is defined which displays information about \"subproblems\" that can be solved independently during the process of evaluating the formula. Some formulas have \"good\" structure trees which enable certain generic algorithms to evaluate the formulas in significantly less time than by brute force evaluation. By \"generic algorithm,\" we mean an algorithm constructed from uninterpreted function symbols, quantifier symbols, and monoid operations. The algebraic nature of the model facilitates a formal treatment of \"local reductions\" based on the \"local replacement\" of terms. Such local reductions \"preserve formula structure\" in the sense that structure trees with nice properties transform into structure trees with similar properties. These local reductions can also be used to transform hierarchical specified problems with useful structure into hierarchically specified problems having similar structure", "keyphrases": ["quantified formulas", "structure exploitation", "commutative monoid", "data structure", "structure tree", "satisfiability problems", "constraint satisfaction problems", "dynamic programming", "computational complexity", "generic algorithms", "function symbols", "quantifier symbols", "monoid operations", "hierarchically specified problems"], "prmu": ["P", "R", "P", "P", "P", "R", "M", "U", "M", "P", "P", "P", "P", "P"]} +{"id": "1746", "title": "The exact solution of coupled thermoelectroelastic behavior of piezoelectric laminates", "abstract": "Exact solutions for static analysis of thermoelectroelastic laminated plates are presented. In this analysis, a new concise procedure for the analytical solution of composite laminated plates with piezoelectric layers is developed. A simple eigenvalue formula in real number form is directly developed from the basic coupled piezoelectric differential equations and the difficulty of treating imaginary eigenvalues is avoided. 
The solution is defined in the trigonometric series and can be applied to thin and thick plates. Numerical studies are conducted on a five-layer piezoelectric plate and the complexity of stresses and deformations under combined loading is illustrated. The results could be used as a benchmark for assessing any numerical solution by approximate approaches such as the finite element method while also providing useful physical insight into the behavior of piezoelectric plates in a thermal environment", "keyphrases": ["exact solution", "coupled thermoelectroelastic behavior", "piezoelectric laminates", "thermoelectroelastic laminated plates", "analytical solution", "composite laminated plates", "piezoelectric layers", "eigenvalue formula", "real number form", "coupled piezoelectric differential equations", "trigonometric series", "thin plates", "thick plates", "five-layer piezoelectric plate", "numerical study", "stresses", "deformations", "combined loading", "finite element method", "thermal environment"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1703", "title": "Statistical analysis of nonlinearly reconstructed near-infrared tomographic images. II. Experimental interpretation", "abstract": "For pt. I see ibid., vol. 21, no. 7, p. 755-63 (2002). Image error analysis of a diffuse near-infrared tomography (NIR) system has been carried out on simulated data using a statistical approach described in pt. I of this paper (Pogue et al., 2002). The methodology is used here with experimental data acquired on phantoms with a prototype imaging system intended for characterizing breast tissue. Results show that imaging performance is not limited by random measurement error, but rather by calibration issues. 
The image error over the entire field of view is generally not minimized when an accurate homogeneous estimate of the phantom properties is available; however, local image error over a target region of interest (ROI) is reduced. The image reconstruction process which includes a Levenberg-Marquardt style regularization provides good minimization of the objective function, yet its reduction is not always correlated with an overall image error decrease. Minimization of the bias in an ROI which contains localized changes in the optical properties can be achieved through five to nine iterations of the algorithm. Precalibration of the algorithm through statistical evaluation of phantom studies may provide a better measure of the image accuracy than that implied by minimization of the standard objective function", "keyphrases": ["medical diagnostic imaging", "nonlinearly reconstructed near-infrared tomographic images", "image error", "algorithm precalibration", "hemoglobin", "random measurement error", "target region of interest", "accurate homogeneous estimate", "phantom properties", "Levenberg-Marquardt style regularization", "bias minimization", "algorithm iterations", "objective function minimization"], "prmu": ["M", "P", "P", "R", "U", "P", "P", "P", "P", "P", "R", "R", "R"]} +{"id": "1890", "title": "Robustness of trajectories with finite time extent", "abstract": "The problem of estimating perturbation bounds of finite trajectories is considered. The trajectory is assumed to be generated by a linear system with uncertainty characterized in terms of integral quadratic constraints. It is shown that such perturbation bounds can be obtained as the solution to a nonconvex quadratic optimization problem, which can be addressed using Lagrange relaxation. 
The result can be used in robustness analysis of hybrid systems and switched dynamical systems", "keyphrases": ["trajectories robustness", "finite time extent", "perturbation bounds", "linear system", "uncertainty", "integral quadratic constraints", "nonconvex quadratic optimization problem", "Lagrange relaxation", "robustness analysis", "hybrid systems", "switched dynamical systems"], "prmu": ["R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1911", "title": "Pulmonary perfusion patterns and pulmonary arterial pressure", "abstract": "Uses artificial intelligence methods to determine whether quantitative parameters describing the perfusion image can be synthesized to make a reasonable estimate of the pulmonary arterial (PA) pressure measured at angiography. Radionuclide perfusion images were obtained in 120 patients with normal chest radiographs who also underwent angiographic PA pressure measurement within 3 days of the radionuclide study. An artificial neural network (ANN) was constructed from several image parameters describing statistical and boundary characteristics of the perfusion images. With use of a leave-one-out cross-validation technique, this method was used to predict the PA systolic pressure in cases on which the ANN had not been trained. A Pearson correlation coefficient was determined between the predicted and measured PA systolic pressures. ANN predictions correlated with measured pulmonary systolic pressures (r=0.846, P<.001). The accuracy of the predictions was not influenced by the presence of pulmonary embolism. None of the 51 patients with predicted PA pressures of less than 29 mm Hg had pulmonary hypertension at angiography. All 13 patients with predicted PA pressures greater than 48 mm Hg had pulmonary hypertension at angiography. Meaningful information regarding PA pressure can be derived from noninvasive radionuclide perfusion scanning. 
The use of image analysis in concert with artificial intelligence methods helps to reveal physiologic information not readily apparent at visual image inspection", "keyphrases": ["pulmonary perfusion patterns", "angiographic pulmonary arterial pressure measurement", "artificial neural network predictions", "accuracy", "pulmonary embolism", "pulmonary hypertension", "noninvasive radionuclide perfusion scanning", "image analysis", "physiologic information", "visual image inspection", "image parameters", "statistical characteristics", "boundary characteristics", "leave-one-out cross-validation technique", "pulmonary arterial systolic pressure", "Pearson correlation coefficient", "artificial intelligence methods", "quantitative parameters", "perfusion image", "angiography", "radionuclide perfusion images", "patients", "normal chest radiographs", "29 Pa", "48 Pa"], "prmu": ["P", "R", "R", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R"]} +{"id": "1583", "title": "Cutting through the confusion [workflow & content management]", "abstract": "Information management vendors are rushing to re-position themselves and put a portal spin on their products, says ITNET's Graham Urquhart. The result is confusion, with a range of different definitions and claims clouding the true picture", "keyphrases": ["ITNET", "portals", "collaboratively", "workflow"], "prmu": ["P", "P", "U", "P"]} +{"id": "1682", "title": "Data mining efforts increase business productivity and efficiency", "abstract": "The use and acquisition of information is a key part of the way any business makes money. Data mining technologies provide greater insight into how this information can be better used and more effectively acquired. 
Steven Kudyba, an expert in the field of data mining technologies, shares his expertise in an interview", "keyphrases": ["data mining", "productivity", "efficiency"], "prmu": ["P", "P", "P"]} +{"id": "1463", "title": "Computational complexity of probabilistic disambiguation", "abstract": "Recent models of natural language processing employ statistical reasoning for dealing with the ambiguity of formal grammars. In this approach, statistics, concerning the various linguistic phenomena of interest, are gathered from actual linguistic data and used to estimate the probabilities of the various entities that are generated by a given grammar, e.g., derivations, parse-trees and sentences. The extension of grammars with probabilities makes it possible to state ambiguity resolution as a constrained optimization formula, which aims at maximizing the probability of some entity that the grammar generates given the input (e.g., maximum probability parse-tree given some input sentence). The implementation of these optimization formulae in efficient algorithms, however, does not always proceed smoothly. In this paper, we address the computational complexity of ambiguity resolution under various kinds of probabilistic models. We provide proofs that some, frequently occurring problems of ambiguity resolution are NP-complete. These problems are encountered in various applications, e.g., language understanding for text- and speech-based applications. 
Assuming the common model of computation, this result implies that, for many existing probabilistic models, it is not possible to devise tractable algorithms for solving these optimization problems", "keyphrases": ["natural language processing", "statistical reasoning", "formal grammars", "statistics", "computational complexity", "probabilistic disambiguation", "NP-completeness results", "parsing problems", "speech processing", "state ambiguity resolution", "constrained optimization formula", "probabilistic models", "language understanding"], "prmu": ["P", "P", "P", "P", "P", "P", "R", "M", "M", "P", "P", "P", "P"]} +{"id": "1762", "title": "Laguerre pseudospectral method for nonlinear partial differential equations", "abstract": "The Laguerre Gauss-Radau interpolation is investigated. Some approximation results are obtained. As an example, the Laguerre pseudospectral scheme is constructed for the BBM equation. The stability and the convergence of the proposed scheme are proved. The numerical results show the high accuracy of this approach", "keyphrases": ["Laguerre pseudospectral method", "nonlinear partial differential equations", "Laguerre Gauss-Radau interpolation", "approximation results", "BBM equation", "stability", "numerical results", "nonlinear differential equations"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R"]} +{"id": "1727", "title": "Linguistic knowledge and new technologies", "abstract": "Modern language studies are characterized by a variety of forms, ways, and methods of their development. In this connection, it is necessary to specify the problem of the development of their internal differentiation and classification, which lead to the formation of specific areas of knowledge. 
An example of such an area is speechology-a field of science belonging to fundamental, theoretical, and applied linguistics", "keyphrases": ["modern language studies", "internal differentiation", "internal classification", "speechology", "applied linguistics", "theoretical linguistics", "fundamental linguistics", "linguistic knowledge"], "prmu": ["P", "P", "R", "U", "P", "R", "R", "P"]} +{"id": "1849", "title": "An active functionality service for e-business applications", "abstract": "Service based architectures are a powerful approach to meet the fast evolution of business rules and the corresponding software. An active functionality service that detects events and involves the appropriate business rules is a critical component of such a service-based middleware architecture. In this paper we present an active functionality service that is capable of detecting events in heterogeneous environments, it uses an integral ontology-based approach for the semantic interpretation of heterogeneous events and data, and provides notifications through a publish/subscribe notification mechanism. The power of this approach is illustrated with the help of an auction application and through the personalization of car and driver portals in Internet-enabled vehicles", "keyphrases": ["active functionality service", "e-business applications", "business rules", "software", "event detection", "service-based middleware architecture", "heterogeneous environments", "ontology based approach", "semantic interpretation", "publish/subscribe notification mechanism", "auction application", "personalized car portals", "personalized driver portals", "Internet-enabled vehicles"], "prmu": ["P", "P", "P", "P", "R", "P", "P", "M", "P", "P", "P", "R", "R", "P"]} +{"id": "1831", "title": "Fast broadcasting and gossiping in radio networks", "abstract": "We establish an O(n log/sup 2/ n) upper bound on the time for deterministic distributed broadcasting in multi-hop radio networks with unknown topology. 
This nearly matches the known lower bound of Omega (n log n). The fastest previously known algorithm for this problem works in time O(n/sup 3/2/). Using our broadcasting algorithm, we develop an O(n/sup 3/2/ log/sup 2/ n) algorithm for gossiping in the same network model", "keyphrases": ["fast broadcasting", "upper bound", "deterministic distributed broadcasting", "gossiping", "radio networks"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1874", "title": "E - a brainiac theorem prover", "abstract": "We describe the superposition-based theorem prover E. E is a sound and complete prover for clausal first order logic with equality. Important properties of the prover include strong redundancy elimination criteria, the DISCOUNT loop proof procedure, a very flexible interface for specifying search control heuristics, and an efficient inference engine. We also discuss the strengths and weaknesses of the system", "keyphrases": ["brainiac theorem prover", "CASC", "superposition-based theorem prover", "E automatic theorem prover", "rewriting", "completeness", "soundness", "clausal first order logic", "equality", "strong redundancy elimination criteria", "DISCOUNT", "CADE ATP System Competitions", "loop proof procedure", "search control heuristics", "inference engine"], "prmu": ["P", "U", "P", "M", "U", "P", "P", "P", "P", "P", "P", "M", "P", "P", "P"]} +{"id": "1889", "title": "Sliding mode dynamics in continuous feedback control for distributed discrete-event scheduling", "abstract": "A continuous feedback control approach for real-time scheduling of discrete events is presented motivated by the need for control theoretic techniques to analyze and design such systems in distributed manufacturing applications. These continuous feedback control systems exhibit highly nonlinear and discontinuous dynamics. Specifically, when the production demand in the manufacturing system exceeds the available resource capacity then the control system \"chatters\" and exhibits sliding modes. 
This sliding mode behavior is advantageously used in the scheduling application by allowing the system to visit different schedules within an infinitesimal region near the sliding surface. In the paper, an analytical model is developed to characterize the sliding mode dynamics. This model is then used to design controllers in the sliding mode domain to improve the effectiveness of the control system to \"search\" for schedules with good performance. Computational results indicate that the continuous feedback control approach can provide near-optimal schedules and that it is computationally efficient compared to existing scheduling techniques", "keyphrases": ["sliding mode dynamics", "continuous feedback control", "distributed discrete-event scheduling", "real-time scheduling", "control theoretic techniques", "distributed manufacturing applications", "highly nonlinear discontinuous dynamics", "production demand", "resource capacity"], "prmu": ["P", "P", "P", "P", "P", "P", "R", "P", "P"]} +{"id": "153", "title": "On the relationship between omega -automata and temporal logic normal forms", "abstract": "We consider the relationship between omega -automata and a specific logical formulation based on a normal form for temporal logic formulae. While this normal form was developed for use with execution and clausal resolution in temporal logics, we show how it can represent, syntactically, omega -automata in a high-level way. Technical proofs of the correctness of this representation are given", "keyphrases": ["omega -automata", "temporal logic normal forms", "logical formulation", "clausal resolution", "program correctness"], "prmu": ["P", "P", "P", "P", "M"]} +{"id": "1506", "title": "Intelligent control of life support for space missions", "abstract": "Future manned space operations will include a greater use of automation than we currently see. 
For example, semiautonomous robots and software agents will perform difficult tasks while operating unattended most of the time. As these automated agents become more prevalent, human contact with them will occur more often and become more routine, so designing these automated agents according to the principles of human-centered computing is important. We describe two cases of semiautonomous control software developed and fielded in test environments at the NASA Johnson Space Center. This software operated continuously at the JSC and interacted closely with humans for months at a time", "keyphrases": ["life support", "software agents", "semiautonomous robots", "space missions", "intelligent control", "manned space operations", "automation", "automated agents", "semiautonomous control software", "NASA Johnson Space Center", "crew air regeneration", "crew water recovery", "human intervention"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U", "U", "M"]} +{"id": "1543", "title": "RISCy business. Part 1: RISC projects by Cornell students", "abstract": "The author looks at several projects that Cornell University students entered in the Atmel Design 2001 contest. Those covered include a vertical plotter; BiLines, an electronic game; a wireless Internet pager; Cooking Coach; Barbie's zip drive; and a model train controller", "keyphrases": ["Atmel's Design Logic 2001 contest", "RISC projects", "Cornell students", "vertical plotter", "BiLines", "electronic game", "wireless Internet pager", "Cooking Coach", "Barbie's zip drive", "model train controller"], "prmu": ["M", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1607", "title": "A solvable queueing network model for railway networks and its validation and applications for the Netherlands", "abstract": "The performance of new railway networks cannot be measured or simulated, as no detailed train schedules are available. 
Railway infrastructure and capacities are to be determined long before the actual traffic is known. This paper therefore proposes a solvable queueing network model to compute performance measures of interest without requiring train schedules (timetables). Closed form expressions for mean delays are obtained. New network designs, traffic scenarios, and capacity expansions can thus be evaluated. A comparison with real delay data for the Netherlands supports the practical value of the model. A special Dutch cargo-line application is included", "keyphrases": ["railway networks", "solvable queueing network model", "Netherlands", "railway infrastructure", "railway capacities", "performance measures", "closed form expressions", "mean delays", "network designs", "traffic scenarios", "capacity expansions", "Dutch cargo-line application"], "prmu": ["P", "P", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1642", "title": "Development and validation of user-adaptive navigation and information retrieval tools for an intranet portal organizational memory information system", "abstract": "Based on previous research and properties of organizational memory, a conceptual model for navigation and retrieval functions in an intranet portal organizational memory information system was proposed, and two human-centred features (memory structure map and history-based tool) were developed to support user's navigation and retrieval in a well-known organizational memory. To test two hypotheses concerning the validity of the conceptual model and two human-centred features, an experiment was conducted with 30 subjects. Testing of the two hypotheses indicated the following: (1) the memory structure map's users showed 29% better performance in navigation, and (2) the history-based tool's users outperformed by 34% in identifying information. 
The results of the study suggest that a conceptual model and two human-centred features could be used in a user-adaptive interface design to improve user's performance in an intranet portal organizational memory information system", "keyphrases": ["user-adaptive navigation", "information retrieval tools", "intranet portal", "organizational memory information system", "conceptual model", "human factors", "memory structure map", "history-based tool", "experiment", "user-adaptive interface design", "user performance"], "prmu": ["P", "P", "P", "P", "P", "U", "P", "P", "P", "P", "R"]} +{"id": "1766", "title": "A note on vector cascade algorithm", "abstract": "The focus of this paper is on the relationship between accuracy of multivariate refinable vector and vector cascade algorithm. We show that, if the vector cascade algorithm (1.5) with isotropic dilation converges to a vector-valued function with regularity, then the initial function must satisfy the Strang-Fix conditions", "keyphrases": ["vector cascade algorithm", "multivariate refinable vector", "matrix algebra", "isotropic dilation", "vector-valued function", "Strang-fix conditions"], "prmu": ["P", "P", "U", "P", "P", "P"]} +{"id": "1723", "title": "Positive productivity, better billing [health care]", "abstract": "Workflow software provides the right communication solution for hospital specialists, and delivers an unexpected financial boost too", "keyphrases": ["health care", "San Francisco General Hospital", "ProVation MD", "workflow software"], "prmu": ["P", "M", "U", "P"]} +{"id": "1808", "title": "Nonlinearities in NARX polynomial models: representation and estimation", "abstract": "It is shown how nonlinearities are mapped in NARX polynomial models. General expressions are derived for the gain and eigenvalue functions in terms of the regressors and coefficients of NARX models. Such relationships are useful in grey-box identification problems. 
The results are illustrated using simulated and real data", "keyphrases": ["NARX polynomial model nonlinearities", "nonlinearity representation", "nonlinearity estimation", "gain functions", "eigenvalue functions", "regressors", "grey-box identification problems", "nonlinear autoregressive exogenous-input polynomial model"], "prmu": ["R", "R", "R", "R", "P", "P", "P", "M"]} +{"id": "1467", "title": "Utilizing Web-based case studies for cutting-edge information services issues", "abstract": "This article reports on a pilot study conducted by the Academic Libraries of the 21st Century project team to determine whether the benefits of the case study method as a training framework for change initiatives could successfully transfer from the traditional face-to-face format to a virtual format. Methods of developing the training framework, as well as the benefits, challenges, and recommendations for future strategies gained from participant feedback are outlined. The results of a survey administered to chat session registrants are presented in three sections: (1) evaluation of the training framework; (2) evaluation of participants' experiences in the virtual environment; and (3) a comparison of participants' preference of format. 
The overall participant feedback regarding the utilization of the case study method in a virtual environment for professional development and collaborative problem solving is very positive", "keyphrases": ["Web-based case studies", "cutting-edge information services", "academic libraries", "training", "change initiatives", "survey", "virtual environment", "professional development", "Internet", "collaborative problem solving"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "U", "P"]} +{"id": "1686", "title": "Internet infrastructure and the emerging information society: an appraisal of the Internet backbone industry", "abstract": "This paper examines the real constraints to the expansion of all encumbering and all pervasive information technology in our contemporary society. Perhaps the U.S. Internet infrastructure is the most appropriate to examine since it is U.S. technology that has led the world into the Internet age. In this context, this paper reviews the state of the U.S. Internet backbone that will lead us into information society of the future by facilitating massive data transmission", "keyphrases": ["Internet infrastructure", "Internet service providers", "users", "backbone companies", "local telephone companies"], "prmu": ["P", "M", "U", "M", "U"]} +{"id": "1915", "title": "Multichannel scaler for general statistical analysis of dynamic light scattering", "abstract": "A four channel scaler for counting applications has been designed and built using a standard high transfer rate parallel computer interface bus parallel data card. The counter section is based on standard complex programmable logic device integrated circuits. With a 200 MHz Pentium based host PC a sustained counting and data transfer with channel widths as short as 200 ns for a single channel is realized. The use of the multichannel scaler is demonstrated in dynamic light scattering experiments. 
The recorded traces are analyzed with wavelet and other statistical techniques to obtain transient changes in the properties of the scattered light", "keyphrases": ["multichannel scaler", "general statistical analysis", "dynamic light scattering", "correlation spectroscopy", "optical spectroscopic techniques", "photon signal statistical properties", "four channel scaler", "standard high transfer rate parallel computer interface", "interface bus parallel data card", "complex programmable logic device", "standard CPLD ICs", "Pentium based host PC", "windowed Fourier transform", "200 MHz", "200 ns"], "prmu": ["P", "P", "P", "U", "M", "M", "P", "P", "P", "P", "M", "P", "U", "P", "P"]} +{"id": "1603", "title": "Exploiting structure in adaptive dynamic programming algorithms for a stochastic batch service problem", "abstract": "The purpose of this paper is to illustrate the importance of using structural results in dynamic programming algorithms. We consider the problem of approximating optimal strategies for the batch service of customers at a service station. Customers stochastically arrive at the station and wait to be served, incurring a waiting cost and a service cost. Service of customers is performed in groups of a fixed service capacity. We investigate the structure of cost functions and establish some theoretical results including monotonicity of the value functions. Then, we use our adaptive dynamic programming monotone algorithm that uses structure to preserve monotonicity of the estimates at each iterations to approximate the value functions. Since the problem with homogeneous customers can be solved optimally, we have a means of comparison to evaluate our heuristic. 
Finally, we compare our algorithm to classical forward dynamic programming methods", "keyphrases": ["stochastic batch service problem", "adaptive dynamic programming algorithms", "structural results", "optimal strategy approximation", "service station", "waiting cost", "service cost", "fixed service capacity", "cost function structure", "value function monotonicity", "inventory theory"], "prmu": ["P", "P", "P", "R", "P", "P", "P", "P", "R", "R", "U"]} +{"id": "1646", "title": "The limits of shape constancy: point-to-point mapping of perspective projections of flat figures", "abstract": "The present experiments investigate point-to-point mapping of perspective transformations of 2D outline figures under diverse viewing conditions: binocular free viewing, monocular perspective with 2D cues masked by an optic tunnel, and stereoptic viewing through an optic tunnel. The first experiment involved upright figures, and served to determine baseline point-to-point mapping accuracy, which was found to be very good. Three shapes were used: square, circle and irregularly round. The main experiment, with slanted figures, involved only two shapes-square and irregularly shaped-showed at several slant degrees. Despite the accumulated evidence for shape constancy when the outline of perspective projections is considered, metric perception of the inner structure of such projections was quite limited. Systematic distortions were found, especially with more extreme slants, and attributed to the joint effect of several factors: anchors, 3D information, and slant underestimation. 
Contradictory flatness cues did not detract from performance, while stereoptic information improved it", "keyphrases": ["shape constancy", "point-to-point mapping", "flat figure perspective projections", "experiments", "2D outline figures", "diverse viewing conditions", "binocular free viewing", "monocular perspective", "2D cues", "optic tunnel", "stereoptic viewing", "3D shape perception", "human factors", "3D information displays", "anchors", "3D information", "slant underestimation"], "prmu": ["P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "M", "P", "P", "P"]} +{"id": "1928", "title": "Solution of a Euclidean combinatorial optimization problem by the dynamic-programming method", "abstract": "A class of Euclidean combinatorial optimization problems is selected that can be solved by the dynamic programming method. The problem of allocation of servicing enterprises is solved as an example", "keyphrases": ["Euclidean combinatorial optimization problem", "dynamic programming method"], "prmu": ["P", "P"]} +{"id": "157", "title": "Automatic extraction of eye and mouth fields from a face image using eigenfeatures and ensemble networks", "abstract": "This paper presents a novel algorithm for the extraction of the eye and mouth (facial features) fields from 2D gray level images. Eigenfeatures are derived from the eigenvalues and eigenvectors of the binary edge data set constructed from eye and mouth fields. Such eigenfeatures are ideal features for finely locating fields efficiently. The eigenfeatures are extracted from a set of the positive and negative training samples for facial features and are used to train a multilayer perceptron (MLP) whose output indicates the degree to which a particular image window contains the eyes or the mouth within itself. An ensemble network consisting of a multitude of independent MLPs was used to enhance the generalization performance of a single MLP. 
It was experimentally verified that the proposed algorithm is robust against facial size and even slight variations of the pose", "keyphrases": ["eye field extraction", "mouth field extraction", "face feature extraction", "2D gray level images", "eigenvalues", "eigenvectors", "binary edge data set", "training samples", "multilayer perceptron", "generalization", "experiment", "eigenfeatures", "ensemble neural networks"], "prmu": ["R", "R", "R", "P", "P", "P", "P", "P", "P", "P", "U", "P", "M"]} +{"id": "1502", "title": "Mining open answers in questionnaire data", "abstract": "Surveys are important tools for marketing and for managing customer relationships; the answers to open-ended questions, in particular, often contain valuable information and provide an important basis for business decisions. The summaries that human analysts make of these open answers, however, tend to rely too much on intuition and so aren't satisfactorily reliable. Moreover, because the Web makes it so easy to take surveys and solicit comments, companies are finding themselves inundated with data from questionnaires and other sources. Handling it all manually would be not only cumbersome but also costly. Thus, devising a computer system that can automatically mine useful information from open answers has become an important issue. We have developed a survey analysis system that works on these principles. 
The system mines open answers through two statistical learning techniques: rule learning (which we call rule analysis) and correspondence analysis", "keyphrases": ["natural language response analysis", "survey analysis", "text mining system", "questionnaire data", "statistical learning techniques", "rule analysis", "correspondence analysis", "open answer mining"], "prmu": ["M", "P", "M", "P", "P", "P", "P", "R"]} +{"id": "1547", "title": "New projection-type methods for monotone LCP with finite termination", "abstract": "In this paper we establish two new projection-type methods for the solution of the monotone linear complementarity problem (LCP). The methods are a combination of the extragradient method and the Newton method, in which the active set strategy is used and only one linear system of equations with lower dimension is solved at each iteration. It is shown that under the assumption of monotonicity, these two methods are globally and linearly convergent. Furthermore, under a nondegeneracy condition they have a finite termination property. Finally, the methods are extended to solving the monotone affine variational inequality problem", "keyphrases": ["projection-type methods", "monotone LCP", "finite termination", "monotone linear complementarity problem", "extragradient method", "Newton method", "active set strategy", "linear system of equations", "iteration", "monotonicity", "convergence", "nondegeneracy condition", "monotone affine variational inequality problem", "matrix", "vector"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U", "U"]} +{"id": "1835", "title": "Establishing an urban digital cadastre: analytical reconstruction of parcel boundaries", "abstract": "A new method for generating a spatially accurate, legally supportive and operationally efficient cadastral database of the urban cadastral reality is described. 
The definition and compilation of an accurate cadastral database (achieving a standard deviation smaller than 0.1 m) is based on an analytical reconstruction of cadastral boundaries rather than on the conventional field reconstruction process. The new method is based on GPS control points and traverse networks for providing the framework; the old field books for defining the links between the various original ground features; and a geometrical and cadastral adjustment process as the conceptual basis. A pilot project that was carried out in order to examine and evaluate the new method is described", "keyphrases": ["urban digital cadastre", "analytical reconstruction", "parcel boundaries", "spatially accurate cadastral database", "urban cadastral reality", "standard deviation", "field reconstruction process", "GPS control points", "traverse networks", "old field books", "ground features", "cadastral adjustment process", "land information systems", "LIS", "geographic information systems"], "prmu": ["P", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "U", "U", "U"]} +{"id": "1870", "title": "Robust control of nonlinear systems with parametric uncertainty", "abstract": "Probabilistic robustness analysis and synthesis for nonlinear systems with uncertain parameters are presented. Monte Carlo simulation is used to estimate the likelihood of system instability and violation of performance requirements subject to variations of the probabilistic system parameters. Stochastic robust control synthesis searches the controller design parameter space to minimize a cost that is a function of the probabilities that design criteria will not be satisfied. The robust control design approach is illustrated by a simple nonlinear example. 
A modified feedback linearization control is chosen as controller structure, and the design parameters are searched by a genetic algorithm to achieve the tradeoff between stability and performance robustness", "keyphrases": ["robust control", "nonlinear systems", "parametric uncertainty", "probabilistic robustness analysis", "probabilistic robustness synthesis", "uncertain parameters", "Monte Carlo simulation", "system instability", "performance requirements violation", "stochastic control synthesis", "modified feedback linearization control", "genetic algorithm", "input-to-state stability"], "prmu": ["P", "P", "P", "P", "R", "P", "P", "P", "R", "R", "P", "P", "M"]} +{"id": "173", "title": "Stock market trading rule discovery using technical charting heuristics", "abstract": "In this case study in knowledge engineering and data mining, we implement a recognizer for two variations of the 'bull flag' technical charting heuristic and use this recognizer to discover trading rules on the NYSE Composite Index. Out-of-sample results indicate that these rules are effective", "keyphrases": ["stock market trading", "rule discovery", "technical charting heuristics", "financial expert system", "case study", "knowledge engineering", "data mining", "NYSE Composite Index", "out-of-sample results"], "prmu": ["P", "P", "P", "U", "P", "P", "P", "P", "P"]} +{"id": "1526", "title": "GK-DEVS: Geometric and kinematic DEVS formalism for simulation modeling of 3-dimensional multi-component systems", "abstract": "A combined discrete/continuous simulation methodology based on the DEVS (discrete event system specification) formalism is presented in this paper that satisfies the simulation requirements of 3-dimensional and dynamic systems with multi-components. We propose a geometric and kinematic DEVS (GK-DEVS) formalism that is able to describe the geometric and kinematic structure of a system and its continuous state dynamics as well as the interaction among the multi-components. 
To establish one model having dynamic behavior and a particular hierarchical structure, the atomic and the coupled model of the conventional DEVS are merged into one model in the proposed formalism. For simulation of the continuous motion of 3-D components, the sequential state set is partitioned into the discrete and the continuous state set and the rate of change function over the continuous state set is employed. Although modified from the conventional DEVS formalism, the GK-DEVS formalism preserves a hierarchical, modular modeling fashion and a coupling scheme. Furthermore, for the GK-DEVS model simulation, we propose an abstract simulation algorithm, called a GK-Simulator, in which data and control are separated and events are scheduled not globally but hierarchically so that an object-oriented principle is satisfied. The proposed GK-DEVS formalism and the GK-Simulator algorithm have been applied to the simulation of a flexible manufacturing system consisting of a 2-axis lathe, a 3-axis milling machine, and a vehicle-mounted robot", "keyphrases": ["GK-DEVS", "kinematic DEVS", "geometric DEVS", "simulation modeling", "3 dimensional multi-component systems", "combined discrete/continuous simulation methodology", "simulation requirements", "continuous state dynamics", "dynamic behavior", "continuous motion", "sequential state set", "abstract simulation algorithm", "GK-Simulator", "object-oriented principle", "flexible manufacturing system", "2-axis lathe", "3-axis milling machine", "vehicle-mounted robot"], "prmu": ["P", "P", "R", "P", "M", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1563", "title": "A distance between elliptical distributions based in an embedding into the Siegel group", "abstract": "This paper describes two different embeddings of the manifolds corresponding to many elliptical probability distributions with the informative geometry into the manifold of positive-definite matrices with the Siegel metric, 
generalizing a result published previously elsewhere. These new general embeddings are applicable to a wide class of elliptical probability distributions, in which the normal, t-Student and Cauchy are specific examples. A lower bound for the Rao distance is obtained, which is itself a distance, and, through these embeddings, a number of statistical tests of hypothesis are derived", "keyphrases": ["elliptical distributions", "Siegel group", "manifolds embeddings", "informative geometry", "positive-definite matrices", "elliptical probability distributions", "lower bound"], "prmu": ["P", "P", "R", "P", "P", "P", "P"]} +{"id": "1627", "title": "Blind identification of non-stationary MA systems", "abstract": "A new adaptive algorithm for blind identification of time-varying MA channels is derived. This algorithm proposes the use of a novel system of equations derived by combining the third- and fourth-order statistics of the output signals of MA models. This overdetermined system of equations has the important property that it can be solved adaptively because of their symmetries via an overdetermined recursive instrumental variable-type algorithm. This algorithm shows good behaviour in arbitrary noisy environments and good performance in tracking time-varying systems", "keyphrases": ["blind identification", "time-varying channels", "nonstationary systems", "adaptive algorithm", "fourth-order statistics", "third-order statistics", "MA models", "overdetermined recursive algorithm", "recursive instrumental variable algorithm", "arbitrary noisy environments", "tracking", "iterative algorithms", "additive Gaussian noise", "higher-order statistics"], "prmu": ["P", "R", "M", "P", "P", "M", "P", "R", "M", "P", "P", "M", "U", "M"]} +{"id": "1811", "title": "Adaptive tracking controller design for robotic systems using Gaussian wavelet networks", "abstract": "An adaptive tracking control design for robotic systems using Gaussian wavelet networks is proposed. 
A Gaussian wavelet network with accurate approximation capability is employed to approximate the unknown dynamics of robotic systems by using an adaptive learning algorithm that can learn the parameters of the dilation and translation of Gaussian wavelet functions. Depending on the finite number of wavelet basis functions which result in inevitable approximation errors, a robust control law is provided to guarantee the stability of the closed-loop robotic system that can be proved by Lyapunov theory. Finally, the effectiveness of the Gaussian wavelet network-based control approach is illustrated through comparative simulations on a six-link robot manipulator", "keyphrases": ["adaptive tracking controller design", "robotic systems", "Gaussian wavelet networks", "accurate approximation capability", "unknown dynamics", "adaptive learning algorithm", "approximation errors", "robust control law", "closed-loop system", "Lyapunov theory", "six-link robot manipulator"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P"]} +{"id": "1483", "title": "Hypothesis-based concept assignment in software maintenance", "abstract": "Software maintenance accounts for a significant proportion of the lifetime cost of a software system. Software comprehension is required in many parts of the maintenance process and is one of the most expensive activities. Many tools have been developed to help the maintainer reduce the time and cost of this task, but of the numerous tools and methods available one group has received relatively little attention: those using plausible reasoning to address the concept assignment problem. We present a concept assignment method for COBOL II: hypothesis-based concept assignment (HB-CA). An implementation of a prototype tool is described, and results from a comprehensive evaluation using commercial COBOL II sources are summarised. 
In particular, we identify areas of a standard maintenance process where such methods would be appropriate, and discuss the potential cost savings that may result", "keyphrases": ["hypothesis-based concept assignment", "software maintenance", "lifetime cost", "COBOL II", "scalability"], "prmu": ["P", "P", "P", "P", "U"]} +{"id": "1854", "title": "Software Technology: looking for quality accountants", "abstract": "Software Technology wants to turn 23 years of reselling experience in the legal business into an asset in the accounting market", "keyphrases": ["Software Technology", "reselling", "accounting market"], "prmu": ["P", "P", "P"]} +{"id": "1782", "title": "Exploring the sabbatical or other leave as a means of energizing a career", "abstract": "This article challenges librarians to create leaves that will not only inspire professional growth but also renewal. It presents a framework for developing a successful leave, incorporating useful advice from librarians at Concordia University (Montreal). As food for thought, the article offers examples of specific options meant to encourage professionals to explore their own creative ideas. Finally, a central theme of this article is that a midlife leave provides one with the perfect opportunity to take stock of oneself in order to define future career directions. Midlife is a time when rebel forces, feisty protestors from within, often insist on being heard. It is a time, in other words, when professionals often long to break loose from the stress \"to do far more, in less time\" (Barner, 1994). Escaping from current job constraints into a world of creative endeavor, when well-executed, is a superb means of invigorating a career stuck in gear and discovering a fresh perspective from which to view one's profession. 
To ignite renewal, midcareer is the perfect time to grant one's imagination free rein", "keyphrases": ["sabbatical leave", "career", "librarians", "professional growth", "library staff", "midlife leave"], "prmu": ["R", "P", "P", "P", "U", "P"]}
+{"id": "1894", "title": "Switching controller design via convex polyhedral Lyapunov functions", "abstract": "We propose a systematic switching control design method for a class of nonlinear discrete time hybrid systems. The novelty of the adopted approach is in the fact that unlike conventional control the control burden is shifted to a logical level thus creating the need for the development of new analysis/design methods", "keyphrases": ["switching controller design", "convex polyhedral Lyapunov functions", "nonlinear discrete time hybrid systems", "systematic design method"], "prmu": ["P", "P", "P", "R"]}
+{"id": "1869", "title": "Stability and L/sub 2/ gain properties of LPV systems", "abstract": "Stability and L/sub 2/ gain properties of linear parameter-varying systems are obtained under assumed bounds on either the maximum or average value of the parameter rate", "keyphrases": ["stability", "L/sub 2/ gain properties", "linear parameter-varying systems", "parameter rate", "Gronwall-Bellman inequality", "gain scheduled control"], "prmu": ["P", "P", "P", "P", "U", "M"]}
+{"id": "1742", "title": "A sufficient condition for optimality in nondifferentiable invex programming", "abstract": "A sufficient optimality condition is established for a nonlinear programming problem without differentiability assumption on the data wherein Clarke's (1975) generalized gradient is used to define invexity", "keyphrases": ["nondifferentiable invex programming", "sufficient optimality condition", "nonlinear programming problem", "generalized gradient", "invexity", "locally Lipschitz function", "semiconvex function"], "prmu": ["P", "P", "P", "P", "P", "U", "U"]}
+{"id": "1707", "title": "Tactical airborne reconnaissance goes dual-band and 
beyond", "abstract": "Multispectral imaging technologies are satisfying the need for a \"persistent\" look at the battlefield. We highlight the need to persistently monitor a battlefield to determine exactly who and what is there. For example, infrared imaging can be used to expose the fuel status of an aircraft on the runway. A daytime, visible-spectrum image of the same aircraft would offer information about external details, such as the plane's markings and paint scheme. A dual-band camera enables precision image registration by fusion and frequently yields more information than is possible by evaluating the images separately", "keyphrases": ["tactical airborne reconnaissance", "multispectral imaging technologies", "battlefield", "infrared imaging", "fuel status", "aircraft", "daytime visible-spectrum image", "dual-band camera", "precision image registration", "sensor fusion"], "prmu": ["P", "P", "P", "P", "P", "P", "R", "P", "P", "M"]} +{"id": "1740", "title": "Verification of ideological classifications-a statistical approach", "abstract": "The paper presents a statistical method of verifying ideological classifications of votes. Parliamentary votes, preclassified by an expert (on a chosen subset), are verified at an assumed significance level by seeking the most likely match with the actual vote results. Classifications that do not meet the requirements defined are rejected. 
The results obtained can be applied in the ideological dimensioning algorithms, enabling ideological identification of dimensions obtained", "keyphrases": ["ideological classifications", "statistical approach", "parliamentary votes", "significance level", "ideological dimensioning algorithms", "ideological space", "bootstrap"], "prmu": ["P", "P", "P", "P", "P", "M", "U"]} +{"id": "1705", "title": "The use of visual search for knowledge gathering in image decision support", "abstract": "This paper presents a new method of knowledge gathering for decision support in image understanding based on information extracted from the dynamics of saccadic eye movements. The framework involves the construction of a generic image feature extraction library, from which the feature extractors that are most relevant to the visual assessment by domain experts are determined automatically through factor analysis. The dynamics of the visual search are analyzed by using the Markov model for providing training information to novices on how and where to look for image features. The validity of the framework has been evaluated in a clinical scenario whereby the pulmonary vascular distribution on Computed Tomography images was assessed by experienced radiologists as a potential indicator of heart failure. The performance of the system has been demonstrated by training four novices to follow the visual assessment behavior of two experienced observers. 
In all cases, the accuracy of the students improved from near random decision making (33%) to accuracies ranging from 50% to 68%", "keyphrases": ["pulmonary vascular distribution", "experienced radiologists", "heart failure indicator", "visual assessment behavior", "experienced observers", "student accuracy", "Markov model", "training information", "image features", "domain experts", "saccadic eye movements dynamics", "near random decision making", "medical diagnostic imaging"], "prmu": ["P", "P", "R", "P", "P", "R", "P", "P", "P", "P", "R", "P", "M"]} +{"id": "1896", "title": "The dynamics of a railway freight wagon wheelset with dry friction damping", "abstract": "We investigate the dynamics of a simple model of a wheelset that supports one end of a railway freight wagon by springs with linear characteristics and dry friction dampers. The wagon runs on an ideal, straight and level track with constant speed. The lateral dynamics in dependence on the speed is examined. We have included stick-slip and hysteresis in our model of the dry friction and assume that Coulomb's law holds during the slip phase. It is found that the action of dry friction completely changes the bifurcation diagram, and that the longitudinal component of the dry friction damping forces destabilizes the wagon", "keyphrases": ["dynamics", "railway freight wagon wheelset", "dry friction damping", "linear characteristics", "lateral dynamics", "stick-slip", "hysteresis", "Coulomb law", "bifurcation diagram", "longitudinal component"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R", "P", "P"]} +{"id": "1519", "title": "Structural invariance of spatial Pythagorean hodographs", "abstract": "The structural invariance of the four-polynomial characterization for three-dimensional Pythagorean hodographs introduced by Dietz et al. (1993), under arbitrary spatial rotations, is demonstrated. 
The proof relies on a factored-quaternion representation for Pythagorean hodographs in three-dimensional Euclidean space-a particular instance of the \"PH representation map\" proposed by Choi et al. (2002)-and the unit quaternion description of spatial rotations. This approach furnishes a remarkably simple derivation for the polynomials u(t), upsilon (t), p(t), q(t) that specify the canonical form of a rotated Pythagorean hodograph, in terms of the original polynomials u(t), upsilon (t), p(t), q(t) and the angle theta and axis n of the spatial rotation. The preservation of the canonical form of PH space curves under arbitrary spatial rotations is essential to their incorporation into computer-aided design and manufacturing applications, such as the contour machining of free-form surfaces using a ball-end mill and realtime PH curve CNC interpolators", "keyphrases": ["structural invariance", "four-polynomial characterization", "spatial Pythagorean hodographs", "3D Pythagorean hodographs", "arbitrary spatial rotations", "factored quaternion representation", "3D Euclidean space", "PH representation map", "unit quaternion description", "spatial rotations", "CAD/CAM", "contour machining", "free-form surfaces", "ball-end mill", "real-time PH curve CNC interpolators"], "prmu": ["P", "P", "P", "M", "P", "M", "M", "P", "P", "P", "U", "P", "P", "P", "M"]} +{"id": "1618", "title": "Optimal learning for patterns classification in RBF networks", "abstract": "The proposed modifying of the structure of the radial basis function (RBF) network by introducing the weight matrix to the input layer (in contrast to the direct connection of the input to the hidden layer of a conventional RBF) so that the training space in the RBF network is adaptively separated by the resultant decision boundaries and class regions is reported. The training of this weight matrix is carried out as for a single-layer perceptron together with the clustering process. 
In this way the network is capable of dealing with complicated problems, which have a high degree of interference in the training data, and achieves a higher classification rate over the current classifiers using RBF", "keyphrases": ["pattern classification", "optimal learning", "RBF networks", "radial basis function network", "weight matrix training", "input layer", "training space", "decision boundaries", "class regions", "single-layer perceptron", "clustering process", "classification rate improvement"], "prmu": ["P", "P", "P", "R", "R", "P", "P", "P", "P", "P", "P", "M"]} +{"id": "1625", "title": "Use of fuzzy weighted autocorrelation function for pitch extraction from noisy speech", "abstract": "An investigation is presented into the feasibility of incorporating a fuzzy weighting scheme into the calculation of an autocorrelation function for pitch extraction. Simulation results reveal that the proposed method provides better robustness against background noise than the conventional approaches for extracting pitch period in a noisy environment", "keyphrases": ["pitch extraction", "noisy speech", "fuzzy weighting scheme", "autocorrelation function", "simulation results", "background noise", "speech analysis-synthesis system", "average magnitude difference function", "cepstrum method"], "prmu": ["P", "P", "P", "P", "P", "P", "M", "M", "M"]} +{"id": "1660", "title": "A regularized conjugate gradient method for symmetric positive definite system of linear equations", "abstract": "A class of regularized conjugate gradient methods is presented for solving the large sparse system of linear equations of which the coefficient matrix is an ill-conditioned symmetric positive definite matrix. The convergence properties of these methods are discussed in depth, and the best possible choices of the parameters involved in the new methods are investigated in detail. 
Numerical computations show that the new methods are more efficient and robust than both classical relaxation methods and classical conjugate direction methods", "keyphrases": ["regularized conjugate gradient method", "symmetric positive definite system", "linear equations", "large sparse system", "coefficient matrix", "convergence properties", "classical relaxation methods", "classical conjugate direction methods", "ill-conditioned linear system"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "R"]} +{"id": "171", "title": "Education, training and development policies and practices in medium-sized companies in the UK: do they really influence firm performance?", "abstract": "This paper sets out to examine the relationship between training and firm performance in middle-sized UK companies. It recognises that there is evidence that \"high performance work practices\" appear to be associated with better performance in large US companies, but argues that this relationship is less likely to be present in middle-sized companies. The paper's key contribution is to justify the wider concept of education, training and development (ETD) as applicable to such companies. It then finds that clusters of some ETD variables do appear to be associated with better middle-sized company performance", "keyphrases": ["medium-sized UK companies", "training", "firm performance", "education", "development policies", "high performance work practices", "ETD variable clusters", "human resources"], "prmu": ["R", "P", "P", "P", "P", "P", "R", "U"]} +{"id": "1524", "title": "Organizational design, information transfer, and the acquisition of rent-producing resources", "abstract": "Within the resource-based view of the firm, a dynamic story has emerged in which the knowledge accumulated over the history of a firm and embedded in organizational routines and structures influences the firm's ability to recognize the value of new resources and capabilities. 
This paper explores the possibility of firms to select organizational designs that increase the likelihood that they will recognize and value rent-producing resources and capabilities. A computational model is developed to study the tension between an organization's desire to explore its environment for new capabilities and the organization's need to exploit existing capabilities. Support is provided for the proposition that integration, both externally and internally, is an important source of dynamic capability. The model provides greater insight into the tradeoffs between these two forms of integration and suggests when one form may be preferred over another. In particular, evidence is provided that in uncertain environments, the ability to explore possible alternatives is critical while in more certain environments, the ability to transfer information internally is paramount", "keyphrases": ["organizational design", "information transfer", "rent-producing resources", "computational model", "uncertain environments", "probability", "certain environments", "social networks", "business strategy", "investments"], "prmu": ["P", "P", "P", "P", "P", "U", "P", "U", "U", "U"]} +{"id": "1561", "title": "Self-validating integration and approximation of piecewise analytic functions", "abstract": "Let an analytic or a piecewise analytic function on a compact interval be given. We present algorithms that produce enclosures for the integral or the function itself. Under certain conditions on the representation of the function, this is done with the minimal order of numbers of operations. 
The integration algorithm is implemented and numerical comparisons to non-validating integration software are presented", "keyphrases": ["self-validating integration", "self-validating approximation", "compact interval", "enclosures", "minimal order", "integration algorithm", "complex interval arithmetic", "piecewise analytic functions"], "prmu": ["P", "R", "P", "P", "P", "P", "M", "P"]} +{"id": "1780", "title": "Migrating to public librarianship: depart on time to ensure a smooth flight", "abstract": "Career change can be a difficult, time-consuming, and anxiety-laden process for anyone contemplating this important decision. The challenges faced by librarians considering the move from academic to public librarianship can be equally and significantly demanding. To most outsiders, at least on the surface, it may appear to be a quick and easy transition to make, but some professional librarians recognize the distinct differences between these areas of librarianship. Although the ubiquitous nature of technology has brought the various work responsibilities of academic and public librarians closer together during the last decade, there remain key differences in job-related duties and the work environments. These dissimilarities pose meaningful hurdles to leap for academic librarians wishing to migrate to the public sector. The paper considers the variations between academic and public librarianship", "keyphrases": ["public librarianship", "career change", "academic library", "public library", "professional librarians", "library technology", "work responsibilities", "job-related duties", "work environments"], "prmu": ["P", "P", "M", "M", "P", "M", "P", "P", "P"]} +{"id": "1738", "title": "Nurture the geek in you [accounting on the Internet]", "abstract": "When chartered accountants focus on IT, it's not simply because we think technology is neat. We keep on top of tech trends and issues because it helps us do our jobs well. 
We need to know how to best manage and implement the wealth of technology systems within our client base or employer, as well as to determine on an ongoing basis how evolving technologies might affect business strategies, threats and opportunities. One way to stay current with technology is by monitoring the online drumbeat. Imagine the Internet as an endless conversation of millions of chattering voices, each focusing on a multitude of topics and issues. It's not surprising that a great deal of the information relates to technology itself, and if you learn how to tune in to the drumbeat, you can keep yourself informed", "keyphrases": ["chartered accountants", "Internet", "information technology", "Slashdot", "Techdirt", "The Register", "Dan Gillmor's Wournal", "Daypop Top 40", "RISKS", "SecurityFocus", "TechWeb"], "prmu": ["P", "P", "R", "U", "U", "M", "M", "M", "U", "U", "U"]}
+{"id": "1813", "title": "LMI approach to digital redesign of linear time-invariant systems", "abstract": "A simple design methodology for the digital redesign of static state feedback controllers by using linear matrix inequalities is presented. The proposed method provides close matching of the states between the original continuous-time system and those of the digitally redesigned system with a guaranteed stability. Specifically, the digital redesign problem is reformulated as linear matrix inequalities (LMIs) and solved by a numerical optimisation technique. The main feature of the proposed method is that the closed-loop stability of the digitally redesigned system is explicitly guaranteed within the design procedure using the LMI-based approach. 
A numerical example of the position control of a simple crane system is presented", "keyphrases": ["LMI approach", "digital redesign", "linear time-invariant systems", "design methodology", "linear matrix inequalities", "continuous-time system", "guaranteed stability", "numerical optimisation technique", "closed-loop stability", "position control", "crane system"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1481", "title": "Impact of aviation highway-in-the-sky displays on pilot situation awareness", "abstract": "Thirty-six pilots (31 men, 5 women) were tested in a flight simulator on their ability to intercept a pathway depicted on a highway-in-the-sky (HITS) display. While intercepting and flying the pathway, pilots were required to watch for traffic outside the cockpit. Additionally, pilots were tested on their awareness of speed, altitude, and heading during the flight. Results indicated that the presence of a flight guidance cue significantly improved flight path awareness while intercepting the pathway, but significant practice effects suggest that a guidance cue might be unnecessary if pilots are given proper training. The amount of time spent looking outside the cockpit while using the HITS display was significantly less than when using conventional aircraft instruments. Additionally, awareness of flight information present on the HITS display was poor. Actual or potential applications of this research include guidance for the development of perspective flight display standards and as a basis for flight training requirements", "keyphrases": ["flight simulator", "pilots", "highway-in-the-sky display", "cockpit", "flight guidance", "human factors", "situation awareness", "flight path awareness", "aircraft display"], "prmu": ["P", "P", "P", "P", "P", "U", "P", "P", "R"]} +{"id": "1856", "title": "Tax forms: CD or not CD?", "abstract": "The move from CD to the Web looks unstoppable. 
Besides counting how many thousands of electronic tax forms they offer, vendors are rapidly moving those documents to the Web", "keyphrases": ["electronic tax forms", "Web", "ATX Forms Zillion Forms", "CCH Perform Plus H", "Kleinrock Forms Library Plus", "Nelco LaserLibrarian II", "RIA eForm", "STF Services Superform", "Universal Tax Systems Forms Complete"], "prmu": ["P", "P", "M", "U", "M", "U", "U", "U", "M"]} +{"id": "155", "title": "Fuzzy non-homogeneous Markov systems", "abstract": "In this paper the theory of fuzzy logic and fuzzy reasoning is combined with the theory of Markov systems and the concept of a fuzzy non-homogeneous Markov system is introduced for the first time. This is an effort to deal with the uncertainty introduced in the estimation of the transition probabilities and the input probabilities in Markov systems. The asymptotic behaviour of the fuzzy Markov system and its asymptotic variability is considered and given in closed analytic form. Moreover, the asymptotically attainable structures of the system are estimated also in a closed analytic form under some realistic assumptions. The importance of this result lies in the fact that in most cases the traditional methods for estimating the probabilities can not be used due to lack of data and measurement errors. 
The introduction of fuzzy logic into Markov systems represents a powerful tool for taking advantage of the symbolic knowledge that the experts of the systems possess", "keyphrases": ["fuzzy nonhomogeneous Markov systems", "fuzzy logic", "fuzzy reasoning", "uncertainty", "transition probabilities", "input probabilities", "asymptotic variability", "measurement errors", "symbolic knowledge", "probability theory"], "prmu": ["M", "P", "P", "P", "P", "P", "P", "P", "P", "R"]} +{"id": "1500", "title": "DAML+OIL: an ontology language for the Semantic Web", "abstract": "By all measures, the Web is enormous and growing at a staggering rate, which has made it increasingly difficult-and important-for both people and programs to have quick and accurate access to Web information and services. The Semantic Web offers a solution, capturing and exploiting the meaning of terms to transform the Web from a platform that focuses on presenting information, to a platform that focuses on understanding and reasoning with information. To support Semantic Web development, the US Defense Advanced Research Projects Agency launched the DARPA Agent Markup Language (DAML) initiative to fund research in languages, tools, infrastructure, and applications that make Web content more accessible and understandable. Although the US government funds DAML, several organizations-including US and European businesses and universities, and international consortia such as the World Wide Web Consortium-have contributed to work on issues related to DAML's development and deployment. We focus on DAML's current markup language, DAML+OIL, which is a proposed starting point for the W3C's Semantic Web Activity's Ontology Web Language (OWL). 
We introduce DAML+OIL syntax and usage through a set of examples, drawn from a wine knowledge base used to teach novices how to build ontologies", "keyphrases": ["Semantic Web", "DARPA Agent Markup Language", "DAML+OIL", "Ontology Web Language", "syntax", "wine knowledge base"], "prmu": ["P", "P", "P", "P", "P", "P"]} +{"id": "1545", "title": "Pontryagin maximum principle of optimal control governed by fluid dynamic systems with two point boundary state constraint", "abstract": "We study the optimal control problem subject to the semilinear equation with a state constraint. We prove certain theorems and give examples of state constraints so that the maximum principle holds. The main difficulty of the problem is to make the sensitivity analysis of the state with respect to the control caused by the unboundedness and nonlinearity of an operator", "keyphrases": ["Pontryagin maximum principle", "optimal control", "fluid dynamics", "semilinear equation", "state constraints"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1601", "title": "Solving the multiple competitive facilities location problem", "abstract": "In this paper we propose five heuristic procedures for the solution of the multiple competitive facilities location problem. A franchise of several facilities is to be located in a trade area where competing facilities already exist. The objective is to maximize the market share captured by the franchise as a whole. 
We perform extensive computational tests and conclude that a two-step heuristic procedure combining simulated annealing and an ascent algorithm provides the best solutions", "keyphrases": ["multiple competitive facilities location problem", "heuristic procedures", "facilities franchise", "market share maximization", "computational tests", "two-step heuristic procedure", "simulated annealing", "ascent algorithm"], "prmu": ["P", "P", "R", "R", "P", "P", "P", "P"]} +{"id": "1644", "title": "An experimental evaluation of comprehensibility aspects of knowledge structures derived through induction techniques: a case study of industrial fault diagnosis", "abstract": "Machine induction has been extensively used in order to develop knowledge bases for decision support systems and predictive systems. The extent to which developers and domain experts can comprehend these knowledge structures and gain useful insights into the basis of decision making has become a challenging research issue. This article examines the knowledge structures generated by the C4.5 induction technique in a fault diagnostic task and proposes to use a model of human learning in order to guide the process of making comprehensive the results of machine induction. The model of learning is used to generate hierarchical representations of diagnostic knowledge by adjusting the level of abstraction and varying the goal structures between 'shallow' and 'deep' ones. Comprehensibility is assessed in a global way in an experimental comparison where subjects are required to acquire the knowledge structures and transfer to new tasks. 
This method of addressing the issue of comprehensibility appears promising especially for machine induction techniques that are rather inflexible with regard to the number and sorts of interventions allowed to system developers", "keyphrases": ["experimental evaluation", "knowledge structure comprehensibility aspects", "induction techniques", "case study", "industrial fault diagnosis", "knowledge bases", "decision support systems", "predictive systems", "C4.5 induction technique", "industrial plants", "human learning model", "diagnostic knowledge representations"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "P", "M", "R", "R"]} +{"id": "1837", "title": "A review of methodologies used in research on cadastral development", "abstract": "World-wide, much attention has been given to cadastral development. As a consequence of experiences made during recent decades, several authors have stated the need for research in the domain of cadastre and proposed methodologies to be used. The paper contributes to the acceptance of research methodologies needed for cadastral development, and thereby enhances theory in the cadastral domain. The paper reviews nine publications on cadastre and identifies the methodologies used. The review focuses on the institutional, social, political and economic aspects of cadastral development, rather than on the technical aspects. The main conclusion is that the methodologies used are largely those of the social sciences. That agrees with the notion that cadastre relates as much to people and institutions, as it relates to land, and that cadastral systems are shaped by social, political and economic conditions, as well as technology. 
Since the geodetic survey profession has been the keeper of the cadastre, geodetic surveyors will have to deal ever more with social science matters, a fact that universities will have to consider", "keyphrases": ["cadastral development methodologies", "cadastre", "research methodologies", "political aspects", "economic aspects", "social sciences", "economic conditions", "geodetic survey profession", "geodetic surveyors", "land registration", "case study"], "prmu": ["R", "P", "P", "R", "P", "P", "P", "P", "P", "M", "U"]} +{"id": "1872", "title": "TPTP, CASC and the development of a semantically guided theorem prover", "abstract": "The first-order theorem prover SCOTT has been through a series of versions over some ten years. The successive provers, while retaining the same underlying technology, have used radically different algorithms and shown wide differences of behaviour. The development process has depended heavily on experiments with problems from the TPTP library and has been sharpened by participation in CASC each year since 1997. We outline some of the difficulties inherent in designing and refining a theorem prover as complex as SCOTT, and explain our experimental methodology. While SCOTT is not one of the systems which have been highly optimised for CASC, it does help to illustrate the influence of both CASC and the TPTP library on contemporary theorem proving research", "keyphrases": ["TPTP library", "Semantically Constrained Otter", "proof searches", "CASC", "semantically guided theorem prover", "first-order theorem prover", "SCOTT", "experimental methodology"], "prmu": ["P", "M", "U", "P", "P", "P", "P", "P"]} +{"id": "1759", "title": "On the p-adic Birch, Swinnerton-Dyer Conjecture for non-semistable reduction", "abstract": "In this paper, we examine the Iwasawa theory of elliptic curves E with additive reduction at an odd prime p. 
By extending Perrin-Riou's theory to certain nonsemistable representations, we are able to convert Kato's zeta-elements into p-adic L-functions. This allows us to deduce the cotorsion of the Selmer group over the cyclotomic Z/sub p/-extension of Q, and thus prove an inequality in the p-adic Birch and Swinnerton-Dyer conjecture at primes p whose square divides the conductor of E", "keyphrases": ["p-adic Birch", "Swinnerton-Dyer conjecture", "nonsemistable reduction", "Iwasawa theory", "elliptic curves", "additive reduction", "Perrin-Riou's theory", "p-adic L-functions", "cotorsion", "Selmer group", "cyclotomic Z/sub p/-extension"], "prmu": ["P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P"]}
Instead of the (number of) objects collected in a specified membrane, as the result of a computation we consider the itineraries of a certain object through membranes, during a halting computation, written as a coding of the string of labels of the visited membranes. The family of languages generated in this way is investigated with respect to its place in the Chomsky hierarchy. When the (symport and antiport) rules are applied in a conditional manner, promoted or inhibited by certain objects which should be present in the membrane where a rule is applied, then a characterization of recursively enumerable languages is obtained; the power of systems with the rules applied freely is only partially described", "keyphrases": ["P systems", "object communication", "object traces", "antiport rules", "symport rules", "itineraries", "halting computation", "label string coding", "languages", "Chomsky hierarchy", "recursively enumerable languages"], "prmu": ["P", "R", "R", "P", "R", "P", "P", "R", "P", "P", "P"]} +{"id": "1498", "title": "John McCarthy: father of AI", "abstract": "If John McCarthy, the father of AI, were to coin a new phrase for \"artificial intelligence\" today, he would probably use \"computational intelligence.\" McCarthy is not just the father of AI, he is also the inventor of the Lisp (list processing) language. 
The author considers McCarthy's conception of Lisp and discusses McCarthy's recent research that involves elaboration tolerance, creativity by machines, free will of machines, and some improved ways of doing situation calculus", "keyphrases": ["John McCarthy", "father of AI", "artificial intelligence", "computational intelligence", "Lisp", "list processing", "elaboration tolerance", "creativity", "free will", "situation calculus"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1764", "title": "Two-scale curved element method for elliptic problems with small periodic coefficients", "abstract": "This paper is concerned with the second order elliptic problems with small periodic coefficients on a bounded domain with a curved boundary. A two-scale curved element method which couples linear elements and isoparametric elements is proposed. The error estimate is obtained over the given smooth domain. Furthermore an additive Schwarz method is provided for the isoparametric element method", "keyphrases": ["two-scale curved element method", "elliptic problems", "small periodic coefficients", "second order elliptic problems", "bounded domain", "curved boundary", "linear elements", "isoparametric elements", "error estimate", "additive Schwarz method", "isoparametric element method"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1721", "title": "Dueling platforms [healthcare network servers]", "abstract": "Many large hospitals and healthcare systems have grown accustomed to the reliability of mainframe architecture, although tighter operating budgets, coupled with advances in client/server technology, have led to more office and clinical applications being moved off mainframes. But Evanston Northwestern Healthcare wasn't ready to get rid of its IBM OS 390 mainframe just yet. 
While a number of new clinical applications are being installed on two brand new IBM servers, Evanston Northwestern Healthcare will retain its favored hospital billing system and let it reside on the organization's mainframe, as it has since 1982", "keyphrases": ["network servers", "Evanston Northwestern Healthcare", "IBM OS 390 mainframe", "Leapfrog Group", "computerized physician order entry system"], "prmu": ["P", "P", "P", "U", "M"]} +{"id": "1917", "title": "Design and modeling of an interval-based ABR flow control protocol", "abstract": "A novel flow control protocol is presented for availability bit rate (ABR) service in asynchronous transfer mode (ATM) networks. This scheme features periodic explicit rate feedback that enables precise allocation of link bandwidth and buffer space on a hop-by-hop basis to guarantee maximum throughput, minimum cell loss, and high resource efficiency. With the inclusion of resource management cell synchronization and consolidation algorithms, this protocol is capable of controlling point-to-multipoint ABR services within a unified framework. The authors illustrate the modeling of single ABR connection, the interaction between multiple ABR connections, and the constraints applicable to flow control decisions. A loss-free flow control mechanism is presented for high-speed ABR connections using a fluid traffic model. 
Supporting algorithms and ATM signaling procedures are specified, in company with linear system modeling, numerical analysis, and simulation results, which demonstrate its performance and cost benefits in high-speed backbone networking scenarios", "keyphrases": ["interval-based ABR flow control protocol", "modeling", "design", "availability bit rate service", "ATM networks", "periodic explicit rate feedback", "link bandwidth allocation", "buffer space allocation", "maximum throughput", "minimum cell loss", "high resource efficiency", "resource management cell synchronization algorithms", "resource management cell consolidation algorithms", "point-to-multipoint services", "flow control decisions", "loss-free flow control mechanism", "high-speed ABR connections", "fluid traffic model", "signaling", "linear system modeling", "numerical analysis", "simulation", "high-speed backbone networking scenarios"], "prmu": ["P", "P", "P", "R", "R", "P", "R", "R", "P", "P", "P", "R", "R", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1679", "title": "Project scheduling under time dependent costs-a branch and bound algorithm", "abstract": "In a given project network, execution of each activity in normal duration requires utilization of certain resources. If faster execution of an activity is desired then additional resources at extra cost would be required. Given a project network, the cost structure for each activity and a planning horizon, the project compression problem is concerned with the determination of optimal schedule of performing each activity while satisfying given restrictions and minimizing the total cost of project execution. The paper considers the project compression problem with time dependent cost structure for each activity. The planning horizon is divided into several regular time intervals over which the cost structure of an activity may vary. But the cost structure of the activities remains the same within a time interval. 
The objective is to find an optimal project schedule minimizing the total project cost. We present a mathematical model for this problem, develop some heuristics and an exact branch and bound algorithm. Using simulated problems we provide an insight into the computational performances of heuristics and the branch and bound algorithm", "keyphrases": ["project scheduling", "time dependent costs", "branch and bound algorithm", "project network", "planning horizon", "project compression problem", "optimal schedule", "heuristics"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1684", "title": "E-learning on the college campus: a help or hindrance to students learning objectives: a case study", "abstract": "If you know how to surf the World Wide Web, have used email before, and can learn how to send an email attachment, then learning how to interact in an online course should not be difficult at all. In a way to find out, I decided to offer two identical courses, one of which would be offered online and the other the \"traditional way\". I wanted to see how students would fare with identical material provided in each course. I wanted their anonymous feedback, when the course was over", "keyphrases": ["distance education", "William Paterson University", "e-learning"], "prmu": ["U", "U", "P"]} +{"id": "168", "title": "Nurturing clients' trust to encourage engagement success during the customization of ERP systems", "abstract": "Customization is a crucial, lengthy, and costly aspect in the successful implementation of ERP systems, and has, accordingly, become a major specialty of many vendors and consulting companies. The study examines how such companies can increase their clients' perception of engagement success through increased client trust that is brought about through responsive and dependable customization. 
Survey data from ERP customization clients show that, as hypothesized, clients' trust influenced their perception of engagement success with the company. The data also show that clients' trust in the customization company was increased when the company behaved in accordance with client expectations by being responsive, and decreased when the company behaved in a manner that contradicted these expectations by not being dependable. Responses to an open-ended question addendum attached to the survey corroborated the importance of responsiveness and dependability. Implications for customization companies and research on trust are discussed", "keyphrases": ["client trust", "engagement success", "customization", "ERP systems", "enterprise resource planning systems", "vendors", "consulting companies", "perceived responsiveness", "MRP II implementation", "integrity", "benevolence", "dependability"], "prmu": ["P", "P", "P", "P", "M", "P", "P", "M", "M", "U", "U", "P"]} +{"id": "1578", "title": "Records role in e-business", "abstract": "Records management standards are now playing a key role in e-business strategy", "keyphrases": ["e-business strategy", "records management"], "prmu": ["P", "P"]} +{"id": "1829", "title": "Improved approximation of Max-Cut on graphs of bounded degree", "abstract": "Let alpha approximately=0.87856 denote the best approximation ratio currently known for the Max-Cut problem on general graphs. We consider a semidefinite relaxation of the Max-Cut problem, round it using the random hyperplane rounding technique of M.X. Goemans and D.P. Williamson (1995), and then add a local improvement step. We show that for graphs of degree at most Delta , our algorithm achieves an approximation ratio of at least alpha + epsilon , where epsilon >0 is a constant that depends only on Delta .. 
Using computer assisted analysis, we show that for graphs of maximal degree 3 our algorithm obtains an approximation ratio of at least 0.921, and for 3-regular graphs the approximation ratio is at least 0.924. We note that for the semidefinite relaxation of Max-Cut used by Goemans and Williamson the integrality gap is at least 1/0.885, even for 2-regular graphs", "keyphrases": ["Max-Cut approximation", "semidefinite relaxation", "approximation ratio", "computer assisted analysis", "2-regular graphs", "bounded degree graph", "best approximation ratio"], "prmu": ["R", "P", "P", "P", "P", "R", "P"]} +{"id": "1702", "title": "Reconstruction of time-varying 3-D left-ventricular shape from multiview X-ray cineangiocardiograms", "abstract": "This paper reports on the clinical application of a system for recovering the time-varying three-dimensional (3-D) left-ventricular (LV) shape from multiview X-ray cineangiocardiograms. Considering that X-ray cineangiocardiography is still commonly employed in clinical cardiology and computational costs for 3-D recovery and visualization are rapidly decreasing, it is meaningful to develop a clinically applicable system for 3-D LV shape recovery from X-ray cineangiocardiograms. The system is based on a previously reported closed-surface method of shape recovery from two-dimensional occluding contours with multiple views. To apply the method to \"real\" LV cineangiocardiograms, user-interactive systems were implemented for preprocessing, including detection of LV contours, calibration of the imaging geometry, and setting of the LV model coordinate system. The results for three real LV angiographic image sequences are presented, two with fixed multiple views (using supplementary angiography) and one with rotating views. 3-D reconstructions utilizing different numbers of views were compared and evaluated in terms of contours manually traced by an experienced radiologist. 
The performance of the preprocesses was also evaluated, and the effects of variations in user-specified parameters on the final 3-D reconstruction results were shown to be sufficiently small. These experimental results demonstrate the potential usefulness of combining multiple views for 3-D recovery from \"real\" LV cineangiocardiograms", "keyphrases": ["medical diagnostic imaging", "time-varying 3-D left-ventricular shape reconstruction", "multiview X-ray cineangiocardiograms", "clinical cardiology", "two-dimensional occluding contours", "arterial septal defect", "B-spline", "computational costs", "user-interactive systems", "angiographic image sequences", "fixed multiple views", "experienced radiologist", "user-specified parameters variations"], "prmu": ["M", "R", "P", "P", "P", "U", "U", "P", "P", "P", "P", "P", "R"]} +{"id": "1747", "title": "On a general constitutive description for the inelastic and failure behavior of fibrous laminates. II. Laminate theory and applications", "abstract": "For pt. I see ibid., pp. 1159-76. The two papers report systematically a constitutive description for the inelastic and strength behavior of laminated composites reinforced with various fiber preforms. The constitutive relationship is established micromechanically, through layer-by-layer analysis. Namely, only the properties of the constituent fiber and matrix materials of the composites are required as input data. In the previous part lamina theory was presented. Three fundamental quantities of the laminae, i.e. the internal stresses generated in the constituent fiber and matrix materials and the instantaneous compliance matrix, with different fiber preform (including woven, braided, and knitted fabric) reinforcements were explicitly obtained by virtue of the bridging micromechanics model. In this paper, the laminate stress analysis is shown. 
The purpose of this analysis is to determine the load shared by each lamina in the laminate, so that the lamina theory can be applied. Incorporation of the constitutive equations into an FEM software package is illustrated. A number of application examples are given to demonstrate the efficiency of the constitutive theory. The predictions made include: failure envelopes of multidirectional laminates subjected to biaxial in-plane loads, thermomechanical cycling stress-strain curves of a titanium metal matrix composite laminate, S-N curves of multilayer knitted fabric reinforced laminates under tensile fatigue, and bending load-deflection plots and ultimate bending strengths of laminated braided fabric reinforced beams subjected to lateral loads", "keyphrases": ["general constitutive description", "inelastic behavior", "failure behavior", "fibrous laminates", "laminate theory", "strength behavior", "composites", "fiber preforms", "micromechanics", "layer-by-layer analysis", "internal stresses", "matrix materials", "instantaneous compliance matrix", "stress analysis", "load", "FEM software package", "failure envelopes", "multidirectional laminates", "biaxial in-plane loads", "thermomechanical cycling stress-strain curves", "titanium metal matrix composite laminate", "S-N curves", "multilayer knitted fabric reinforced laminates", "tensile fatigue", "bending load deflection plots", "ultimate bending strengths", "laminated braided fabric reinforced beams", "lateral loads"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "P", "P", "P"]} +{"id": "1891", "title": "On trajectory and force tracking control of constrained mobile manipulators with parameter uncertainty", "abstract": "Studies the trajectory and force tracking control problem of mobile manipulators subject to holonomic and nonholonomic constraints with unknown inertia parameters. 
Adaptive controllers are proposed based on a suitable reduced dynamic model, the defined reference signals and the mixed tracking errors. The proposed controllers not only ensure that the entire state of the system asymptotically converges to the desired trajectory but also ensure that the constraint force asymptotically converges to the desired force. A detailed numerical example is presented to illustrate the developed methods", "keyphrases": ["trajectory control", "force tracking control", "constrained mobile manipulators", "parameter uncertainty", "holonomic constraints", "nonholonomic constraints", "adaptive controllers", "reduced dynamic model", "mixed tracking errors", "asymptotic convergence", "position control", "mobile robots"], "prmu": ["R", "P", "P", "P", "R", "P", "P", "P", "P", "P", "M", "M"]}
Finally, simulations show that the control scheme is effective", "keyphrases": ["adaptive decentralized stabilization", "closed-loop systems", "uncertain dynamic systems", "robust control", "large scale systems"], "prmu": ["P", "P", "P", "M", "M"]} +{"id": "1909", "title": "Breast MR imaging with high spectral and spatial resolutions: preliminary experience", "abstract": "The authors evaluated magnetic resonance (MR) imaging with high spectral and spatial resolutions (HSSR) of water and fat in breasts of healthy volunteers (n=6) and women with suspicious lesions (n=6). Fat suppression, edge delineation, and image texture were improved on MR images derived from HSSR data compared with those on conventional MR images. HSSR MR imaging data acquired before and after contrast medium injection showed spectrally inhomogeneous changes in the water resonances in small voxels that were not detectable with conventional MR imaging", "keyphrases": ["breast magnetic resonance imaging", "high spectral spatial resolutions", "healthy volunteers", "edge delineation", "image texture", "magnetic resonance images", "magnetic resonance imaging data", "contrast medium injection", "water resonances", "small voxels", "women", "suspicious lesions", "fat suppression"], "prmu": ["R", "R", "P", "P", "P", "R", "R", "P", "P", "P", "P", "P", "P"]} +{"id": "1667", "title": "Combining constraint programming and linear programming on an example of bus driver scheduling", "abstract": "Provides details of a successful application where the column generation algorithm was used to combine constraint programming and linear programming. In the past, constraint programming and linear programming were considered to be two competing technologies that solved similar types of problems. Both these technologies had their strengths and weaknesses. The paper shows that the two technologies can be combined together to extract the strengths of both these technologies. 
Details of a real-world application to optimize bus driver duties are given. This system was developed by ILOG for a major software house in Japan using ILOG-Solver and ILOG-CPLEX, constraint programming and linear programming C/C++ libraries", "keyphrases": ["constraint programming", "linear programming", "bus driver scheduling", "column generation algorithm", "ILOG", "ILOG-Solver", "ILOG-CPLEX", "C/C++ libraries"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1622", "title": "Error resilient intra refresh scheme for H.26L stream", "abstract": "Recently much attention has been focused on video streaming through IP-based networks. An error resilient RD intra macro-block refresh scheme for H.26L Internet video streaming is introduced. Various channel simulations have proved that this scheme is more effective than those currently adopted in H.26L", "keyphrases": ["H.26L video streaming", "Internet", "IP-based networks", "error resilient scheme", "intra macro-block refresh scheme", "channel simulations", "RD intra refresh scheme", "video communication", "RDerr scheme", "RDall scheme"], "prmu": ["R", "P", "P", "R", "P", "P", "R", "M", "M", "M"]} +{"id": "176", "title": "Knowledge model reuse: therapy decision through specialisation of a generic decision model", "abstract": "We present the definition of the therapy decision task and its associated Heuristic Multi-Attribute (HM) solving method, in the form of a KADS-style specification. The goal of the therapy decision task is to identify the ideal therapy, for a given patient, in accordance with a set of objectives of a diverse nature constituting a global therapy-evaluation framework in which considerations such as patient preferences and quality-of-life results are integrated. We give a high-level overview of this task as a specialisation of the generic decision task, and additional decomposition methods for the subtasks involved. 
These subtasks possess some reflective capabilities for reasoning about self-models, particularly the learning subtask, which incrementally corrects and refines the model used to assess the effects of the therapies. This work illustrates the process of reuse in the framework of AI software development methodologies such as KADS-CommonKADS in order to obtain new (more specialised but still generic) components for the analysis libraries developed in this context. In order to maximise reuse benefits, where possible, the therapy decision task and HM method have been defined in terms of regular components from the earlier-mentioned libraries. To emphasise the importance of using a rigorous approach to the modelling of domain and method ontologies, we make extensive use of the semi-formal object-oriented analysis notation UML, together with its associated constraint language OCL, to illustrate the ontology of the decision method and the corresponding specific one of the therapy decision domain, the latter being a refinement via inheritance of the former", "keyphrases": ["knowledge model reuse", "therapy decision task", "KADS-style specification", "global therapy-evaluation framework", "patient preferences", "reasoning", "learning subtask", "software development methodologies", "CommonKADS", "ontologies", "object-oriented analysis notation", "UML", "constraint language", "OCL", "generic decision model specialisation", "Heuristic Multi-Attribute solving method"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "U", "P", "P", "P", "P", "P", "R", "R"]} +{"id": "1566", "title": "A numerical C/sup 1/-shadowing result for retarded functional differential equations", "abstract": "This paper gives a numerical C/sup 1/-shadowing between the exact solutions of a functional differential equation and its numerical approximations. The shadowing result is obtained by comparing exact solutions with numerical approximation which do not share the same initial value. 
Behavior of stable manifolds of functional differential equations under numerics will follow from the shadowing result", "keyphrases": ["numerical C/sup 1/-shadowing", "exact solutions", "numerical approximations", "stable manifolds", "retarded functional differential equations"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1523", "title": "Process specialization: defining specialization for state diagrams", "abstract": "A precise definition of specialization and inheritance promises to be as useful in organizational process modeling as it is in object modeling. It would help us better understand, maintain, reuse, and generate process models. However, even though object-oriented analysis and design methodologies take full advantage of the object specialization hierarchy, the process specialization hierarchy is not supported in major process representations, such as the state diagram, data flow diagram, and UML representations. Partly underlying this lack of support is an implicit assumption that we can always specialize a process by treating it as \"just another object.\" We argue in this paper that this is not so straightforward as it might seem; we argue that a process-specific approach must be developed. We propose such an approach in the form of a set of transformations which, when applied to a process description, always result in specialization. We illustrate this approach by applying it to the state diagram representation and demonstrate that this approach to process specialization is not only theoretically possible, but shows promise as a method for categorizing and analyzing processes. 
We point out apparent inconsistencies between our notion of process specialization and existing work on object specialization but show that these inconsistencies are superficial and that the definition we provide is compatible with the traditional notion of specialization", "keyphrases": ["process specialization", "state diagrams", "inheritance", "organizational process modeling", "object-oriented analysis", "object specialization hierarchy", "process representation", "object-oriented design"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R"]} +{"id": "1787", "title": "The theory of information reversal", "abstract": "The end of the industrial age coincides with the advent of the information society as the next model of social and economic organization, which brings about significant changes in the way modern man conceives work and the social environment. The functional basis of the new model is pivoted upon the effort to formulate the theory on the violent reversal of the basic relationship between man and information, and isolate it as one of the components for the creation of the new electronic reality. The objective of the theory of reversal is to effectively contribute to the formulation of a new definition consideration in regards to the concept of the emerging information society. In order to empirically apply the theory of reversal, we examine a case study based on the example of the digital library", "keyphrases": ["information reversal theory", "information society", "industrial age", "social organization", "economic organization", "case study", "digital library", "information systems"], "prmu": ["R", "P", "P", "R", "P", "P", "P", "M"]} +{"id": "1851", "title": "Supporting global user profiles through trusted authorities", "abstract": "Personalization generally refers to making a Web site more responsive to the unique and individual needs of each user. 
We argue that for personalization to work effectively, detailed and interoperable user profiles should be globally available for authorized sites, and these profiles should dynamically reflect changes in user interests. Creating user profiles from user click-stream data seems to be an effective way of generating detailed and dynamic user profiles. However, a user profile generated in this way is available only on the computer where the user accesses his browser, and is inaccessible when the same user works on a different computer. On the other hand, integration of the Internet with telecommunication networks has made it possible for the users to connect to the Web with a variety of mobile devices as well as desktops. This requires that user profiles should be available to any desktop or mobile device on the Internet that users choose to work with. In this paper, we address these problems through the concept of \"trusted authority\". A user agent at the client side that captures the user click stream, dynamically generates a navigational history 'log' file in Extensible Markup Language (XML). This log file is then used to produce 'user profiles' in a resource description framework (RDF). A user's right to privacy is provided through the Platform for Privacy Preferences (P3P) standard. User profiles are uploaded to the trusted authority and served next time the user connects to the Web", "keyphrases": ["global user profiles", "trusted authorities", "personalization", "Web site", "Internet", "telecommunication networks", "mobile device", "user agent", "user click stream", "navigational history log file", "XML", "resource description framework", "privacy", "Platform for Privacy Preferences standard", "namespace qualifier", "globally unique user ID/password identification"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "P", "R", "U", "M"]} +{"id": "1814", "title": "Control of integral processes with dead-time. 2. 
Quantitative analysis", "abstract": "For part 1, see ibid., p.285-90, (2002). Several different control schemes for integral processes with dead time resulted in the same disturbance response. It has already been shown that such a response is subideal. Hence, it is necessary to quantitatively analyse the achievable specifications and the robust stability regions. The control parameter can be quantitatively determined with a compromise between the disturbance response and the robustness. Four specifications: (normalised) maximum dynamic error, maximum decay rate, (normalised) control action bound and approximate recovery time are used to characterise the step-disturbance response. It is shown that any attempt to obtain a (normalised) dynamic error less than tau /sub m/ is impossible and a sufficient condition on the (relative) gain-uncertainty bound is square root (3)/2", "keyphrases": ["integral processes", "dead-time", "quantitative analysis", "disturbance response", "robust stability regions", "robustness", "maximum dynamic error", "maximum decay rate", "control action bound", "approximate recovery time", "step-disturbance response", "sufficient condition", "gain-uncertainty bound"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1486", "title": "Hand-held digital video-camera for eye examination and follow-up", "abstract": "We developed a hand-held digital colour video-camera for eye examination in primary care. The device weighed 550 g. It featured a charge-coupled device (CCD) and corrective optics. Both colour video and digital still images could be taken. The video-camera was connected to a PC with software for database storage, image processing and telecommunication. We studied 88 normal subjects (38 male, 50 female), aged 7-62 years. It was not necessary to use mydriatic eye drops for pupillary dilation. Satisfactory digital images of the whole face and the anterior eye were obtained. 
The optic disc and the central part of the ocular fundus could also be recorded. Image quality of the face and the anterior eye was excellent; image quality of the optic disc and macula was good enough for tele-ophthalmology. Further studies are needed to evaluate the usefulness of the equipment in different clinical conditions", "keyphrases": ["hand-held digital colour video camera", "eye examination", "primary care", "charge-coupled device", "corrective optics", "digital still images", "colour video images", "PC", "software", "database storage", "image processing", "telecommunication", "normal subjects", "whole face", "anterior eye", "optic disc", "ocular fundus", "image quality", "tele-ophthalmology", "clinical conditions", "follow-up"], "prmu": ["M", "P", "P", "P", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "152", "title": "Linear tense logics of increasing sets", "abstract": "We provide an extension of the language of linear tense logic with future and past connectives F and P, respectively, by a modality that quantifies over the points of some set which is assumed to increase in the course of time. In this way we obtain a general framework for modelling growth qualitatively. We develop an appropriate logical system, prove a corresponding completeness and decidability result and discuss the various kinds of flow of time in the new context. We also consider decreasing sets briefly", "keyphrases": ["linear tense logic", "future and past connectives", "logical system", "completeness", "decidability", "decreasing sets", "temporal reasoning"], "prmu": ["P", "P", "P", "P", "P", "P", "U"]} +{"id": "1542", "title": "The open-source HCS project", "abstract": "Despite the rumors, the HCS II project is not dead. In fact, HCS has been licensed and is now an open-source project. In this article, the author brings us up to speed on the HCS II project's past, present, and future. 
The HCS II is an expandable, standalone, network-based (RS-485), intelligent-node, industrial-oriented supervisory control (SC) system intended for demanding home control applications. The HCS incorporates direct and remote digital inputs and outputs, direct and remote analog inputs and outputs, real time or Boolean decision event triggering, X10 transmission and reception, infrared remote control transmission and reception, remote LCDs, and a master console. Its program is compiled on a PC with the XPRESS compiler and then downloaded to the SC where it runs independently of the PC", "keyphrases": ["HCS II", "supervisory control system", "home control", "network-based"], "prmu": ["P", "R", "P", "P"]} +{"id": "1507", "title": "Ethnography, customers, and negotiated interactions at the airport", "abstract": "In the late 1990s, tightly coordinated airline schedules unraveled owing to massive delays resulting from inclement weather, overbooked flights, and airline operational difficulties. As schedules slipped, the delayed departures and late arrivals led to systemwide breakdowns, customers missed their connections, and airline work activities fell further out of sync. In offering possible answers, we emphasize the need to consider the customer as participant, following the human-centered computing model. Our study applied ethnographic methods to understand the airline system domain and the nature of airline delays, and it revealed the deficiencies of the airline production system model of operations. The research insights that led us to shift from a production and marketing system perspective to a customer-as-participant view might appear obvious to some readers. However, we do not know of any airline that designs its operations and technologies around any other model than the production and marketing system view. 
Our human-centered analysis used ethnographic methods to gather information, offering new insight into airline delays and suggesting effective ways to improve operations reliability", "keyphrases": ["human-centered computing model", "customer trajectories", "airports", "employees", "ethnography", "negotiated interactions", "airline delays", "airline production system operations model", "customer-as-participant view", "operations reliability"], "prmu": ["P", "M", "P", "U", "P", "P", "P", "R", "P", "P"]} +{"id": "1643", "title": "Effectiveness of user testing and heuristic evaluation as a function of performance classification", "abstract": "For different levels of user performance, different types of information are processed and users will make different types of errors. Based on the error's immediate cause and the information being processed, usability problems can be classified into three categories. They are usability problems associated with skill-based, rule-based and knowledge-based levels of performance. In this paper, a user interface for a Web-based software program was evaluated with two usability evaluation methods, user testing and heuristic evaluation. The experiment discovered that the heuristic evaluation with human factor experts is more effective than user testing in identifying usability problems associated with skill-based and rule-based levels of performance. User testing is more effective than heuristic evaluation in finding usability problems associated with the knowledge-based level of performance. 
The practical application of this research is also discussed in the paper", "keyphrases": ["user testing", "heuristic evaluation", "performance classification", "user performance", "usability", "knowledge-based performance levels", "skill-based performance levels", "user interface", "Web-based software", "experiment", "human factors", "rule-based performance levels"], "prmu": ["P", "P", "P", "P", "P", "R", "R", "P", "P", "P", "P", "R"]} +{"id": "1606", "title": "Single machine earliness-tardiness scheduling with resource-dependent release dates", "abstract": "This paper deals with the single machine earliness and tardiness scheduling problem with a common due date and resource-dependent release dates. It is assumed that the cost of resource consumption of a job is a non-increasing linear function of the job release date, and this function is common for all jobs. The objective is to find a schedule and job release dates that minimize the total resource consumption, and earliness and tardiness penalties. It is shown that the problem is NP-hard in the ordinary sense even if the due date is unrestricted (the number of jobs that can be scheduled before the due date is unrestricted). An exact dynamic programming (DP) algorithm for small and medium size problems is developed. 
A heuristic algorithm for large-scale problems is also proposed and the results of a computational comparison between heuristic and optimal solutions are discussed", "keyphrases": ["single machine earliness-tardiness scheduling", "resource-dependent release dates", "common due date", "job resource consumption cost", "nonincreasing linear function", "job release date", "total resource consumption minimization", "NP-hard problem", "exact dynamic programming algorithm", "medium size problems", "small size problems", "heuristic algorithm", "large-scale problems", "polynomial time algorithm"], "prmu": ["P", "P", "P", "R", "M", "P", "R", "R", "R", "P", "R", "P", "P", "M"]} +{"id": "1875", "title": "The design and implementation of VAMPIRE", "abstract": "We describe VAMPIRE: a high-performance theorem prover for first-order logic. As our description is mostly targeted to the developers of such systems and specialists in automated reasoning, it focuses on the design of the system and some key implementation features. We also analyze the performance of the prover at CASC-JC", "keyphrases": ["VAMPIRE", "high-performance theorem prover", "first-order logic", "automated reasoning", "performance evaluation", "CASC-JC", "resolution theorem proving"], "prmu": ["P", "P", "P", "P", "M", "P", "M"]} +{"id": "1830", "title": "Approximation of pathwidth of outerplanar graphs", "abstract": "There exists a polynomial time algorithm to compute the pathwidth of outerplanar graphs, but the large exponent makes this algorithm impractical. In this paper, we give an algorithm that, given a biconnected outerplanar graph G, finds a path decomposition of G of pathwidth at most twice the pathwidth of G plus one. 
To obtain the result, several relations between the pathwidth of a biconnected outerplanar graph and its dual are established", "keyphrases": ["pathwidth approximation", "outerplanar graphs", "polynomial time algorithm", "biconnected outerplanar graph", "path decomposition"], "prmu": ["R", "P", "P", "P", "P"]} +{"id": "1888", "title": "L/sub 2/ model reduction and variance reduction", "abstract": "We examine certain variance properties of model reduction. The focus is on L/sub 2/ model reduction, but some general results are also presented. These general results can be used to analyze various other model reduction schemes. The models we study are finite impulse response (FIR) and output error (OE) models. We compare the variance of two estimated models. The first one is estimated directly from data and the other one is computed by reducing a high order model, by L/sub 2/ model reduction. In the FIR case we show that it is never better to estimate the model directly from data, compared to estimating it via L/sub 2/ model reduction of a high order FIR model. For OE models we show that the reduced model has the same variance as the directly estimated one if the reduced model class used contains the true system", "keyphrases": ["L/sub 2/ model reduction", "variance reduction", "finite impulse response models", "FIR models", "output error models", "identification"], "prmu": ["P", "P", "R", "P", "R", "U"]} +{"id": "1462", "title": "Non-linear analysis of nearly saturated porous media: theoretical and numerical formulation", "abstract": "A formulation for a porous medium saturated with a compressible fluid undergoing large elastic and plastic deformations is presented. A consistent thermodynamic formulation is proposed for the two-phase mixture problem; thus preserving a straightforward and robust numerical scheme. 
A novel feature is the specification of the fluid compressibility in terms of a volumetric logarithmic strain, which is energy conjugated to the fluid pressure in the entropy inequality. As a result, the entropy inequality is used to separate three different mechanisms representing the response: effective stress response according to Terzaghi in the solid skeleton, fluid pressure response to compressibility of the fluid, and dissipative Darcy flow representing the interaction between the two phases. The paper is concluded with a couple of numerical examples that display the predictive capabilities of the proposed formulation. In particular, we consider results for the kinematically linear theory as compared to the kinematically non-linear theory", "keyphrases": ["nearly saturated porous media", "nonlinear analysis", "compressible fluid", "large elastic deformations", "large plastic deformations", "consistent thermodynamic formulation", "two-phase mixture problem", "robust numerical scheme", "fluid compressibility", "volumetric logarithmic strain", "fluid pressure", "entropy inequality", "effective stress response", "solid skeleton", "fluid pressure response", "dissipative Darcy flow", "predictive capabilities", "kinematically linear theory", "kinematically nonlinear theory"], "prmu": ["P", "M", "P", "R", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M"]} +{"id": "1726", "title": "Two-layer model for the formation of states of the hidden Markov chains", "abstract": "Procedures for the formation of states of the hidden Markov models are described. Formant amplitudes and frequencies are used as state features. 
The training strategy is presented that allows one to calculate the parameters of conditional probabilities of the generation of a given formant set by a given hidden state with the help of the maximum likelihood method", "keyphrases": ["hidden Markov models", "formant amplitudes", "formant frequencies", "state features", "conditional probabilities", "hidden state", "maximum likelihood method"], "prmu": ["P", "P", "R", "P", "P", "P", "P"]} +{"id": "1763", "title": "Numerical studies of 2D free surface waves with fixed bottom", "abstract": "The motion of surface waves under the effect of bottom is a very interesting and challenging phenomenon in the nature. we use boundary integral method to compute and analyze this problem. In the linear analysis, the linearized equations have bounded error increase under some compatible conditions. This contributes to the cancellation of instable Kelvin-Helmholtz terms. Under the effect of bottom, the existence of equations is hard to determine, but given some limitations it proves true. These limitations are that the swing of interfaces should be small enough, and the distance between surface and bottom should be large enough. In order to maintain the stability of computation, some compatible relationship must be satisfied. In the numerical examples, the simulation of standing waves and breaking waves are calculated. And in the case of shallow bottom, we found that the behavior of waves are rather singular", "keyphrases": ["numerical studies", "2D free surface waves", "boundary integral method", "linear analysis", "linearized equations", "instable Kelvin-Helmholtz terms"], "prmu": ["P", "P", "P", "P", "P", "P"]} +{"id": "1848", "title": "Contracting in the days of ebusiness", "abstract": "Putting electronic business on a sound foundation-model theoretically as well as technologically-is a central challenge for research as well as commercial development. 
This paper concentrates on the discovery and negotiation phase of concluding an agreement based on a contract. We present a methodology for moving seamlessly from a many-to-many relationship in the discovery phase to a one-to-one relationship in the contract negotiation phase. Making the content of contracts persistent is achieved by reconstructing contract templates by means of mereologic (logic of the whole-part relation). Possibly nested sub-structures of the contract template are taken as a basis for negotiation in a dialogical way. For the negotiation itself the contract templates are extended by implications (logical) and sequences (topical)", "keyphrases": ["electronic business", "discovery phase", "contracting", "many-to-many relationship", "one-to-one relationship", "contract negotiation phase", "mereologic", "contract templates", "nested sub-structure", "sequences", "implications"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1910", "title": "Breast cancer: effectiveness of computer-aided diagnosis-observer study with independent database of mammograms", "abstract": "Evaluates the effectiveness of a computerized classification method as an aid to radiologists reviewing clinical mammograms for which the diagnoses were unknown to both the radiologists and the computer. Six mammographers and six community radiologists participated in an observer study. These 12 radiologists interpreted, with and without the computer aid, 110 cases that were unknown to both the 12 radiologist observers and the trained computer classification scheme. The radiologists' performances in differentiating between benign and malignant masses without and with the computer aid were evaluated with receiver operating characteristic (ROC) analysis. Two-tailed P values were calculated for the Student t test to indicate the statistical significance of the differences in performances with and without the computer aid. 
When the computer aid was used, the average performance of the 12 radiologists improved, as indicated by an increase in the area under the ROC curve (A/sub z/) from 0.93 to 0.96 (P<.001), by an increase in partial area under the ROC curve (/sub 0.90/A'/sub z/) from 0.56 to 0.72 (P<.001), and by an increase in sensitivity from 94% to 98% (P=.022). No statistically significant difference in specificity was found between readings with and those without computer aid ( Delta = -0.014; P=.46; 95% CI: -0.054, 0.026), where Delta is difference in specificity. When we analyzed results from the mammographers and community radiologists as separate groups, a larger improvement was demonstrated for the community radiologists. Computer-aided diagnosis can potentially help radiologists improve their diagnostic accuracy in the task of differentiating between benign and malignant masses seen on mammograms", "keyphrases": ["computerized classification method", "clinical mammograms", "observer study", "breast cancer", "computer-aided diagnosis", "independent database", "trained computer classification scheme", "radiologist observers", "benign masses", "malignant masses", "receiver operating characteristic analysis", "two-tailed P values", "Student t test", "statistical significance", "performances", "average performance", "receiver operating characteristic curve", "diagnostic accuracy", "computer aid", "mammographers", "community radiologists"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "R", "P", "P", "P", "P", "P", "R", "P", "P", "P", "P"]} +{"id": "192", "title": "New Jersey African American women writers and their publications: a study of identification from written and oral sources", "abstract": "This study examines the use of written sources, and personal interviews and informal conversations with individuals from New Jersey's religious, political, and educational community to identify African American women writers in New Jersey and their intellectual output. 
The focus on recognizing the community as an oral repository of history and then tapping these oral sources for collection development and acquisition purposes is supported by empirical and qualitative evidence. Findings indicate that written sources are so limited that information professionals must rely on oral sources to uncover local writers and their publications", "keyphrases": ["New Jersey African American women writers", "written sources", "personal interviews", "informal conversations", "intellectual output", "oral repository", "history", "collection development", "local writers", "special collections"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "M"]} +{"id": "1683", "title": "Unlocking the potential of videoconferencing", "abstract": "I propose in this paper to show, through a number of case studies, that videoconferencing is user-friendly, cost-effective, time-effective and life-enhancing for people of all ages and abilities and that it requires only a creative and imaginative approach to unlock its potential. I believe that these benefits need not, and should not, be restricted to the education sector. My examples will range from simple storytelling, through accessing international experts, professional development and distance learning in a variety of forms, to the use of videoconferencing for virtual meetings and planning sessions. In some cases, extracts from the reactions and responses of the participants will be included to illustrate the impact of the medium", "keyphrases": ["videoconferencing", "benefits", "case studies", "education"], "prmu": ["P", "P", "P", "P"]} +{"id": "1724", "title": "A winning combination [wireless health care]", "abstract": "Three years ago, the Institute of Medicine (IOM) reported that medical errors result in at least 44,000 deaths each year-more than deaths from highway accidents, breast cancer or AIDS. 
That report, and others which placed serious errors as high as 98,000 annually, served as a wake-up call for healthcare providers such as the CareGroup Healthcare System Inc., a Boston-area healthcare network that is the second largest integrated delivery system in the northeastern United States. With annual revenues of $1.2B, CareGroup provides primary care and specialty services to more than 1,000,000 patients. CareGroup combined wireless technology with the Web to create a provider order entry (POE) system designed to reduce the frequency of costly medical mistakes. The POE infrastructure includes InterSystems Corporation's CACHE database, Dell Computer C600 laptops and Cisco Systems' Aironet 350 wireless networks", "keyphrases": ["CareGroup Healthcare System", "healthcare network", "wireless", "medical errors", "provider order entry", "InterSystems Corporation CACHE database", "Cisco Systems Aironet 350 wireless networks", "Dell Computer C600 laptops"], "prmu": ["P", "P", "P", "P", "P", "R", "R", "P"]} +{"id": "1761", "title": "Superconvergence of discontinuous Galerkin method for nonstationary hyperbolic equation", "abstract": "For the first order nonstationary hyperbolic equation taking the piecewise linear discontinuous Galerkin solver, we prove that under the uniform rectangular partition, such a discontinuous solver, after postprocessing, can have two and half approximative order which is half order higher than the optimal estimate by P. Lesaint and P. 
Raviart (1974) under the rectangular partition", "keyphrases": ["superconvergence of discontinuous Galerkin method", "nonstationary hyperbolic equation", "piecewise linear discontinuous Galerkin solver", "rectangular partition", "approximative order"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1681", "title": "One and two facility network design revisited", "abstract": "The one facility one commodity network design problem (OFOC) with nonnegative flow costs considers the problem of sending d units of flow from a source to a destination where arc capacity is purchased in batches of C units. The two facility problem (TFOC) is similar, but capacity can be purchased either in batches of C units or one unit. Flow costs are zero. These problems are known to be NP-hard. We describe an exact O(n/sup 3/3/sup n/) algorithm for these problems based on the repeated use of a bipartite matching algorithm. We also present a better lower bound of Omega (n/sup 2k*/) for an earlier Omega (n/sup 2k/) algorithm described in the literature where k = [d/C] and k* = min{k, [(n - 2)/2]}. The matching algorithm is faster than this one for k >or= [(n - 2)/2]. Finally, we provide another reformulation of the problem that is quasi integral. 
This property could be useful in designing a modified version of the simplex method to solve the problem using a sequence of pivots with integer extreme solutions, referred to as the integral simplex method in the literature", "keyphrases": ["one facility one commodity network design problem", "two facility network design", "nonnegative flow costs", "flow costs", "NP-hard problems", "exact algorithm", "bipartite matching algorithm", "lower bound", "quasi integral", "pivots", "integral simplex method"], "prmu": ["P", "P", "P", "P", "R", "R", "P", "P", "P", "P", "P"]} +{"id": "1538", "title": "A heuristic approach to resource locations in broadband networks", "abstract": "In broadband networks, such as ATM, the importance of dynamic migration of data resources is increasing because of its potential to improve performance especially for transaction processing. In environments with migratory data resources, it is necessary to have mechanisms to manage the locations of each data resource. In this paper, we present an algorithm that makes use of system state information and heuristics to manage locations of data resources in a distributed network. In the proposed algorithm, each site maintains information about state of other sites with respect to each data resource of the system and uses it to find: (1) a subset of sites likely to have the requested data resource; and (2) the site where the data resource is to be migrated from the current site. The proposed algorithm enhances its effectiveness by continuously updating system state information stored at each site. It focuses on reducing the overall average time delay needed by the transaction requests to locate and access the migratory data resources. 
We evaluated the performance of the proposed algorithm and also compared it with one of the existing location management algorithms, by simulation studies under several system parameters such as the frequency of requests generation, frequency of data resource migrations, network topology and scale of network. The experimental results show the effectiveness of the proposed algorithm in all cases", "keyphrases": ["broadband networks", "ATM", "resource locations", "heuristics", "distributed network", "data resource migrations", "network topology"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} +{"id": "1912", "title": "A novel preterm respiratory mechanics active simulator to test the performances of neonatal pulmonary ventilators", "abstract": "A patient active simulator is proposed which is capable of reproducing values of the parameters of pulmonary mechanics of healthy newborns and preterm pathological infants. The implemented prototype is able to: (a) let the operator choose the respiratory pattern, times of apnea, episodes of cough, sobs, etc., (b) continuously regulate and control the parameters characterizing the pulmonary system; and, finally, (c) reproduce the attempt of breathing of a preterm infant. Taking into account both the limitation due to the chosen application field and the preliminary autocalibration phase automatically carried out by the proposed device, accuracy and reliability on the order of 1% is estimated. The previously indicated value has to be considered satisfactory in light of the field of application and the small values of the simulated parameters. 
Finally, the achieved metrological characteristics allow the described neonatal simulator to be adopted as a reference device to test performances of neonatal ventilators and, more specifically, to measure the time elapsed between the occurrence of a potentially dangerous condition to the patient and the activation of the corresponding alarm of the tested ventilator", "keyphrases": ["preterm respiratory mechanics active simulator", "neonatal pulmonary ventilators", "patient active simulator", "healthy newborns", "preterm pathological infants", "apnea times", "autocalibration phase", "accuracy", "reliability", "respiratory diseases", "ventilatory support", "intensive care equipment", "electronic unit", "pneumatic/mechanical unit", "software control", "double compartment model", "artificial trachea", "pressure transducer", "variable clamp resistance", "upper airway resistance", "compliance"], "prmu": ["P", "P", "P", "P", "P", "R", "P", "P", "P", "M", "U", "U", "U", "U", "M", "U", "U", "U", "U", "U", "U"]} +{"id": "190", "title": "On the design of gain-scheduled trajectory tracking controllers [AUV application]", "abstract": "A new methodology is proposed for the design of trajectory tracking controllers for autonomous vehicles. The design technique builds on gain scheduling control theory. An application is made to the design of a trajectory tracking controller for a prototype autonomous underwater vehicle (AUV). The effectiveness and advantages of the new control laws derived are illustrated in simulation using a full set of non-linear equations of motion of the vehicle", "keyphrases": ["gain-scheduled trajectory tracking controller design", "autonomous vehicles", "gain scheduling control theory", "autonomous underwater vehicle", "control laws", "nonlinear equations of motion"], "prmu": ["R", "P", "P", "P", "P", "M"]} +{"id": "1639", "title": "New hub gears up for algorithmic exchange", "abstract": "Warwick University in the UK is on the up and up. 
Sometimes considered a typical 1960s, middle-of-the-road redbrick institution-not known for their distinction-the 2001 UK Research Assessment Exercise (RAE) shows its research to be the fifth most highly-rated in the country, with outstanding standards in the sciences. This impressive performance has rightly given Warwick a certain amount of muscle, which it is flexing rather effectively, aided by a snappy approach to making things happen that leaves some older institutions standing. The result is a brand new Centre for Scientific Computing (CSC), launched within a couple of years of its initial conception", "keyphrases": ["Warwick University Centre for Scientific Computing"], "prmu": ["R"]} +{"id": "1641", "title": "Development through gaming", "abstract": "Mainstream observers commonly underestimate the role of fringe activities in propelling science and technology. Well-known examples are how wars have fostered innovation in areas such as communications, cryptography, medicine and aerospace; and how erotica has been a major factor in pioneering visual media, from the first printed books to photography, cinematography, videotape, or the latest online video streaming. The article aims to be a sampler of a less controversial, but still often underrated, symbiosis between scientific computing and computing for leisure and entertainment", "keyphrases": ["computer games", "scientific computing", "leisure", "entertainment", "graphics"], "prmu": ["R", "P", "P", "P", "U"]} +{"id": "1604", "title": "Improving supply-chain performance by sharing advance demand information", "abstract": "In this paper, we analyze how sharing advance demand information (ADI) can improve supply-chain performance. We consider two types of ADI, aggregated ADI (A-ADI) and detailed ADI (D-ADI). 
With A-ADI, customers share with manufacturers information about whether they will place an order for some product in the next time period, but do not share information about which product they will order and which of several potential manufacturers will receive the order. With D-ADI, customers additionally share information about which product they will order, but which manufacturer will receive the order remains uncertain. We develop and solve mathematical models of supply chains where ADI is shared. We derive exact expressions and closed-form approximations for expected costs, expected base-stock levels, and variations of the production quantities. We show that both the manufacturer and the customers benefit from sharing ADI, but that sharing ADI increases the bullwhip effect. We also show that under certain conditions it is optimal to collect ADI from either none or all of the customers. We study two supply chains in detail: a supply chain with an arbitrary number of products that have identical demand rates, and a supply chain with two products that have arbitrary demand rates. For these two supply chains, we analyze how the values of A-ADI and D-ADI depend on the characteristics of the supply chain and on the quality of the shared information, and we identify conditions under which sharing A-ADI and D-ADI can significantly reduce cost. 
Our results can be used by decision makers to analyze the cost savings that can be achieved by sharing ADI and help them to determine if sharing ADI is beneficial for their supply chains", "keyphrases": ["supply-chain performance improvement", "advance demand information", "aggregated ADI", "detailed ADI", "information sharing", "manufacturing", "mathematical models", "closed-form approximations", "expected costs", "expected base-stock levels", "production quantity variations", "bullwhip effect", "arbitrary product number", "identical demand rates", "arbitrary demand rates", "shared information quality", "decision makers", "cost savings", "forecasting"], "prmu": ["R", "P", "P", "P", "R", "P", "P", "P", "P", "P", "R", "P", "R", "P", "P", "R", "P", "P", "U"]} +{"id": "150", "title": "Model checking games for branching time logics", "abstract": "This paper defines and examines model checking games for the branching time temporal logic CTL*. The games employ a technique called focus which enriches sets by picking out one distinguished element. This is necessary to avoid ambiguities in the regeneration of temporal operators. The correctness of these games is proved, and optimizations are considered to obtain model checking games for important fragments of CTL*. A game based model checking algorithm that matches the known lower and upper complexity bounds is sketched", "keyphrases": ["model checking games", "branching time logics", "temporal logic", "temporal operators", "complexity bounds"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1540", "title": "Adaptive thinning for bivariate scattered data", "abstract": "This paper studies adaptive thinning strategies for approximating a large set of scattered data by piecewise linear functions over triangulated subsets. Our strategies depend on both the locations of the data points in the plane, and the values of the sampled function at these points - adaptive thinning. 
All our thinning strategies remove data points one by one, so as to minimize an estimate of the error that results by the removal of a point from the current set of points (this estimate is termed \"anticipated error\"). The thinning process generates subsets of \"most significant\" points, such that the piecewise linear interpolants over the Delaunay triangulations of these subsets approximate progressively the function values sampled at the original scattered points, and such that the approximation errors are small relative to the number of points in the subsets. We design various methods for computing the anticipated error at reasonable cost, and compare and test the performance of the methods. It is proved that for data sampled from a convex function, with the strategy of convex triangulation, the actual error is minimized by minimizing the best performing measure of anticipated error. It is also shown that for data sampled from certain quadratic polynomials, adaptive thinning is equivalent to thinning which depends only on the locations of the data points - nonadaptive thinning. Based on our numerical tests and comparisons, two practical adaptive thinning algorithms are proposed for thinning large data sets, one which is more accurate and another which is faster", "keyphrases": ["adaptive thinning", "scattered data", "piecewise linear functions", "triangulated subsets", "error", "Delaunay triangulations", "convex function"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} +{"id": "1505", "title": "Modeling and simulating practices, a work method for work systems design", "abstract": "Work systems involve people engaging in activities over time-not just with each other, but also with machines, tools, documents, and other artifacts. These activities often produce goods, services, or-as is the case in the work system described in this article-scientific data. Work systems and work practice evolve slowly over time. 
The integration and use of technology, the distribution and collocation of people, organizational roles and procedures, and the facilities where the work occurs largely determine this evolution", "keyphrases": ["work practice simulation", "work practice modeling", "work system design method", "complex system interactions", "human activity", "communication", "collaboration", "teamwork", "tool usage", "workspace usage", "problem solving", "learning behavior"], "prmu": ["R", "R", "R", "M", "M", "U", "U", "U", "M", "U", "U", "U"]} +{"id": "1877", "title": "Strong completeness of lattice-valued logic", "abstract": "This paper shows strong completeness of the system L for lattice valued logic given by S. Titani (1999), in which she formulates a lattice-valued set theory by introducing the logical implication which represents the order relation on the lattice. Syntax and semantics concerned are described and strong completeness is proved", "keyphrases": ["strong completeness", "lattice-valued set theory", "order relation", "semantics", "syntax", "lattice-valued logic"], "prmu": ["P", "P", "P", "P", "P", "P"]} +{"id": "1832", "title": "A linear time algorithm for recognizing regular Boolean functions", "abstract": "A positive (or monotone) Boolean function is regular if its variables are naturally ordered, left to right, by decreasing strength, so that shifting the nonzero component of any true vector to the left always yields another true vector. This paper considers the problem of recognizing whether a positive function f is regular, where f is given by min T(f) (the set of all minimal true vectors of f). We propose a simple linear time (i.e., O(n|min T(f)|)-time) algorithm for it. This improves upon the previous algorithm by J.S. Provan and M.O. Ball (1988) which requires O(n/sup 2/|min T(f)|) time. 
As a corollary, we also present an O(n(n+|min T(f)|))-time algorithm for the recognition problem of 2-monotonic functions", "keyphrases": ["linear time algorithm", "regular Boolean functions", "monotone Boolean function", "nonzero component", "true vector", "positive function", "2-monotonic functions"], "prmu": ["P", "P", "R", "P", "P", "P", "P"]} +{"id": "1719", "title": "The UPS as network management tool", "abstract": "Uninterrupted power supplies (UPS), or battery backup systems, once provided a relatively limited, although important, function-continual battery support to connected equipment in the event of a power failure. However, yesterday's \"battery in a box\" has evolved into a sophisticated network power management tool that can monitor and actively correct many of the problems that might plague a healthy network. This new breed of UPS system provides such features as automatic voltage regulation, generous runtimes and unattended system shutdown, and now also monitors and automatically restarts critical services and operating systems if they lock up or otherwise fail", "keyphrases": ["uninterrupted power supplies", "network power management", "unattended system shutdown", "automatic voltage regulation"], "prmu": ["P", "P", "P", "P"]} +{"id": "174", "title": "The BIOGENES system for knowledge-based bioprocess control", "abstract": "The application of knowledge-based control systems in the area of biotechnological processes has become increasingly popular over the past decade. This paper outlines the structure of the advanced knowledge-based part of the BIOGENES Copyright control system for the control of bioprocesses such as the fed-batch Saccharomyces cerevisiae cultivation. First, a brief overview of all the tasks implemented in the knowledge-based level including process data classification, qualitative process state identification and supervisory process control is given. 
The procedures performing the on-line identification of metabolic states and supervisory process control (setpoint calculation and control strategy selection) are described in more detail. Finally, the performance of the system is discussed using results obtained from a number of experimental cultivation runs in a laboratory unit", "keyphrases": ["BIOGENES system", "knowledge-based bioprocess control", "biotechnological processes", "fed-batch Saccharomyces cerevisiae cultivation", "process data classification", "qualitative process state identification", "supervisory process control", "online identification", "metabolic states", "experiment"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "M", "P", "U"]} +{"id": "1564", "title": "Asymptotic normality for the K/sub phi /-divergence goodness-of-fit tests", "abstract": "In this paper for a wide class of goodness-of-fit statistics based K/sub phi /-divergences, the asymptotic normality is established under the assumption n/m/sub n/ to a in (0, infinity ), where n denotes sample size and m/sub n/ the number of cells. This result is extended to contiguous alternatives to study asymptotic efficiency", "keyphrases": ["asymptotic normality", "asymptotic efficiency", "K/sub phi /-divergence goodness-of-fit tests"], "prmu": ["P", "P", "P"]} +{"id": "1521", "title": "Optimal multi-degree reduction of Bezier curves with constraints of endpoints continuity", "abstract": "Given a Bezier curve of degree n, the problem of optimal multi-degree reduction (degree reduction of more than one degree) by a Bezier curve of degree m (mor=0) orders can be preserved at two endpoints respectively. The method in the paper performs multi-degree reduction at one time and does not need stepwise computing. When applied to multi-degree reduction with endpoint continuity of any order, the MDR by L/sub 2/ obtains the best least squares approximation. 
Comparison with another method of multi-degree reduction (MDR by L/sub infinity /), which achieves the nearly best uniform approximation with respect to L/sub infinity / norm, is also given. The approximate effect of the MDR by L/sub 2/ is better than that of the MDR by L/sub infinity /. Explicit approximate error analysis of the multi-degree reduction methods is presented", "keyphrases": ["optimal multi-degree reduction", "Bezier curves", "endpoint continuity constraints", "approximate method", "explicit solution", "endpoint interpolation", "least squares approximation", "uniform approximation", "explicit approximate error analysis"], "prmu": ["P", "P", "R", "P", "P", "P", "P", "P", "P"]} +{"id": "1698", "title": "Exact frequency-domain reconstruction for thermoacoustic tomography. I. Planar geometry", "abstract": "We report an exact and fast Fourier-domain reconstruction algorithm for thermoacoustic tomography in a planar configuration assuming thermal confinement and constant acoustic speed. The effects of the finite size of the detector and the finite length of the excitation pulse are explicitly included in the reconstruction algorithm. The algorithm is numerically and experimentally verified. 
We also demonstrate that the blurring caused by the finite size of the detector surface is the primary limiting factor on the resolution and that it can be compensated for by deconvolution", "keyphrases": ["medical diagnostic imaging", "exact frequency-domain reconstruction", "planar configuration", "thermal confinement", "constant acoustic speed", "blurring", "finite detector surface size", "primary limiting factor", "deconvolution", "resolution limitation", "excitation pulse", "reconstruction algorithm", "thermoacoustic tomography", "planar geometry"], "prmu": ["U", "P", "P", "P", "P", "P", "R", "P", "P", "R", "P", "P", "P", "P"]} +{"id": "1665", "title": "How airlines and airports recover from schedule perturbations: a survey", "abstract": "The explosive growth in air traffic as well as the widespread adoption of Operations Research techniques in airline scheduling has given rise to tight flight schedules at major airports. An undesirable consequence of this is that a minor incident such as a delay in the arrival of a small number of flights can result in a chain reaction of events involving several flights and airports, causing disruption throughout the system. This paper reviews recent literature in the area of recovery from schedule disruptions. First we review how disturbances at a given airport could be handled, including the effects of runways and fixes. Then we study the papers on recovery from airline schedule perturbations, which involve adjustments in flight schedules, aircraft, and crew. The mathematical programming techniques used in ground holding are covered in some detail. 
We conclude the review with suggestions on how singular perturbation theory could play a role in analyzing disruptions to such highly sensitive schedules as those in the civil aviation industry", "keyphrases": ["air traffic management", "schedule perturbation", "operations research techniques", "airline scheduling", "tight flight schedules", "airports", "schedule disruptions", "recovery", "disturbance handling", "runways", "flight schedule adjustments", "aircraft adjustments", "crew adjustments", "mathematical programming techniques", "ground holding", "singular perturbation theory", "civil aviation industry"], "prmu": ["M", "P", "P", "P", "P", "P", "P", "P", "R", "P", "R", "R", "R", "P", "P", "P", "P"]} +{"id": "1620", "title": "Rapid Cauer filter design employing new filter model", "abstract": "The exact three-dimensional (3D) design of a coaxial Cauer filter employing a new filter model, a 3D field simulator and a circuit simulator, is demonstrated. Only a few iterations between the field simulator and the circuit simulator are necessary to meet a given specification", "keyphrases": ["Cauer filter", "filter design", "filter model", "3D design", "coaxial filter", "field simulator", "circuit simulator", "iterations", "bandpass filters"], "prmu": ["P", "P", "P", "R", "R", "P", "P", "P", "M"]} +{"id": "1599", "title": "Evaluating the best main battle tank using fuzzy decision theory with linguistic criteria evaluation", "abstract": "In this paper, experts' opinions are described in linguistic terms which can be expressed in trapezoidal (or triangular) fuzzy numbers. To make the consensus of the experts consistent, we utilize the fuzzy Delphi method to adjust the fuzzy rating of every expert to achieve the consensus condition. For the aggregate of many experts' opinions, we take the operation of fuzzy numbers to get the mean of fuzzy rating, x/sub ij/ and the mean of weight, w/sub .j/. 
In multi-alternatives and multi-attributes cases, the fuzzy decision matrix X=[x/sub ij/]/sub m*n/ is constructed by means of the fuzzy rating, x/sub ij/. Then, we can derive the aggregate fuzzy numbers by multiplying the fuzzy decision matrix with the corresponding fuzzy attribute weights. The final results become a problem of ranking fuzzy numbers. We also propose an easy procedure of using fuzzy numbers to rank aggregate fuzzy numbers A/sub i/. In this way, we can obtain the best selection for evaluating the system. For practical application, we propose an algorithm for evaluating the best main battle tank by fuzzy decision theory and comparing it with other methods", "keyphrases": ["battle tank evaluation", "fuzzy group decision making", "fuzzy decision theory", "linguistic criteria evaluation", "multiple criteria problems", "group decision making", "subjective-objective backgrounds", "trapezoidal fuzzy numbers", "triangular fuzzy numbers", "fuzzy Delphi method", "fuzzy rating", "consensus condition", "fuzzy number ranking", "fuzzy decision matrix", "aggregate fuzzy numbers", "fuzzy attribute weights"], "prmu": ["R", "M", "P", "P", "M", "M", "U", "R", "R", "P", "P", "P", "R", "P", "P", "P"]} +{"id": "189", "title": "Identification of linear parameter varying models", "abstract": "We consider identification of a certain class of discrete-time nonlinear systems known as linear parameter varying system. We assume that inputs, outputs and the scheduling parameters are directly measured, and a form of the functional dependence of the system coefficients on the parameters is known. We show how this identification problem can be reduced to a linear regression, and provide compact formulae for the corresponding least mean square and recursive least-squares algorithms. We derive conditions on persistency of excitation in terms of the inputs and scheduling parameter trajectories when the functional dependence is of polynomial type. 
These conditions have a natural polynomial interpolation interpretation, and do not require the scheduling parameter trajectories to vary slowly. This method is illustrated with a simulation example using two different parameter trajectories", "keyphrases": ["linear parameter varying models", "identification", "discrete-time nonlinear systems", "scheduling parameters", "functional dependence", "system coefficients", "linear regression", "least mean square algorithms", "recursive least-squares algorithms", "persistency of excitation conditions", "scheduling parameter trajectories", "polynomial interpolation interpretation", "parameter trajectories", "time-varying systems"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R", "P", "R", "P", "P", "P", "M"]} +{"id": "1778", "title": "HeLIN pilot mentoring scheme", "abstract": "The health care libraries unit coordinates, facilitates, and promotes continuing personal development for all staff in the Health Libraries and Information Network (HeLIN) of the Oxford Deanery (UK). It supports the development of a culture of lifelong learning and recognizes that CPD should help deliver organizational objectives, as well as enabling all staff to expand and fulfill their potential. A major emphasis for 2000 was to investigate ways of improving support for individual learning within the workplace. The group identified a need to build on existing informal support networks in order to provide additional learning opportunities and decided to investigate the feasibility of piloting a mentoring scheme. The objectives of the pilot were to increase understanding and knowledge of mentoring as a tool for CPD; to investigate existing mentoring schemes and their applicability for HeLIN; to develop a pilot mentoring scheme for HeLIN incorporating a program for accreditation of mentors; and to evaluate the scheme and disseminate the results. 
In order to identify current practice in this area, a literature review was carried out, and colleagues with an interest in or existing knowledge of mentoring schemes were contacted where possible. In the absence of clearly defined appraisal tools, all abstracts were read, and articles that met the following criteria were obtained and distributed to the group for review", "keyphrases": ["HeLIN pilot mentoring scheme", "health care libraries unit", "continuing personal development", "staff", "Health Libraries and Information Network", "lifelong learning", "informal support networks", "accreditation", "literature review", "midcareer librarians"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "U"]} +{"id": "1853", "title": "CherylAnn Silberer: all about process [accounting technologist]", "abstract": "Silberer's company, CompLete, is making a specialty of workflow process analysis", "keyphrases": ["CompLete", "workflow process analysis", "accounting technologist"], "prmu": ["P", "P", "P"]} +{"id": "1816", "title": "Hamiltonian modelling and nonlinear disturbance attenuation control of TCSC for improving power system stability", "abstract": "To tackle the obstacle of applying passivity-based control (PBC) to power systems, an affine non-linear system widely existing in power systems is formulated as a standard Hamiltonian system using a pre-feedback method. The port controlled Hamiltonian with dissipation (PCHD) model of a thyristor controlled serial compensator (TCSC) is then established corresponding with a revised Hamiltonian function. Furthermore, employing the modified Hamiltonian function directly as the storage function, a non-linear adaptive L/sub 2/ gain control method is proposed to solve the problem of L/sub 2/ gain disturbance attenuation for this Hamiltonian system with parametric perturbations. 
Finally, simulation results are presented to verify the validity of the proposed controller", "keyphrases": ["Hamiltonian modelling", "thyristor controlled serial compensator", "nonlinear disturbance attenuation control", "power system stability", "passivity-based control", "affine nonlinear system", "pre-feedback method", "port controlled Hamiltonian with dissipation model", "Hamiltonian function", "storage function", "nonlinear adaptive L/sub 2/ gain control method", "parametric perturbations"], "prmu": ["P", "P", "P", "P", "P", "R", "P", "R", "P", "P", "R", "P"]} +{"id": "1484", "title": "Portfolio optimization and the random magnet problem", "abstract": "Diversification of an investment into independently fluctuating assets reduces its risk. In reality, movements of assets are mutually correlated and therefore knowledge of cross-correlations among asset price movements are of great importance. Our results support the possibility that the problem of finding an investment in stocks which exposes invested funds to a minimum level of risk is analogous to the problem of finding the magnetization of a random magnet. The interactions for this \"random magnet problem\" are given by the cross-correlation matrix C of stock returns. 
We find that random matrix theory allows us to make an estimate for C which outperforms the standard estimate in terms of constructing an investment which carries a minimum level of risk", "keyphrases": ["portfolio optimization", "fluctuating assets", "cross-correlations", "price movements", "investment", "stocks", "invested funds", "magnetization", "cross-correlation matrix", "minimum risk level", "spin glasses", "random magnet problem"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "U", "P"]} +{"id": "1479", "title": "Agreeing with automated diagnostic aids: a study of users' concurrence strategies", "abstract": "Automated diagnostic aids that are less than perfectly reliable often produce unwarranted levels of disuse by operators. In the present study, users' tendencies to either agree or disagree with automated diagnostic aids were examined under conditions in which: (1) the aids were less than perfectly reliable but aided-diagnosis was still more accurate that unaided diagnosis; and (2) the system was completely opaque, affording users no additional information upon which to base a diagnosis. The results revealed that some users adopted a strategy of always agreeing with the aids, thereby maximizing the number of correct diagnoses made over several trials. Other users, however, adopted a probability-matching strategy in which agreement and disagreement rates matched the rate of correct and incorrect diagnoses of the aids. The probability-matching strategy, therefore, resulted in diagnostic accuracy scores that were lower than was maximally possible. Users who adopted the maximization strategy had higher self-ratings of problem-solving and decision-making skills, were more accurate in estimating aid reliabilities, and were more confident in their diagnosis on trials in which they agreed with the aids. 
The potential applications of these findings include the design of interface and training solutions that facilitate the adoption of the most effective concurrence strategies by users of automated diagnostic aids", "keyphrases": ["automated diagnostic aids", "user concurrence strategy", "probability-matching", "disagreement rates", "maximization", "problem-solving", "reliability", "complex systems", "fault diagnosis"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "M", "M"]} +{"id": "1785", "title": "The effect of a male-oriented computer gaming culture on careers in the computer industry", "abstract": "If careers in the computer industry were viewed, it would be evident that there is a conspicuous gender gap between the number of male and female employees. The same gap can be observed at the college level where males are dominating females as to those who pursue and obtain a degree in computer science. The question that this research paper intends to show is: why are males so dominant when it comes to computer related matters? The author has traced this question back to the computer game. Computer games are a fun medium and provide the means for an individual to become computer literate through the engagement of spatial learning and cognitive processing abilities. Since such games are marketed almost exclusively to males, females have a distinct disadvantage. Males are more computer literate through the playing of computer games, and are provided with an easy lead-in to more advanced utilization of computers such as programming. 
Females tend to be turned off due to the male stereotypes and marketing associated with games and thus begins the gender gap", "keyphrases": ["careers", "computer industry", "gender gap", "computer science degree", "computer games", "computer literacy", "female employees", "spatial learning", "cognitive processing", "male stereotypes", "marketing"], "prmu": ["P", "P", "P", "R", "P", "M", "P", "P", "P", "P", "P"]} +{"id": "1893", "title": "Closed-loop model set validation under a stochastic framework", "abstract": "Deals with probabilistic model set validation. It is assumed that the dynamics of a multi-input multi-output (MIMO) plant is described by a model set with unstructured uncertainties, and identification experiments are performed in closed loop. A necessary and sufficient condition has been derived for the consistency of the model set with both the stabilizing controller and closed-loop frequency domain experimental data (FDED). In this condition, only the Euclidean norm of a complex vector is involved, and this complex vector depends linearly on both the disturbances and the measurement errors. Based on this condition, an analytic formula has been derived for the sample unfalsified probability (SUP) of the model set. Some of the asymptotic statistical properties of the SUP have also been briefly discussed. 
A numerical example is included to illustrate the efficiency of the suggested method in model set quality evaluation", "keyphrases": ["closed-loop model set validation", "stochastic framework", "probabilistic model set validation", "multi-input multi-output plant", "MIMO plant", "unstructured uncertainties", "necessary and sufficient condition", "stabilizing controller", "closed-loop frequency domain experimental data", "Euclidean norm", "complex vector", "asymptotic statistical properties", "robust control", "unstructured uncertainty"], "prmu": ["P", "P", "P", "R", "R", "P", "P", "P", "P", "P", "P", "P", "M", "P"]} +{"id": "1700", "title": "Computation of unmeasured third-generation VCT views from measured views", "abstract": "We compute unmeasured cone-beam projections from projections measured by a third-generation helical volumetric computed tomography system by solving a characteristic problem for an ultrahyperbolic differential equation [John (1938)]. By working in the Fourier domain, we convert the second-order PDE into a family of first-order ordinary differential equations. A simple first-order integration is used to solve the ODES", "keyphrases": ["unmeasured third-generation VCT views computation", "measured views", "cone-beam projections", "characteristic problem solution", "ultrahyperbolic differential equation", "Fourier domain", "first-order ordinary differential equations", "simple first-order integration", "medical diagnostic imaging", "range conditions", "third-generation helical volumetric computed tomography system"], "prmu": ["R", "P", "P", "M", "P", "P", "P", "P", "U", "U", "P"]} +{"id": "1745", "title": "Approximate relaxed descent method for optimal control problems", "abstract": "We consider an optimal control problem for systems governed by ordinary differential equations with control constraints. Since no convexity assumptions are made on the data, the problem is reformulated in relaxed form. 
The relaxed state equation is discretized by the implicit trapezoidal scheme and the relaxed controls are approximated by piecewise constant relaxed controls. We then propose a combined descent and discretization method that generates sequences of discrete relaxed controls and progressively refines the discretization. Since here the adjoint of the discrete state equation is not defined, we use, at each iteration, an approximate derivative of the cost functional defined by discretizing the continuous adjoint equation and the integral involved by appropriate trapezoidal schemes. It is proved that accumulation points of sequences constructed by this method satisfy the strong relaxed necessary conditions for optimality for the continuous problem. Finally, the computed relaxed controls can be easily approximated by piecewise constant classical controls", "keyphrases": ["approximate relaxed descent method", "optimal control problems", "ordinary differential equations", "relaxed state equation discretization", "implicit trapezoidal scheme", "piecewise constant relaxed controls", "relaxed control approximation", "discrete relaxed control sequences", "discretization refinement", "discrete state equation", "cost functional approximate derivative", "trapezoidal schemes"], "prmu": ["P", "P", "P", "R", "P", "P", "R", "R", "R", "P", "R", "P"]} +{"id": "1658", "title": "Chaos theory as a framework for studying information systems", "abstract": "This paper introduces chaos theory as a means of studying information systems. It argues that chaos theory, combined with new techniques for discovering patterns in complex quantitative and qualitative evidence, offers a potentially more substantive approach to understand the nature of information systems in a variety of contexts. 
The paper introduces chaos theory concepts by way of an illustrative research design", "keyphrases": ["chaos theory", "information systems", "pattern discovery", "complex quantitative evidence", "qualitative evidence"], "prmu": ["P", "P", "M", "R", "P"]} +{"id": "149", "title": "Extending Kamp's theorem to model time granularity", "abstract": "In this paper, a generalization of Kamp's theorem relative to the functional completeness of the until operator is proved. Such a generalization consists in showing the functional completeness of more expressive temporal operators with respect to the extension of the first-order theory of linear orders MFO[<] with an extra binary relational symbol. The result is motivated by the search of a modal language capable of expressing properties and operators suitable to model time granularity in omega -layered temporal structures", "keyphrases": ["Kamp's theorem", "functional completeness", "until operator", "temporal operators", "first-order theory", "linear orders", "binary relational symbol", "omega -layered temporal structures", "model time granularity"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1559", "title": "A comparison theorem for the iterative method with the preconditioner (I + S/sub max/)", "abstract": "A.D. Gunawardena et al. (1991) have reported the modified Gauss-Seidel method with a preconditioner (I + S). In this article, we propose to use a preconditioner (I + S/sub max/) instead of (I + S). Here, S/sub max/ is constructed by only the largest element at each row of the upper triangular part of A. By using the lemma established by M. Neumann and R.J. Plemmons (1987), we get the comparison theorem for the proposed method. 
Simple numerical examples are also given", "keyphrases": ["iterative method", "preconditioner", "modified Gauss-Seidel method", "comparison theorem"], "prmu": ["P", "P", "P", "P"]} +{"id": "1781", "title": "Making it to the major leagues: career movement between library and archival professions and from small college to large university libraries", "abstract": "Issues of career movement and change are examined between library and archival fields and from small colleges to large universities. Issues examined include professional education and training, initial career-planning and placement, continuing education, scouting and mentoring, job market conditions, work experience and personal skills, professional involvement, and professional association self-interest. This examination leads to five observations: 1. It is easier, in terms of career transitions, for a librarian to become an archivist than it is for an archivist to become a librarian; 2. The progression from a small college venue to a large research university is very manageable with the proper planning and experience; 3. At least three of the career elements-professional education, career-planning, and professional association self-interest-in their best moments provide a foundation that enables a future consideration of change between institutional types and professional areas and in their worst moments conspire against the midcareer professional in terms of change; 4. The elements of scouting, continuing education, work experience, and professional involvement offer the greatest assistance in career transitions; 5. 
The job market is the wildcard that either stymies or stimulates occupational development", "keyphrases": ["career movement", "library profession", "archival profession", "small college library", "large university libraries", "professional education", "training", "continuing education", "job market", "work experience", "personal skills", "librarian", "occupational development", "midcareer"], "prmu": ["P", "R", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1739", "title": "Application of normal possibility decision rule to silence", "abstract": "The paper presents the way of combining two decision problems concerning a single (or a common) dimension, so that an effective fuzzy decision rule can be obtained. Normality of the possibility distribution is assumed, leading to possibility of fusing the respective functions related to the two decision problems and their characteristics (decisions, states of nature, utility functions, etc.). The approach proposed can be applied in cases when the statement of the problem requires making of more refined distinctions rather than considering simply a bi-criterion or bi-utility two-decision problem", "keyphrases": ["normal possibility decision rule", "silence", "conflicting objectives", "conflicting utilities", "cool head", "warm heart", "decision problems", "two-dimensional fuzzy events"], "prmu": ["P", "P", "U", "M", "U", "U", "P", "M"]} +{"id": "1812", "title": "Computing the frequency response of systems affinely depending on uncertain parameters", "abstract": "The computation of the frequency response of systems depending affinely on uncertain parameters can be reduced to that of all its one-dimensional edge plants while the image of such an edge plant at a fixed frequency is an arc or a line segment in the complex plane. Based on this conclusion, four computational formulas of the maximal and minimal (maxi-mini) magnitudes and phases of an edge plant at a fixed frequency are given. 
The formulas, besides sharing a simpler form of expression, concretely display how the extrema of the frequency response of the edge plant relate to the typical characteristics of the arc and line segment such as the centre, radius and tangent points of the arc, the distance from the origin to the line segment etc. The direct application of the results is to compute the Bode-, Nichols- and Nyquist-plot collections of the systems which are needed in robustness analysis and design", "keyphrases": ["frequency response", "uncertain parameters", "affine systems", "one-dimensional edge plants", "arc", "line segment", "Bode-plot", "Nichols-plot", "Nyquist-plot", "robustness analysis", "robustness design", "frequency-domain design methods"], "prmu": ["P", "P", "R", "P", "P", "P", "U", "U", "P", "P", "R", "M"]} +{"id": "1480", "title": "Formal verification of human-automation interaction", "abstract": "This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training materials (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. 
The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces", "keyphrases": ["formal verification", "human-automation interaction", "man-machine interaction", "automated control systems", "user interface", "autopilot", "commercial aircraft"], "prmu": ["P", "P", "M", "P", "R", "P", "P"]} +{"id": "1661", "title": "The road to perpetual progress [retail inventory management]", "abstract": "With annual revenues increasing 17.0% to 20.0% consistently over the last three years and more than 2,500 new stores opened from 1998 through 2001, Dollar General is on the fast track. However, the road to riches could have easily become the road to ruin had the retailer not exerted control over its inventory management", "keyphrases": ["Dollar General", "retailer", "inventory management"], "prmu": ["P", "P", "P"]} +{"id": "1624", "title": "Genetic algorithm for input/output selection in MIMO systems based on controllability and observability indices", "abstract": "A time domain optimisation algorithm using a genetic algorithm in conjunction with a linear search scheme has been developed to find the smallest or near-smallest subset of inputs and outputs to control a multi-input-multi-output system. 
Experimental results have shown that this proposed algorithm has a very fast convergence rate and high computation efficiency", "keyphrases": ["genetic algorithm", "input/output selection", "MIMO systems", "controllability indices", "observability indices", "time domain optimisation algorithm", "linear search scheme", "near-smallest subset", "smallest subset", "multi-input-multi-output system", "very fast convergence", "high computation efficiency", "multivariable control systems"], "prmu": ["P", "P", "P", "R", "P", "P", "P", "P", "R", "P", "P", "P", "M"]} +{"id": "170", "title": "The impact of the product mix on the value of flexibility", "abstract": "Product-mix flexibility is one of the major types of manufacturing flexibility, referring to the ability to produce a broad range of products or variants with presumed low changeover costs. The value of such a capability is important to establish for an industrial firm in order to ensure that the flexibility provided will be at the right level and used profitably rather than in excess of market requirements and consequently costly. We use option-pricing theory to analyse the impact of various product-mix issues on the value of flexibility. The real options model we use incorporates multiple products, capacity constraints as well as set-up costs. The issues treated here include the number of products, demand variability, correlation between products, and the relative demand distribution within the product mix. Thus, we are interested in the nature of the input data to analyse its effect on the value of flexibility. We also check the impact at different capacity levels. The results suggest that the value of flexibility (i) increases with an increasing number of products, (ii) decreases with increasing volatility of product demand, (iii) decreases the more positively correlated the demand is, and (iv) reduces for marginal capacity with increasing levels of capacity. 
Of these, the impact of positively correlated demand seems to be a major issue. However, the joint impact of the number of products and demand correlation showed some non-intuitive results", "keyphrases": ["product-mix flexibility", "flexible manufacturing", "manufacturing flexibility", "low changeover costs", "industrial firm", "option-pricing theory", "real options model", "multiple products", "capacity constraints", "set-up costs", "demand variability", "product correlation", "relative demand distribution", "product demand volatility", "marginal capacity", "positively correlated demand", "demand correlation", "capital budgeting"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "R", "P", "P", "P", "U"]} +{"id": "1560", "title": "Determinantal solutions of solvable chaotic systems", "abstract": "It is shown that two solvable chaotic systems, the arithmetic-harmonic mean (ARM) algorithm and the Ulam-von Neumann (UvN) map, have determinantal solutions. An additional formula for certain determinants and Riccati difference equations play a key role in both cases. Two infinite hierarchies of solvable chaotic systems are presented which have determinantal solutions", "keyphrases": ["determinantal solutions", "arithmetic-harmonic mean algorithm", "solvable chaotic systems", "Ulam-von Neumann map", "determinants", "Riccati difference equations", "Chebyshev polynomial"], "prmu": ["P", "R", "P", "R", "P", "P", "U"]} +{"id": "1525", "title": "Dependence graphs: dependence within and between groups", "abstract": "This paper applies the two-party dependence theory (Castelfranchi, Cesta and Miceli, 1992, in Y. Demazeau and E. Werner (Eds.) Decentralized AI-3, Elsevier, North Holland) to modelling multiagent and group dependence. 
These have theoretical potentialities for the study of emerging groups and collective structures, and more generally for understanding social and organisational complexity, and practical utility for both social-organisational and agent systems purposes. In the paper, the dependence theory is extended to describe multiagent links, with a special reference to group and collective phenomena, and is proposed as a framework for the study of emerging social structures, such as groups and collectives. In order to do so, we propose to extend the notion of dependence networks (applied to a single agent) to dependence graphs (applied to an agency). In its present version, the dependence theory is argued to provide (a) a theoretical instrument for the study of social complexity, and (b) a computational system for managing the negotiation process in competitive contexts and for monitoring complexity in organisational and other cooperative contexts", "keyphrases": ["dependence graphs", "group dependence", "two-party dependence theory", "multiagent dependence", "emerging groups", "collective structures", "multiagent systems", "organisational complexity", "social complexity", "agent systems", "dependence networks"], "prmu": ["P", "P", "P", "R", "P", "P", "R", "P", "P", "P", "P"]} +{"id": "1518", "title": "Explicit matrix representation for NURBS curves and surfaces", "abstract": "The matrix forms for curves and surfaces were largely promoted in CAD/CAM. In this paper we have presented two matrix representation formulations for arbitrary degree NURBS curves and surfaces explicitly other than recursively. The two approaches are derived from the computation of divided difference and the Marsden identity respectively. The explicit coefficient matrix of B-spline with equally spaced knot and Bezier curves and surfaces can be obtained by these formulae. 
The coefficient formulae and the coefficient matrix formulae developed in this paper express non-uniform B-spline functions of arbitrary degree in explicit polynomial and matrix forms. They are useful for the evaluation and the conversion of NURBS curves and surfaces in CAD/CAM systems", "keyphrases": ["explicit matrix representation", "NURBS curves", "NURBS surfaces", "CAD/CAM", "matrix representation formulations", "divided difference", "Marsden identity", "explicit coefficient matrix", "B-spline", "equally spaced knot", "Bezier curves", "Bezier surfaces", "coefficient formulae", "coefficient matrix formulae", "nonuniform B-spline functions", "explicit polynomial forms", "explicit matrix forms"], "prmu": ["P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "M", "R", "R"]} +{"id": "1619", "title": "Rate allocation for video transmission over lossy correlated networks", "abstract": "A novel rate allocation algorithm for video transmission over lossy networks subject to bursty packet losses is presented. A Gilbert-Elliot model is used at the encoder to drive the selection of coding parameters. Experimental results using the H.26L test model show a significant performance improvement with respect to the assumption of independent packet losses", "keyphrases": ["rate allocation algorithm", "video transmission", "lossy correlated networks", "bursty packet losses", "Gilbert-Elliot model", "coding parameters", "H.26L test model", "video coding"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R"]} +{"id": "1704", "title": "Statistical analysis of nonlinearly reconstructed near-infrared tomographic images. I. Theory and simulations", "abstract": "Near-infrared (NIR) diffuse tomography is an emerging method for imaging the interior of tissues to quantify concentrations of hemoglobin and exogenous chromophores noninvasively in vivo. 
It often exploits an optical diffusion model-based image reconstruction algorithm to estimate spatial property values from measurements of the light flux at the surface of the tissue. In this study, mean-squared error (MSE) over the image is used to evaluate methods for regularizing the ill-posed inverse image reconstruction problem in NIR tomography. Estimates of image bias and image standard deviation were calculated based upon 100 repeated reconstructions of a test image with randomly distributed noise added to the light flux measurements. It was observed that the bias error dominates at high regularization parameter values while variance dominates as the algorithm is allowed to approach the optimal solution. This optimum does not necessarily correspond to the minimum projection error solution, but typically requires further iteration with a decreasing regularization parameter to reach the lowest image error. Increasing measurement noise causes a need to constrain the minimum regularization parameter to higher values in order to achieve a minimum in the overall image MSE", "keyphrases": ["medical diagnostic imaging", "hemoglobin", "oxygen saturation", "photon migration", "optical diffusion model-based image reconstruction algorithm", "decreasing regularization parameter", "lowest image error", "minimum regularization parameter constraint", "bias error", "optimal solution", "light flux", "mean-squared error", "ill-posed inverse image reconstruction problem regularization", "spatial property values estimation", "test image", "randomly distributed noise", "O/sub 2/"], "prmu": ["M", "P", "U", "U", "P", "P", "P", "M", "P", "P", "P", "P", "R", "R", "P", "P", "U"]} +{"id": "1741", "title": "The top cycle and uncovered solutions for weak tournaments", "abstract": "We study axiomatic properties of the top cycle and uncovered solutions for weak tournaments. 
Subsequently, we establish its connection with the rational choice theory", "keyphrases": ["top cycle", "uncovered solutions", "weak tournaments", "axiomatic properties", "rational choice theory"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1897", "title": "User-appropriate tyre-modelling for vehicle dynamics in standard and limit situations", "abstract": "When modelling vehicles for the vehicle dynamic simulation, special attention must be paid to the modelling of tyre forces and -torques, according to their dominant influence on the results. This task is not only about sufficiently exact representation of the effective forces but also about user-friendly and practical relevant applicability, especially when the experimental tyre-input-data is incomplete or missing. This text firstly describes the basics of the vehicle dynamic tyre model, conceived to be a physically based, semi-empirical model for application in connection with multi-body-systems (MBS). On the basis of tyres for a passenger car and a heavy truck the simulated steady state tyre characteristics are shown together and compared with the underlying experimental values. The possibility to link the tyre model TMeasy to any MBS-program is described, as far as it supports the 'Standard Tyre Interface'. 
As an example, the simulated and experimental data of a heavy truck doing a standardized driving manoeuvre are compared", "keyphrases": ["tyre modelling", "vehicle dynamics", "standard situations", "limit situations", "tyre torques", "semi-empirical model", "multi-body-systems", "passenger car", "heavy truck", "simulated steady state tyre characteristics", "TMeasy", "Standard Tyre Interface", "standardized driving manoeuvre"], "prmu": ["P", "P", "R", "P", "M", "P", "P", "P", "P", "P", "P", "R", "P"]} +{"id": "1916", "title": "Changes in the entropy and the Tsallis difference information during spontaneous decay and self-organization of nonextensive systems", "abstract": "A theoretical-information description of self-organization processes during stimulated transitions between stationary states of open nonextensive systems is presented. S/sub q/- and I/sub q/-theorems on changes of the entropy and Tsallis difference information measures in the process of evolution in the space of control parameters are proved. 
The entropy and the Tsallis difference information are derived and their new extreme properties are discussed", "keyphrases": ["entropy", "Tsallis difference information", "spontaneous decay", "self-organization", "nonextensive systems", "stimulated transitions", "information measures", "control parameters", "nonextensive statistical mechanics"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "M"]} +{"id": "1584", "title": "Content all clear [workflow & content management]", "abstract": "Graeme Muir of SchlumbergerSema cuts through the confusion between content, document and records management", "keyphrases": ["SchlumbergerSema", "content management", "document management", "records management"], "prmu": ["P", "P", "R", "P"]} +{"id": "1678", "title": "Parallel interior point schemes for solving multistage convex programming", "abstract": "The predictor-corrector interior-point path-following algorithm is promising in solving multistage convex programming problems. Among many other general good features of this algorithm, especially attractive is that the algorithm allows the possibility to parallelise the major computations. The dynamic structure of the multistage problems specifies a block-tridiagonal system at each Newton step of the algorithm. 
A wrap-around permutation is then used to implement the parallel computation for this step", "keyphrases": ["parallel interior point schemes", "multistage convex programming", "predictor-corrector interior-point path-following algorithm", "dynamic structure", "block-tridiagonal system", "Newton step", "wrap-around permutation", "parallel computation"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1685", "title": "Use of web technologies in construction project management: what are the critical success/failure factors?", "abstract": "A concept of how the World Wide Web (WWW) and its associated technologies can be used to manage construction projects has been recognized by practitioners in the construction industry for quite sometime. This concept is often referred to as a Web-Based Project Management System (WPMS). It promises, to enhance construction project documentation and control, and to revolutionize the way construction project teams process and transmit project information. WPMS is an electronic project-management system conducted through the Internet. The system provides a centralized, commonly accessible, reliable means of transmitting and storing project information. Project information is stored on the server and a standard Web browser is used as the gateway to exchange this information, eliminating geographic and hardware platforms boundary", "keyphrases": ["Web-Based Project Management System", "construction industry", "project documentation", "project control", "success", "implementation", "Web browser"], "prmu": ["P", "P", "P", "R", "U", "U", "P"]} +{"id": "169", "title": "MRP in a job shop environment using a resource constrained project scheduling model", "abstract": "One of the most difficult tasks in a job shop manufacturing environment is to balance schedule and capacity in an ongoing basis. MRP systems are commonly used for scheduling, although their inability to deal with capacity constraints adequately is a severe drawback. 
In this study, we show that material requirements planning can be done more effectively in a job shop environment using a resource constrained project scheduling model. The proposed model augments MRP models by incorporating capacity constraints and using variable lead time lengths. The efficacy of this approach is tested on MRP systems by comparing the inventory carrying costs and resource allocation of the solutions obtained by the proposed model to those obtained by using a traditional MRP model. In general, it is concluded that the proposed model provides improved schedules with considerable reductions in inventory carrying costs", "keyphrases": ["job shop environment", "MRP", "resource constrained project scheduling model", "material requirements planning", "scheduling", "capacity constraints", "variable lead time lengths", "inventory carrying costs", "resource allocation", "project management"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "M"]} +{"id": "1798", "title": "Robustness evaluation of a minimal RBF neural network for nonlinear-data-storage-channel equalisation", "abstract": "The authors present a performance-robustness evaluation of the recently developed minimal resource allocation network (MRAN) for equalisation in highly nonlinear magnetic recording channels in disc storage systems. Unlike communication systems, equalisation of signals in these channels is a difficult problem, as they are corrupted by data-dependent noise and highly nonlinear distortions. Nair and Moon (1997) have proposed a maximum signal to distortion ratio (MSDR) equaliser for data storage channels, which uses a specially designed neural network, where all the parameters of the neural network are determined theoretically, based on the exact knowledge of the channel model parameters. 
In the present paper, the performance of the MSDR equaliser is compared with that of the MRAN equaliser using a magnetic recording channel model, under conditions that include variations in partial erasure, jitter, width and noise power, as well as model mismatch. Results from the study indicate that the less complex MRAN equaliser gives consistently better performance robustness than the MSDR equaliser in terms of signal to distortion ratios (SDRs)", "keyphrases": ["robustness evaluation", "minimal resource allocation network", "highly nonlinear magnetic recording channels", "disc storage systems", "nonlinear-data-storage-channel equalisation", "data-dependent noise", "highly nonlinear distortions", "maximum signal to distortion ratio equaliser", "RBF neural network", "MRAN equaliser", "MSDR equaliser", "digital magnetic recording", "jitter noise"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "P", "M", "R"]} +{"id": "1464", "title": "LR parsing for conjunctive grammars", "abstract": "The generalized LR parsing algorithm for context-free grammars, introduced by Tomita in 1986, is a polynomial-time implementation of nondeterministic LR parsing that uses graph-structured stack to represent the contents of the nondeterministic parser's pushdown for all possible branches of computation at a single computation step. It has been specifically developed as a solution for practical parsing tasks arising in computational linguistics, and indeed has proved itself to be very suitable for natural language processing. Conjunctive grammars extend context-free grammars by allowing the use of an explicit intersection operation within grammar rules. This paper develops a new LR-style parsing algorithm for these grammars, which is based on the very same idea of a graph-structured pushdown, where the simultaneous existence of several paths in the graph is used to perform the mentioned intersection operation. 
The underlying finite automata are treated in the most general way: instead of showing the algorithm's correctness for some particular way of constructing automata, the paper defines a wide class of automata usable with a given grammar, which includes not only the traditional LR(k) automata, but also, for instance, a trivial automaton with a single reachable state. A modification of the SLR(k) table construction method that makes use of specific properties of conjunctive grammars is provided as one possible way of making finite automata to use with the algorithm", "keyphrases": ["conjunctive grammars", "generalized LR parsing algorithm", "graph-structured stack", "nondeterministic parser pushdown", "computation", "computational linguistics", "natural language processing", "context-free grammars", "explicit intersection operation", "grammar rules", "finite automata", "trivial automaton", "single reachable state", "Boolean closure", "deterministic context-free languages"], "prmu": ["P", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U", "M"]} +{"id": "1499", "title": "A digital-driving system for smart vehicles", "abstract": "In the wake of the computer and information technology revolutions, vehicles are undergoing dramatic changes in their capabilities and how they interact with drivers. Although some vehicles can decide to either generate warnings for the human driver or control the vehicle autonomously, they must usually make these decisions in real time with only incomplete information. So, human drivers must still maintain control over the vehicle. I sketch a digital driving behavior model. 
By simulating and analyzing driver behavior during different maneuvers such as lane changing, lane following, and traffic avoidance, researchers participating in the Beijing Institute of Technology's digital-driving project will be able to examine the possible correlations or causal relations between the smart vehicle, IVISs, the intelligent road-traffic-information network, and the driver. We aim to successfully demonstrate that a digital-driving system can provide a direction for developing human-centered smart vehicles", "keyphrases": ["digital driving system", "human-centered smart vehicles", "in-vehicle information systems", "intelligence", "intelligent driver-vehicle interface", "ecological driver-vehicle interface", "vehicle control", "interactive communication", "intelligent road traffic information network", "intelligent transportation systems", "maneuvers", "traffic avoidance", "lane following", "lane changing"], "prmu": ["R", "P", "M", "P", "M", "U", "R", "M", "M", "M", "P", "P", "P", "P"]} +{"id": "1765", "title": "On bandlimited scaling function", "abstract": "This paper discusses band-limited scaling function, especially the single interval band case and three interval band cases. Their relationship to oversampling property and weakly translation invariance are also studied. At the end, we propose an open problem", "keyphrases": ["bandlimited scaling function", "interval band case", "oversampling property", "weakly translation invariance"], "prmu": ["P", "P", "P", "P"]} +{"id": "1873", "title": "A phytography of WALDMEISTER", "abstract": "The architecture of the WALDMEISTER prover for unit equational deduction is based on a strict separation of active and passive facts. 
After an inspection of the system's proof procedure, the representation of each of the central data structures is outlined, namely indexing for the active facts, compression for the passive facts, successor sets for the hypotheses, and minimal recording of inference steps for the proof object. In order to cope with large search spaces, specialized redundancy criteria are employed, and the empirically gained control knowledge is integrated to ease the use of the system. The paper concludes with a quantitative comparison of the WALDMEISTER versions over the years, and a view of the future prospects", "keyphrases": ["WALDMEISTER", "theorem prover", "unit equational deduction", "passive facts", "active facts", "data structures", "indexing", "hypotheses", "phytography", "CADE ATP System Competition", "inference", "large search spaces", "redundancy", "future prospects"], "prmu": ["P", "M", "P", "P", "P", "P", "P", "P", "P", "M", "P", "P", "P", "P"]} +{"id": "1836", "title": "Parcel boundary identification with computer-assisted boundary overlay process for Taiwan", "abstract": "The study investigates the design of a process for parcel boundary identification with cadastral map overlay using the principle of least squares. The objective of this research is to provide an objective tool for boundary identification survey. The proposed process includes an adjustment model, a weighting scheme, and other related operations. A numerical example is included", "keyphrases": ["parcel boundary identification", "computer assisted boundary overlay process", "Taiwan", "cadastral map overlay", "objective tool", "boundary identification survey", "adjustment model", "weighting scheme", "Gauss-Marker model", "geographic information system", "weighted least squares adjustment"], "prmu": ["P", "M", "P", "P", "P", "P", "P", "P", "M", "U", "R"]} +{"id": "1758", "title": "Hilbert modular threefolds of arithmetic genus one", "abstract": "D. 
Weisser (1981) proved that there are exactly four Galois cubic number fields with Hilbert modular threefolds of arithmetic genus one. In this paper, we extend Weisser's work to cover all cubic number fields. Our main result is that there are exactly 33 fields with Hilbert modular threefolds of arithmetic genus one. These fields are enumerated explicitly", "keyphrases": ["Hilbert modular threefolds", "arithmetic genus one", "Galois cubic number fields"], "prmu": ["P", "P", "P"]} +{"id": "154", "title": "Verifying concurrent systems with symbolic execution", "abstract": "Current techniques for interactively proving temporal properties of concurrent systems translate transition systems into temporal formulas by introducing program counter variables. Proofs are not intuitive, because control flow is not explicitly considered. For sequential programs symbolic execution is a very intuitive, interactive proof strategy. In this paper we adopt this technique for parallel programs. Properties are formulated in interval temporal logic. An implementation in the interactive theorem prover KIV has shown that this technique offers a high degree of automation and allows simple, local invariants", "keyphrases": ["concurrent systems verification", "symbolic execution", "temporal properties", "concurrent systems", "transition systems", "temporal formulas", "program counter variables", "sequential programs", "parallel programs", "interactive theorem prover KIV", "local invariants"], "prmu": ["M", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1544", "title": "Driving the NKK Smartswitch.2. Graphics and text", "abstract": "Whether your message is one of workplace safety or world peace, the long nights of brooding over ways to tell the world are over. Part 1 described the basic interface to drive the Smartswitch. Part 2 adds the bells and whistles to allow both text and messages to be placed anywhere on the screen. 
It considers character generation, graphic generation and the user interface", "keyphrases": ["NKK Smartswitch", "computer graphics", "text", "messages", "character generation", "graphic generation", "user interface"], "prmu": ["R", "M", "P", "P", "P", "P", "P"]} +{"id": "1501", "title": "Computational challenges in cell simulation: a software engineering approach", "abstract": "Molecular biology's advent in the 20th century has exponentially increased our knowledge about the inner workings of life. We have dozens of completed genomes and an array of high-throughput methods to characterize gene encodings and gene product operation. The question now is how we will assemble the various pieces. In other words, given sufficient information about a living cell's molecular components, can we predict its behavior? We introduce the major classes of cellular processes relevant to modeling, discuss software engineering's role in cell simulation, and identify cell simulation requirements. Our E-Cell project aims to develop the theories, techniques, and software platforms necessary for whole-cell-scale modeling, simulation, and analysis. Since the project's launch in 1996, we have built a variety of cell models, and we are currently developing new models that vary with respect to species, target subsystem, and overall scale", "keyphrases": ["cell simulation", "software engineering", "object-oriented design", "molecular biology", "E-Cell project", "whole-cell-scale modeling"], "prmu": ["P", "P", "U", "P", "P", "P"]} +{"id": "1645", "title": "Effects of the transition to a client-centred team organization in administrative surveying work", "abstract": "A new work organization was introduced in administrative surveying work in Sweden during 1998. 
The new work organization implied a transition to a client-centred team-based organization and required a change in competence from specialist to generalist knowledge as well as a transition to a new information technology, implying a greater integration within the company. The aim of this study was to follow the surveyors for two years from the start of the transition and investigate how perceived consequences of the transition, job, organizational factors, well-being and effectiveness measures changed between 1998 and 2000. The Teamwork Profile and QPS Nordic questionnaire were used. The 205 surveyors who participated in all three study phases constituted the study group. The result showed that surveyors who perceived that they were working as generalists rated the improvements in job and organizational factors significantly higher than those who perceived that they were not yet generalists. Improvements were noted in 2000 in quality of service to clients, time available to handle a case and effectiveness of teamwork. In a transfer to a team-based work organization, group cohesion and continuous improvement practices-for example, learning by doing, mentoring and guided delegation-were important to improve the social effectiveness of group work", "keyphrases": ["client-centred team organization", "administrative surveying work", "information technology", "company", "job", "organizational factors", "effectiveness measures", "Teamwork Profile", "QPS Nordic questionnaire", "social effectiveness", "public administrative sector"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M"]} +{"id": "1600", "title": "The development and evaluation of a fuzzy logic expert system for renal transplantation assignment: Is this a useful tool?", "abstract": "Allocating donor kidneys to patients is a complex, multicriteria decision-making problem which involves not only medical, but also ethical and political issues. 
In this paper, a fuzzy logic expert system approach was proposed as an innovative way to deal with the vagueness and complexity faced by medical doctors in kidney allocation decision making. A pilot fuzzy logic expert system for kidney allocation was developed and evaluated in comparison with two existing allocation algorithms: a priority sorting system used by multiple organ retrieval and exchange (MORE) in Canada and a point scoring systems used by united network for organ sharing (UNOS) in US. Our simulated experiment based on real data indicated that the fuzzy logic system can represent the expert's thinking well in handling complex tradeoffs, and overall, the fuzzy logic derived recommendations were more acceptable to the expert than those from the MORE and UNOS algorithms", "keyphrases": ["renal transplantation assignment", "fuzzy logic expert system", "donor kidneys", "multicriteria decision-making problem", "kidney allocation decision making", "priority sorting system", "multiple organ retrieval exchange", "point scoring systems", "united network for organ sharing", "simulated experiment", "complex tradeoff handling"], "prmu": ["P", "P", "P", "P", "P", "P", "R", "P", "P", "P", "R"]} +{"id": "1871", "title": "Strong and weak points of the MUSCADET theorem prover-examples from CASC-JC", "abstract": "MUSCADET is a knowledge-based theorem prover based on natural deduction. It has participated in CADE Automated theorem proving System Competitions. The results show its complementarity with regard to resolution-based provers. 
This paper presents some of its crucial methods and gives some examples of MUSCADET proofs from the last competition (CASC-JC in IJCAR 2001)", "keyphrases": ["MUSCADET", "CASC-JC", "knowledge-based theorem prover", "natural deduction", "CADE Automated theorem proving System Competitions", "resolution-based provers"], "prmu": ["P", "P", "P", "P", "P", "P"]} +{"id": "1834", "title": "A formal model of correctness in a cadastre", "abstract": "A key issue for cadastral systems is the maintenance of their correctness. Correctness is defined to be the proper correspondence between the valid legal situation and the content of the cadastre. This correspondence is generally difficult to achieve, since the cadastre is not a complete representation of all aspects influencing the legal situation in reality. The goal of the paper is to develop a formal model comprising representations of the cadastre and of reality that allows the simulation and investigation of cases where this correspondence is potentially violated. For this purpose the model consists of two parts, the first part represents the valid legal situation and the second part represents the cadastre. This makes it feasible to mark the differences between reality and the cadastre. The marking together with the two parts of the model facilitate the discussion of issues in \"real-world\" cadastral systems where incorrectness occurs. In order to develop a formal model, the paper uses the transfer of ownership of a parcel between two persons as minimal case study. The foundation for the formalization is a modern version of the situation calculus. 
The focus moves from the analysis of the cadastre to the preparation of a conceptual and a formalized model and the implementation of a prototype", "keyphrases": ["formal correctness model", "cadastre", "cadastral systems", "correctness maintenance", "legal situation", "formal model", "transfer of ownership", "minimal case study", "situation calculus", "formalized model"], "prmu": ["R", "P", "P", "R", "P", "P", "P", "P", "P", "P"]} +{"id": "1647", "title": "Examining children's reading performance and preference for different computer-displayed text", "abstract": "This study investigated how common online text affects reading performance of elementary school-age children by examining the actual and perceived readability of four computer-displayed typefaces at 12- and 14-point sizes. Twenty-seven children, ages 9 to 11, were asked to read eight children's passages and identify erroneous/substituted words while reading. Comic Sans MS, Arial and Times New Roman typefaces, regardless of size, were found to be more readable (as measured by a reading efficiency score) than Courier New. No differences in reading speed were found for any of the typeface combinations. In general, the 14-point size and the examined sans serif typefaces were perceived as being the easiest to read, fastest, most attractive, and most desirable for school-related material. In addition, participants significantly preferred Comic Sans MS and 14-point Arial to 12-point Courier. 
Recommendations for appropriate typeface combinations for children reading on computers are discussed", "keyphrases": ["child reading performance", "computer-displayed text", "online text", "elementary school-age children", "computer-displayed typefaces", "fonts", "user interface", "human factors", "educational computing"], "prmu": ["M", "P", "P", "P", "P", "U", "U", "U", "M"]} +{"id": "1602", "title": "An optimization approach to plan for reusable software components", "abstract": "It is well acknowledged in software engineering that there is a great potential for accomplishing significant productivity improvements through the implementation of a successful software reuse program. On the other hand, such gains are attainable only by instituting detailed action plans at both the organizational and program level. Given this need, the paucity of research papers related to planning, and in particular, optimized planning is surprising. This research, which is aimed at this gap, brings out an application of optimization for the planning of reusable software components (SCs). We present a model that selects a set of SCs that must be built, in order to lower development and adaptation costs. We also provide implications to project management based on simulation, an approach that has been adopted by other cost models in the software engineering literature. Such a prescriptive model does not exist in the literature", "keyphrases": ["software engineering", "productivity improvements", "software reuse program", "optimization", "action plans", "optimized planning", "reusable software components", "adaptation costs", "development costs", "project management", "simulation"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P"]} +{"id": "1929", "title": "Optimal time of switching between portfolios of securities", "abstract": "Optimal time of switching between several portfolios of securities are found for the purpose of profit maximization. 
Two methods of their determination are considered. The cases with three and n portfolios are studied in detail", "keyphrases": ["optimal time", "portfolios of securities", "profit maximization"], "prmu": ["P", "P", "P"]} +{"id": "156", "title": "Using extended logic programming for alarm-correlation in cellular phone networks", "abstract": "Alarm correlation is a necessity in large mobile phone networks, where the alarm bursts resulting from severe failures would otherwise overload the network operators. We describe how to realize alarm-correlation in cellular phone networks using extended logic programming. To this end, we describe an algorithm and system solving the problem, a model of a mobile phone network application, and a detailed solution for a specific scenario", "keyphrases": ["extended logic programming", "alarm-correlation", "cellular phone networks", "large mobile phone networks", "network operators", "fault diagnosis"], "prmu": ["P", "P", "P", "P", "P", "U"]} +{"id": "1546", "title": "Necessary conditions of optimality for impulsive systems on Banach spaces", "abstract": "We present necessary conditions of optimality for optimal control problems arising in systems governed by impulsive evolution equations on Banach spaces. Basic notations and terminologies are first presented and necessary conditions of optimality are presented. Special cases are discussed and we present an application to the classical linear quadratic regulator problem", "keyphrases": ["linear quadratic regulator", "optimality", "impulsive systems", "optimal control", "impulsive evolution equations", "Banach spaces", "necessary conditions"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} +{"id": "1503", "title": "Neural networks for web content filtering", "abstract": "With the proliferation of harmful Internet content such as pornography, violence, and hate messages, effective content-filtering systems are essential. 
Many Web-filtering systems are commercially available, and potential users can download trial versions from the Internet. However, the techniques these systems use are insufficiently accurate and do not adapt well to the ever-changing Web. To solve this problem, we propose using artificial neural networks to classify Web pages during content filtering. We focus on blocking pornography because it is among the most prolific and harmful Web content. However, our general framework is adaptable for filtering other objectionable Web material", "keyphrases": ["artificial neural networks", "Intelligent Classification Engine", "learning capabilities", "pornographic/nonpornographic Web page differentiation", "Web content filtering", "violence", "Web page classification", "harmful Web content"], "prmu": ["P", "U", "U", "M", "P", "P", "M", "P"]} +{"id": "1687", "title": "Cleared for take-off [Hummingbird Enterprise]", "abstract": "A recent Gartner report identifies Hummingbird in the first wave of vendors as an early example of convergence in the 'smart enterprise suite' market. We spoke to Hummingbird's Marketing Director for Northern Europe", "keyphrases": ["smart enterprise suite", "Hummingbird Enterprise", "information content", "knowledge content", "collaboration"], "prmu": ["M", "P", "U", "U", "U"]} +{"id": "1914", "title": "Vacuum-compatible vibration isolation stack for an interferometric gravitational wave detector TAMA300", "abstract": "Interferometric gravitational wave detectors require a large degree of vibration isolation. For this purpose, a multilayer stack constructed of rubber and metal blocks is suitable, because it provides isolation in all degrees of freedom at once. In TAMA300, a 300 m interferometer in Japan, long-term dimensional stability and compatibility with an ultrahigh vacuum environment of about 10/sup -6/ Pa are also required. 
To keep the interferometer at its operating point despite ground strain and thermal drift of the isolation system, a thermal actuator was introduced. To prevent the high outgassing rate of the rubber from spoiling the vacuum, the rubber blocks were enclosed by gas-tight bellows. Using these techniques, we have successfully developed a three-layer stack which has a vibration isolation ratio of more than 10/sup 3/ at 300 Hz with control of drift and enough vacuum compatibility", "keyphrases": ["vibration isolation stack", "TAMA300 interferometer", "interferometric gravitational wave detectors", "rubber blocks", "multilayer stack", "metal blocks", "long-term dimensional stability", "ultrahigh vacuum environment", "operating point", "ground strain", "thermal drift", "thermal actuator", "gas-tight bellows", "rubber outgassing", "vacuum compatibility", "300 m", "10/sup -6/ Pa", "300 Hz"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "P", "P"]} +{"id": "1767", "title": "Bivariate fractal interpolation functions on rectangular domains", "abstract": "Non-tensor product bivariate fractal interpolation functions defined on gridded rectangular domains are constructed. Linear spaces consisting of these functions are introduced. The relevant Lagrange interpolation problem is discussed. A negative result about the existence of affine fractal interpolation functions defined on such domains is obtained", "keyphrases": ["bivariate fractal interpolation functions", "rectangular domains", "gridded rectangular domains", "linear spaces", "Lagrange interpolation problem", "affine fractal interpolation functions"], "prmu": ["P", "P", "P", "P", "P", "P"]} +{"id": "1809", "title": "Approach to adaptive neural net-based H/sub infinity / control design", "abstract": "An approach is investigated for the adaptive neural net-based H/sub infinity / control design of a class of nonlinear uncertain systems. 
In the proposed framework, two multilayer feedforward neural networks are constructed as an alternative to approximate the nonlinear system. The neural networks are piecewisely interpolated to generate a linear differential inclusion model by which a linear state feedback H/sub infinity / control law can be applied. An adaptive weight adjustment mechanism for the multilayer feedforward neural networks is developed to ensure H/sub infinity / regulation performance. It is shown that finding the control gain matrices can be transformed into a standard linear matrix inequality problem and solved via a developed recurrent neural network", "keyphrases": ["adaptive neural net-based H/sub infinity / control design", "nonlinear uncertain systems", "multilayer feedforward neural networks", "piecewise interpolation", "linear differential inclusion model", "linear state feedback", "control gain matrices", "linear matrix inequality problem", "recurrent neural network", "LMI"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "U"]} +{"id": "1466", "title": "Feldkamp-type image reconstruction from equiangular data", "abstract": "The cone-beam approach for image reconstruction attracts increasing attention in various applications, especially medical imaging. Previously, the traditional practical cone-beam reconstruction method, the Feldkamp algorithm, was generalized into the case of spiral/helical scanning loci with equispatial cone-beam projection data. In this paper, we formulated the generalized Feldkamp algorithm in the case of equiangular cone-beam projection data, and performed numerical simulation to evaluate the image quality. 
Because medical multi-slice/cone-beam CT scanners typically use equiangular projection data, our new formula may be useful in this area as a framework for further refinement and a benchmark for comparison", "keyphrases": ["Feldkamp-type image reconstruction", "equiangular data", "cone-beam approach", "medical imaging", "practical cone-beam reconstruction method", "spiral/helical scanning loci", "equispatial cone-beam projection data", "generalized Feldkamp algorithm", "equiangular cone-beam projection data", "numerical simulation", "image quality", "medical multi-slice/cone-beam CT scanners"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1895", "title": "An algorithm combining neural networks with fundamental parameters", "abstract": "An algorithm combining neural networks with the fundamental parameters equations (NNFP) is proposed for making corrections for non-linear matrix effects in x-ray fluorescence analysis. In the algorithm, neural networks were applied to relate the concentrations of components to both the measured intensities and the relative theoretical intensities calculated by the fundamental parameter equations. The NNFP algorithm is compared with the classical theoretical correction models, including the fundamental parameters approach, the Lachance-Traill model, a hyperbolic function model and the COLA algorithm. For an alloy system with 15 measured elements, in most cases, the prediction errors of the NNFP algorithm are lower than those of the fundamental parameters approach, the Lachance-Traill model, the hyperbolic function model and the COLA algorithm separately. If there are the serious matrix effects, such as matrix effects among Cr, Fe and Ni, the NNFP algorithm generally decreased predictive errors as compared with the classical models, except for the case of Cr by the fundamental parameters approach. 
The main reason why the NNFP algorithm has generally a better predictive ability than the classical theoretical correction models might be that neural networks can better calibrate the non-linear matrix effects in a complex multivariate system", "keyphrases": ["algorithm", "neural networks", "fundamental parameters", "fundamental parameters equations", "nonlinear matrix effects", "x-ray fluorescence analysis", "intensities", "NNFP algorithm", "theoretical correction models", "Lachance-Traill model", "hyperbolic function model", "COLA algorithm", "alloy system", "Cr", "Fe", "Ni", "complex multivariate system"], "prmu": ["P", "P", "P", "P", "M", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1868", "title": "Estimation of an N-L-N Hammerstein-Wiener model", "abstract": "Estimation of a single-input single-output block-oriented model is studied. The model consists of a linear block embedded between two static nonlinear gains. Hence, it is called an N-L-N Hammerstein-Wiener model. First, the model structure is motivated and the disturbance model is discussed. The paper then concentrates on parameter estimation. A relaxation iteration scheme is proposed by making use of a model structure in which the error is bilinear-in-parameters. This leads to a simple algorithm which minimizes the original loss function. The convergence and consistency of the algorithm are studied. In order to reduce the variance error, the obtained linear model is further reduced using frequency weighted model reduction. 
A simulation study is used to illustrate the method", "keyphrases": ["N-L-N Hammerstein-Wiener model", "single-input single-output block-oriented model", "linear block", "static nonlinear gains", "model structure", "disturbance model", "parameter estimation", "relaxation iteration scheme", "bilinear-in-parameters error", "convergence", "consistency", "variance error", "frequency weighted model reduction", "nonlinear process"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "P", "P", "M"]} +{"id": "1706", "title": "Quantitative analysis of reconstructed 3-D coronary arterial tree and intracoronary devices", "abstract": "Traditional quantitative coronary angiography is performed on two-dimensional (2-D) projection views. These views are chosen by the angiographer to minimize vessel overlap and foreshortening. With 2-D projection views that are acquired in this nonstandardized fashion, however, there is no way to know or estimate how much error occurs in the QCA process. Furthermore, coronary arteries possess a curvilinear shape and undergo a cyclical deformation due to their attachment to the myocardium. Therefore, it is necessary to obtain three-dimensional (3-D) information to best describe and quantify the dynamic curvilinear nature of the human coronary artery. 
Using a patient-specific 3-D coronary reconstruction algorithm and routine angiographic images, a new technique is proposed to describe: (1) the curvilinear nature of 3-D coronary arteries and intracoronary devices; (2) the magnitude of the arterial deformation caused by intracoronary devices and due to heart motion; and (3) optimal view(s) with respect to the desired \"pathway\" for delivering intracoronary devices", "keyphrases": ["medical diagnostic imaging", "cyclical deformation", "myocardium", "dynamic curvilinear nature quantification", "patient-specific 3-D coronary reconstruction algorithm", "routine angiographic images", "arterial deformation magnitude", "intracoronary devices delivery pathway", "human coronary artery"], "prmu": ["M", "P", "P", "M", "P", "P", "R", "M", "P"]} +{"id": "1743", "title": "Adaptive stabilization of undamped flexible structures", "abstract": "In the paper non-identifier-based adaptive stabilization of undamped flexible structures is considered in the case of collocated input and output operators. The systems have poles and zeros on the imaginary axis. In the case where velocity feedback is available, the adaptive stabilizer is constructed by an adaptive PD-controller (proportional plus derivative controller). In the case where only position feedback is available, the adaptive stabilizer is constructed by an adaptive P-controller for the augmented system which consists of the controlled system and a parallel compensator. 
Numerical examples are given to illustrate the effectiveness of the proposed controllers", "keyphrases": ["adaptive stabilization", "undamped flexible structures", "poles and zeros", "imaginary axis", "velocity feedback", "adaptive PD-controller", "proportional plus derivative controller", "position feedback", "adaptive P-controller", "augmented system", "parallel compensator"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1855", "title": "Distribution software: ROI is king", "abstract": "Middle-market accounting software vendors are taking to the open road, by way of souped-up distribution suites that can track product as it wends its way from warehouse floor to customer site. Integration provides efficiencies, and cost savings", "keyphrases": ["accounting software", "warehouse management", "distribution"], "prmu": ["P", "M", "P"]} +{"id": "1810", "title": "Input-output based pole-placement controller for a class of time-delay systems", "abstract": "A controller structure valid for SISO plants involving both internal and external point delays is presented. The control signal is based only on the input and output plant signals. The controller allows finite or infinite spectrum assignment. The most important feature of the proposed controller is that it only involves the use of a class of point-delayed signals. Thus the controller synthesis involves less computational cost than former methods. 
Since the plant control input is generated by filtering the input and output plant signals, this controller structure is potentially applicable to the adaptive case of unknown plant parameters", "keyphrases": ["I/O-based pole-placement controller", "input-output based pole-placement controller", "time-delay systems", "SISO plants", "internal point delays", "and external point delays", "finite spectrum assignment", "infinite spectrum assignment", "point-delayed signals", "controller synthesis", "computational cost", "filtering"], "prmu": ["M", "P", "P", "P", "R", "P", "R", "P", "P", "P", "P", "P"]} +{"id": "1482", "title": "A parareal in time procedure for the control of partial differential equations", "abstract": "We have proposed in a previous note a time discretization for partial differential evolution equation that allows for parallel implementations. This scheme is here reinterpreted as a preconditioning procedure on an algebraic setting of the time discretization. This allows for extending the parallel methodology to the problem of optimal control for partial differential equations. We report a first numerical implementation that reveals a large interest", "keyphrases": ["time procedure", "partial differential equation control", "evolution equation", "preconditioning procedure", "Hilbert space", "algebraic setting", "time discretization", "optimal control"], "prmu": ["P", "R", "P", "P", "U", "P", "P", "P"]} +{"id": "1783", "title": "Becoming a chief librarian: an analysis of transition stages in academic library leadership", "abstract": "The author explores how the four-part model of transition cycles identified by Nicholson and West (1988) applies to becoming a chief librarian of an academic library. The four stages: preparation, encounter, adjustment, and stabilization, are considered from the micro-, mezzo-, and macrolevels of the organization, as well as for their psychological and social impact on the new job incumbent. 
An instrument for assessment of transitional success which could be administered in the adjustment or stabilization stage is considered", "keyphrases": ["chief librarian", "transition stages", "academic library leadership", "organization", "psychological impact", "social impact", "job", "transition cycles model"], "prmu": ["P", "P", "P", "P", "R", "P", "P", "R"]} +{"id": "172", "title": "A VMEbus interface for multi-detector trigger and control system", "abstract": "MUSE (MUltiplicity SElector) is the trigger and control system of CHIMERA, a 4 pi charged particle detector. Initialization of MUSE can be performed via VMEbus. This paper describes the design of VMEbus interface and functional module in MUSE, and briefly discusses an application of MUSE", "keyphrases": ["VMEbus interface", "MUSE", "CHIMERA", "trigger system", "control system"], "prmu": ["P", "P", "P", "R", "P"]} +{"id": "1562", "title": "Solution of a class of two-dimensional integral equations", "abstract": "The two-dimensional integral equation 1/ pi integral integral /sub D/( phi (r, theta )/R/sup 2/)dS=f(r/sub 0/, theta /sub 0/) defined on a circular disk D: r/sub 0/or=5, n is not a multiple of 3 and (h, n)=1, where h is the class number of the field Q( square root (-q)), then the diophantine equation x/sup 2/+q/sup 2k+1/=y/sup n/ has exactly two families of solutions (q, n, k, x, y)", "keyphrases": ["diophantine equation", "odd prime", "odd integer", "Lucas sequence", "primitive divisors"], "prmu": ["P", "P", "P", "U", "U"]} +{"id": "1713", "title": "A uniform framework for regulating service access and information release on the Web", "abstract": "The widespread use of Internet-based services is increasing the amount of information (such as user profiles) that clients are required to disclose. 
This information demand is necessary for regulating access to services, and functionally convenient (e.g., to support service customization), but it has raised privacy-related concerns which, if not addressed, may affect the users disposition to use network services. At the same time, servers need to regulate service access without disclosing entirely the details of their access control policy. There is therefore a pressing need for privacy-aware techniques to regulate access to services open to the network. We propose an approach for regulating service access and information disclosure on the Web. The approach consists of a uniform formal framework to formulate - and reason about - both service access and information disclosure constraints. It also provides a means for parties to communicate their requirements while ensuring that no private information be disclosed and that the communicated requirements are correct with respect to the constraints", "keyphrases": ["service access regulation", "information release", "WWW", "Internet", "user profiles", "information demand", "client server systems", "access control policy", "privacy-aware techniques", "network services", "information disclosure", "uniform formal framework", "reasoning"], "prmu": ["R", "P", "U", "U", "P", "P", "M", "P", "P", "P", "P", "P", "P"]} +{"id": "1925", "title": "On the accuracy of polynomial interpolation in Hilbert space with disturbed nodal values of the operator", "abstract": "The interpolation accuracy of polynomial operators in a Hilbert space with a measure is estimated when nodal values of these operators are given approximately", "keyphrases": ["polynomial interpolation", "Hilbert space", "disturbed nodal values", "polynomial operators"], "prmu": ["P", "P", "P", "P"]} +{"id": "1754", "title": "Coordination [crisis management]", "abstract": "Communications during a crisis, both internal and external, set the tone during response and carry a message through recovery. 
The authors describe how to set up a system for information coordination to make sure the right people get the right message, and the organization stays in control", "keyphrases": ["crisis management", "communications process", "information coordination"], "prmu": ["P", "M", "P"]} +{"id": "1711", "title": "Developing a CD-ROM as a teaching and learning tool in food and beverage management: a case study in hospitality education", "abstract": "Food and beverage management is the traditional core of hospitality education but, in its laboratory manifestation, has come under increasing pressure in recent years. It is an area that, arguably, presents the greatest challenges in adaptation to contemporary learning technologies but, at the same time, stands to benefit most from the potential of the Web. This paper addresses the design and development of a CD-ROM learning resource for food and beverage. It is a learning resource which is designed to integrate with rather than to replace existing conventional classroom and laboratory learning methods and, thus, compensate for the decline in the resource base faced in food and beverage education in recent years. The paper includes illustrative material drawn from the CD-ROM which demonstrates its use in teaching and learning", "keyphrases": ["food and beverage management", "hospitality education", "CD-ROM", "learning tool", "teaching tool"], "prmu": ["P", "P", "P", "P", "R"]} +{"id": "1882", "title": "Bandwidth vs. gains design of H/sub infinity / tracking controllers for current-fed induction motors", "abstract": "Describes a systematic procedure for designing speed and rotor flux norm tracking H/sub infinity /. controllers with unknown load torque disturbances for current-fed induction motors. A new effective design tool is developed to allow selection of the control gains so as to adjust the disturbances' rejection capability of the controllers in the face of the bandwidth requirements of the closed-loop system. 
Application of the proposed design procedure is demonstrated in a case study, and the results of numerical simulations illustrate the satisfactory performance achievable even in presence of rotor resistance uncertainty", "keyphrases": ["H/sub infinity / tracking controllers", "current-fed induction motors", "speed controllers", "rotor flux norm controllers", "unknown load torque disturbances", "design tool", "disturbances rejection capability", "bandwidth requirements", "closed-loop system", "feedback linearization", "observers"], "prmu": ["P", "P", "R", "R", "P", "P", "R", "P", "P", "U", "U"]} +{"id": "1548", "title": "A second order characteristic finite element scheme for convection-diffusion problems", "abstract": "A new characteristic finite element scheme is presented for convection-diffusion problems. It is of second order accuracy in time increment, symmetric, and unconditionally stable. Optimal error estimates are proved in the framework of L/sup 2/-theory. Numerical results are presented for two examples, which show the advantage of the scheme", "keyphrases": ["second order characteristic finite element scheme", "convection-diffusion problems", "second order accuracy", "optimal error estimates", "L/sup 2/ -theory"], "prmu": ["P", "P", "P", "P", "M"]} +{"id": "158", "title": "Neural and neuro-fuzzy integration in a knowledge-based system for air quality prediction", "abstract": "We propose a unified approach for integrating implicit and explicit knowledge in neurosymbolic systems as a combination of neural and neuro-fuzzy modules. In the developed hybrid system, a training data set is used for building neuro-fuzzy modules, and represents implicit domain knowledge. The explicit domain knowledge on the other hand is represented by fuzzy rules, which are directly mapped into equivalent neural structures. 
The aim of this approach is to improve the abilities of modular neural structures, which are based on incomplete learning data sets, since the knowledge acquired from human experts is taken into account for adapting the general neural architecture. Three methods to combine the explicit and implicit knowledge modules are proposed. The techniques used to extract fuzzy rules from neural implicit knowledge modules are described. These techniques improve the structure and the behavior of the entire system. The proposed methodology has been applied in the field of air quality prediction with very encouraging results. These experiments show that the method is worth further investigation", "keyphrases": ["neuro-fuzzy integration", "knowledge-based system", "air quality prediction", "neurosymbolic systems", "hybrid system", "training data set", "implicit domain knowledge representation", "fuzzy rules", "incomplete learning", "neural architecture", "experiments", "air pollution"], "prmu": ["P", "P", "P", "P", "P", "P", "M", "P", "P", "P", "P", "M"]} +{"id": "1649", "title": "Office essentials [stationery suppliers]", "abstract": "Make purchasing stationery a relatively simple task through effective planning and management of stock, and identifying the right supplier", "keyphrases": ["stationery suppliers", "purchasing", "planning", "management of stock"], "prmu": ["P", "P", "P", "P"]} +{"id": "1927", "title": "Optimal strategies for a semi-Markovian inventory system", "abstract": "Control for a semi-Markovian inventory system is considered. Under general assumptions on system functioning, conditions for existence of an optimal nonrandomized Markovian strategy are found. 
It is shown that under some additional assumptions on storing conditions for the inventory, the optimal strategy has a threshold (s, S)-frame", "keyphrases": ["optimal strategies", "semi-Markovian inventory system", "system functioning", "optimal nonrandomized Markovian strategy", "optimal strategy"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1631", "title": "Recovering lost efficiency of exponentiation algorithms on smart cards", "abstract": "At the RSA cryptosystem implementation stage, a major security concern is resistance against so-called side-channel attacks. Solutions are known but they increase the overall complexity by a non-negligible factor (typically, a protected RSA exponentiation is 133% slower). For the first time, protected solutions are proposed that do not penalise the running time of an exponentiation", "keyphrases": ["smart cards", "exponentiation algorithms", "RSA cryptosystem implementation stage", "security", "side-channel attack resistance", "public-key encryption"], "prmu": ["P", "P", "P", "P", "R", "U"]} +{"id": "1674", "title": "A column generation approach to delivery planning over time with inhomogeneous service providers and service interval constraints", "abstract": "We consider a problem of delivery planning over multiple time periods. Deliveries must be made to customers having nominated demand in each time period. Demand must be met in each time period by use of some combination of inhomogeneous service providers. Each service provider has a different delivery capacity, different cost of delivery to each customer, a different utilisation requirement, and different rules governing the spread of deliveries in time. The problem is to plan deliveries so as to minimise overall costs, subject to demand being met and service rules obeyed. 
A natural integer programming model was found to be intractable, except on problems with loose demand constraints, with gaps between best lower bound and best feasible solution of up to 35.1%, with an average of 15.4% over the test data set. In all but the problem with loosest demand constraints, Cplex 6.5 applied to this formulation failed to find the optimal solution before running out of memory. However a column generation approach improved the lower bound by between 0.6% and 21.9%, with an average of 9.9%, and in all cases found the optimal solution at the root node, without requiring branching", "keyphrases": ["column generation approach", "delivery planning over time", "inhomogeneous service providers", "service interval constraints", "delivery capacity", "lower bound", "transportation"], "prmu": ["P", "P", "P", "P", "P", "P", "U"]} +{"id": "1588", "title": "Contentment management", "abstract": "Andersen's William Yarker and Richard Young outline the route to a successful content management strategy", "keyphrases": ["Andersen Consulting", "content management strategy"], "prmu": ["M", "P"]} +{"id": "1530", "title": "Uniform supersaturated design and its construction", "abstract": "Supersaturated designs are factorial designs in which the number of main effects is greater than the number of experimental runs. In this paper, a discrete discrepancy is proposed as a measure of uniformity for supersaturated designs, and a lower bound of this discrepancy is obtained as a benchmark of design uniformity. A construction method for uniform supersaturated designs via resolvable balanced incomplete block designs is also presented along with the investigation of properties of the resulting designs. 
The construction method shows a strong link between these two different kinds of designs", "keyphrases": ["uniform supersaturated design", "factorial designs", "experimental runs", "discrete discrepancy", "resolvable balanced incomplete block designs"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "165", "title": "Monitoring the news online", "abstract": "The author looks at how we can focus on what we want, finding small stories in vast oceans of news. There is no one tool that will scan every news resource available and give alerts on new available materials. Every one has a slightly different focus. Some are paid sources, while many are free. If used wisely, an excellent news monitoring system for a large number of topics can be set up for surprisingly little cost", "keyphrases": ["news monitoring", "online news", "Internet"], "prmu": ["P", "R", "U"]} +{"id": "1468", "title": "Developing Web-enhanced learning for information fluency-a liberal arts college's perspective", "abstract": "Learning is likely to take a new form in the twenty-first century, and a transformation is already in process. Under the framework of information fluency, efforts are being made at Rollins College to develop a Web-enhanced course that encompasses information literacy, basic computer literacy, and critical thinking skills. Computer-based education can be successful when librarians use technology effectively to enhance their integrated library teaching. In an online learning environment, students choose a time for learning that best suits their needs and motivational levels. They can learn at their own pace, take a nonlinear approach to the subject, and maintain constant communication with instructors and other students. 
The quality of a technology-facilitated course can be upheld if the educational objectives and methods for achieving those objectives are carefully planned and explored", "keyphrases": ["Web-enhanced learning", "information fluency", "liberal arts college", "information literacy", "computer literacy", "critical thinking skills", "computer-based education", "librarians", "integrated library teaching", "online learning"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1794", "title": "Well-posed anisotropic diffusion for image denoising", "abstract": "A nonlinear iterative smoothing filter based on a second-order partial differential equation is introduced. It smooths out the image according to an anisotropic diffusion process. The approach is based on a smooth approximation of the total variation (TV) functional which overcomes the non-differentiability of the TV functional at the origin. In particular, the authors perform linear smoothing over smooth areas but selective smoothing over candidate edges. By relating the smoothing parameter to the time step, they arrive at a CFL condition which guarantees the causality of the discrete scheme. This allows the adoption of higher time discretisation steps, while ensuring the absence of artefacts deriving from the non-smooth behaviour of the TV functional at the origin. 
In particular, it is shown that the proposed approach avoids the typical staircase effects in smooth areas which occur in the standard time-marching TV scheme", "keyphrases": ["image denoising", "well-posed anisotropic diffusion", "nonlinear iterative smoothing filter", "second-order partial differential equation", "total variation functional", "linear smoothing", "selective smoothing", "CFL condition", "discrete scheme", "causality", "higher time discretisation steps", "image restoration problem", "random Gaussian noise"], "prmu": ["P", "P", "P", "P", "R", "P", "P", "P", "P", "P", "P", "M", "U"]} +{"id": "1769", "title": "Transformation rules and strategies for functional-logic programs", "abstract": "This paper abstracts the contents of a PhD dissertation entitled 'Transformation Rules and Strategies for Functional-Logic Programs' which has been recently defended. These techniques are based on fold/unfold transformations and they can be used to optimize integrated (functional-logic) programs for a wide class of applications. Experimental results show that typical examples in the field of artificial intelligence are successfully enhanced by our transformation system SYNTH. The thesis presents the first approach of these methods for declarative languages that integrate the best features from functional and logic programming", "keyphrases": ["program transformation rules", "functional-logic programs", "logic programming", "functional programming", "fold-unfold transformations", "experimental results", "artificial intelligence", "SYNTH", "declarative languages"], "prmu": ["R", "P", "P", "R", "M", "P", "P", "P", "P"]} +{"id": "1807", "title": "Regional flux target with minimum energy", "abstract": "An extension of a gradient controllability problem to the case where the target subregion is a part of the boundary of a parabolic system domain is discussed. A definition and some properties adapted to this case are presented. 
The focus is on the characterisation of the control achieving a regional boundary gradient target with minimum energy. An approach is developed that leads to a numerical algorithm for the computation of optimal control. Numerical illustrations show the efficiency of the approach and lead to conjectures", "keyphrases": ["regional flux target", "minimum energy", "gradient controllability problem", "target subregion", "parabolic system domain boundary", "regional boundary gradient target", "numerical algorithm", "optimal control"], "prmu": ["P", "P", "P", "P", "R", "P", "P", "P"]} +{"id": "1495", "title": "Laptops zip to 2 GHz-plus", "abstract": "Intel's Pentium 4-M processor has reached the coveted 2-GHz mark, and speed-hungry mobile users will be tempted to buy a laptop with the chip. However, while our exclusive tests found 2-GHz P4-M notebooks among the fastest units we've tested, the new models failed to make dramatic gains compared with those based on Intel's 1.8-GHz mobile chip. Since 2-GHz notebooks carry a hefty price premium, buyers seeking both good performance and a good price might prefer a 1.8-GHz unit instead", "keyphrases": ["Intel Pentium 4-M processor", "mobile", "laptop", "notebooks", "2 GHz"], "prmu": ["R", "P", "P", "P", "M"]} +{"id": "1842", "title": "The role of B2B engines in B2B integration architectures", "abstract": "Semantic B2B integration architectures must enable enterprises to communicate standards-based B2B events like purchase orders with any potential trading partner. This requires not only back end application integration capabilities to integrate with e.g. enterprise resource planning (ERP) systems as the company-internal source and destination of B2B events, but also a capability to implement every necessary B2B protocol like electronic data interchange (EDI), RosettaNet as well as more generic capabilities like Web services (WS). 
This paper shows the placement and functionality of B2B engines in semantic B2B integration architectures that implement a generic framework for modeling and executing any B2B protocol. A detailed discussion shows how a B2B engine can provide the necessary abstractions to implement any standard-based B2B protocol or any trading partner specific specialization", "keyphrases": ["B2B engines", "semantic B2B integration architectures", "standards-based B2B event communication", "purchase orders", "trading partner", "ERP systems", "EDI", "RosettaNet", "Web services", "modeling"], "prmu": ["P", "P", "R", "P", "P", "R", "P", "P", "P", "P"]} +{"id": "1514", "title": "Universal parametrization in constructing smoothly-connected B-spline surfaces", "abstract": "In this paper, we explore the feasibility of universal parametrization in generating B-spline surfaces, which was proposed recently in the literature (Lim, 1999). We present an interesting property of the new parametrization that it guarantees G/sup 0/ continuity on B-spline surfaces when several independently constructed patches are put together without imposing any constraints. Also, a simple blending method of patchwork is proposed to construct C/sup n-1/ surfaces, where overlapping control nets are utilized. It takes into account the semi-localness property of universal parametrization. It effectively helps us construct very natural looking B-spline surfaces while keeping the deviation from given data points very low. 
Experimental results are shown with several sets of surface data points", "keyphrases": ["universal parametrization", "smoothly-connected B-spline surface generation", "G/sup 0/ continuity", "patches", "patchwork blending method", "C/sup n-1/ surfaces", "overlapping control nets", "semi-localness property", "surface data points"], "prmu": ["P", "R", "M", "P", "R", "P", "P", "P", "P"]} +{"id": "1551", "title": "The numerical solution of an evolution problem of second order in time on a closed smooth boundary", "abstract": "We consider an initial value problem for the second-order differential equation with a Dirichlet-to-Neumann operator coefficient. For the numerical solution we carry out semi-discretization by the Laguerre transformation with respect to the time variable. Then an infinite system of the stationary operator equations is obtained. By potential theory, the operator equations are reduced to boundary integral equations of the second kind with logarithmic or hypersingular kernels. The full discretization is realized by Nystrom's method which is based on the trigonometric quadrature rules. Numerical tests confirm the ability of the method to solve these types of nonstationary problems", "keyphrases": ["initial value problem", "second-order differential equation", "evolution problem", "closed smooth boundary", "Laguerre transformation", "hypersingular kernels", "boundary integral equations", "stationary operator equations"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1615", "title": "Laguerre approximation of fractional systems", "abstract": "Systems characterised by fractional power poles can be called fractional systems. Here, Laguerre orthogonal polynomials are employed to approximate fractional systems by minimum phase, reduced order, rational transfer functions. 
Both the time and the frequency-domain analysis exhibit the accuracy of the approximation", "keyphrases": ["Laguerre approximation", "fractional systems", "fractional power poles", "orthogonal polynomials", "minimum phase", "reduced order", "robust controllers", "closed-loop system", "rational transfer functions", "frequency-domain analysis", "time-domain analysis"], "prmu": ["P", "P", "P", "P", "P", "P", "U", "M", "P", "P", "M"]} +{"id": "1650", "title": "Low to mid-speed copiers [buyer's guide]", "abstract": "The low to mid-speed copier market is being transformed by the almost universal adoption of digital solutions. The days of the analogue copier are numbered as the remaining vendors plan to withdraw from this sector by 2005. Reflecting the growing market for digital, vendors are reducing prices, making a digital solution much more affordable. The battle for the copier market is intense, and the popularity of the multifunctional device is going to transform the office equipment market. As total cost of ownership becomes increasingly important and as budgets are squeezed, the most cost-effective solutions are those that will survive this shake-down", "keyphrases": ["low to mid-speed copier market", "total cost of ownership"], "prmu": ["P", "P"]} +{"id": "1823", "title": "Single-phase shunt active power filter with harmonic detection", "abstract": "An advanced active power filter for the compensation of instantaneous harmonic current components in nonlinear current loads is presented. A signal processing technique using an adaptive neural network algorithm is applied for the detection of harmonic components generated by nonlinear current loads and it can efficiently determine the instantaneous harmonic components in real time. 
The validity of this active filtering processing system to compensate current harmonics is substantiated by simulation results", "keyphrases": ["single-phase shunt active power filter", "harmonic detection", "instantaneous harmonic current components compensation", "nonlinear current loads", "signal processing technique", "adaptive neural network algorithm", "instantaneous harmonic components", "simulation"], "prmu": ["P", "P", "R", "P", "P", "P", "P", "P"]} +{"id": "1866", "title": "Tracking with sensor failures", "abstract": "Studies the reliability with sensor failures of the asymptotic tracking problem for linear time invariant systems using the factorization approach. The plant is two-output and the compensator is two-degree-of-freedom. Necessary and sufficient conditions are presented for the general problem and a simple solution is given for problems with stable plants", "keyphrases": ["sensor failures", "reliability", "asymptotic tracking problem", "linear time invariant systems", "factorization approach", "two-output plant", "two-degree-of-freedom compensator", "necessary and sufficient conditions"], "prmu": ["P", "P", "P", "P", "P", "R", "R", "P"]} +{"id": "1708", "title": "A study of hospitality and tourism information technology education and industrial applications", "abstract": "The purpose of this study was to examine the subject relevance of information technology (IT) in hospitality and tourism management programs with skills deployed in the workplace. This study aimed at investigating graduates' transition from education to employment, and to determine how well they appear to be equipped to meet the needs of the hospitality and tourism industry. One hundred and seventeen graduates responded to a mail survey. These graduates rated the importance of IT skills in the workplace, the level of IT teaching in hotel and tourism management programs, and the self-competence level in IT. 
This study concluded that a gap exists between the IT skills required at work and those acquired at university", "keyphrases": ["hospitality and tourism management programs", "education", "employment", "hospitality industry", "tourism industry", "mail survey", "graduates", "IT skills", "university", "IT teaching"], "prmu": ["P", "P", "P", "R", "P", "P", "P", "P", "P", "P"]} +{"id": "1471", "title": "E-commerce-resources for doing business on the Internet", "abstract": "There are many different types of e-commerce depending upon who or what is selling and who or what is buying. In addition, e-commerce is more than an exchange of funds and goods or services, it encompasses an entire infrastructure of services, computer hardware and software products, technologies, and communications formats. The paper discusses e-commerce terminology, types and information resources, including books and Web sites", "keyphrases": ["business", "Internet", "e-commerce", "terminology", "information resources", "books", "Web sites"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} +{"id": "1770", "title": "New developments in inductive learning", "abstract": "Any intelligent system, whether natural or artificial, must have three characteristics: knowledge, reasoning, and learning. Artificial intelligence (AI) studies these three aspects in artificial systems. Briefly, we could say that knowledge refers to the system's world model, and reasoning to the manipulation of this knowledge. Learning is slightly more complex; the system interacts with the world and as a consequence it builds onto and modifies its knowledge. This process of self-building and self-modifying is known as learning. This thesis is set within the field of artificial intelligence and focuses on learning. 
More specifically, it deals with the inductive learning of decision trees", "keyphrases": ["inductive learning", "new developments", "intelligent system", "knowledge", "reasoning", "artificial intelligence", "decision trees"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} +{"id": "1735", "title": "Mid-market accounting systems", "abstract": "Welcome to our fourth annual survey of accounting systems and enterprise resource planning (ERP) systems. Last September, we concentrated on financial and distribution systems for medium-sized businesses (mid market) and included 22 products in our charts. This year, we extended the products to include manufacturing and added 34 products to the list", "keyphrases": ["mid-market accounting systems", "survey", "enterprise resource planning", "manufacturing"], "prmu": ["P", "P", "P", "P"]} +{"id": "181", "title": "Electromagnetics computations using the MPI parallel implementation of the steepest descent fast multipole method (SDFMM)", "abstract": "The computational solution of large-scale linear systems of equations necessitates the use of fast algorithms but is also greatly enhanced by employing parallelization techniques. The objective of this work is to demonstrate the speedup achieved by the MPI (message passing interface) parallel implementation of the steepest descent fast multipole method (SDFMM). Although this algorithm has already been optimized to take advantage of the structure of the physics of scattering problems, there is still the opportunity to speed up the calculation by dividing tasks into components using multiple processors and solve them in parallel. The SDFMM has three bottlenecks ordered as (1) filling the sparse impedance matrix associated with the near-field method of moments interactions (MoM), (2) the matrix vector multiplications associated with this sparse matrix and (3) the far field interactions associated with the fast multipole method. 
The parallel implementation task is accomplished using a thirty-one node Intel Pentium Beowulf cluster and is also validated on a 4-processor Alpha workstation. The Beowulf cluster consists of thirty-one nodes of 350 MHz Intel Pentium IIs with 256 MB of RAM and one node of a 4*450 MHz Intel Pentium II Xeon shared memory processor with 2 GB of RAM with all nodes connected to a 100 BaseTX Ethernet network. The Alpha workstation has a maximum of four 667 MHz processors. Our numerical results show significant linear speedup in filling the sparse impedance matrix. Using the 32-processors on the Beowulf cluster lead to a 7.2 overall speedup while a 2.5 overall speedup is gained using the 4-processors on the Alpha workstation", "keyphrases": ["electromagnetics computations", "MPI parallel implementation", "steepest descent fast multipole method", "large-scale linear systems", "fast algorithms", "message passing interface", "physics", "multiple processors", "sparse impedance matrix", "near-field MoM", "method of moments", "scattering problems", "matrix vector multiplications", "Intel Pentium Beowulf cluster", "4-processor Alpha workstation", "Intel Pentium II", "RAM", "Xeon shared memory processor", "100 BaseTX Ethernet network", "scattered electric field", "scattered magnetic field", "350 MHz", "256 MByte", "450 MHz", "2 GByte", "667 MHz"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "M", "P", "M", "P", "M", "P"]} +{"id": "1903", "title": "The BLISS programming language: a history", "abstract": "The BLISS programming language was invented by William A. Wulf and others at Carnegie-Mellon University in 1969, originally for the DEC PDP-10. BLISS-10 caught the interest of Ronald F. Brender of DEC (Digital Equipment Corporation). 
After several years of collaboration, including the creation of BLISS-11 for the PDP-11, BLISS was adopted as DEC's implementation language for use on its new line of VAX computers in 1975. DEC developed a completely new generation of BLISSs for the VAX, PDP-10 and PDP-11, which became widely used at DEC during the 1970s and 1980s. With the creation of the Alpha architecture in the early 1990s, BLISS was extended again, in both 32- and 64-bit flavors. BLISS support for the Intel IA-32 architecture was introduced in 1995 and IA-64 support is now in progress. BLISS has a number of unusual characteristics: it is typeless, requires use of an explicit contents of operator (written as a period or 'dot'), takes an algorithmic approach to data structure definition, has no goto, is an expression language, and has an unusually rich compile-time language. This paper reviews the evolution and use of BLISS over its three decade lifetime. Emphasis is on how the language evolved to facilitate portable programming while retaining its initial highly machine-specific character. Finally, the success of its characteristics are assessed", "keyphrases": ["BLISS programming language", "machine-oriented language", "portable programming", "system implementation language", "data structure definition", "compile-time language"], "prmu": ["P", "M", "P", "M", "P", "P"]} +{"id": "1591", "title": "Quadratic interpolation on spheres", "abstract": "Riemannian quadratics are C/sup 1/ curves on Riemannian manifolds, obtained by performing the quadratic recursive deCastlejeau algorithm in a Riemannian setting. They are of interest for interpolation problems in Riemannian manifolds, such as trajectory-planning for rigid body motion. 
Some interpolation properties of Riemannian quadratics are analysed when the ambient manifold is a sphere or projective space, with the usual Riemannian metrics", "keyphrases": ["quadratic interpolation", "Riemannian manifolds", "trajectory-planning", "rigid body motion", "ambient manifold", "corner-cutting", "parallel translation", "approximation theory"], "prmu": ["P", "P", "P", "P", "P", "U", "U", "U"]} +{"id": "1628", "title": "Quasi-Newton algorithm for adaptive minor component extraction", "abstract": "An adaptive quasi-Newton algorithm is first developed to extract a single minor component corresponding to the smallest eigenvalue of a stationary sample covariance matrix. A deflation technique instead of the commonly used inflation method is then applied to extract the higher-order minor components. The algorithm enjoys the advantage of having a simpler computational complexity and a highly modular and parallel structure for efficient implementation. Simulation results are given to demonstrate the effectiveness of the proposed algorithm for extracting multiple minor components adaptively", "keyphrases": ["quasi-Newton algorithm", "adaptive minor component extraction", "eigenvalue", "stationary sample covariance matrix", "deflation technique", "higher-order minor components", "computational complexity", "modular structure", "parallel structure", "simulation results", "adaptive estimation", "DOA estimation", "ROOT-MUSIC estimator"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "M", "U", "U"]} +{"id": "1529", "title": "Quantized-State Systems: A DEVS-approach for continuous system simulation", "abstract": "A new class of dynamical systems, Quantized State Systems or QSS, is introduced in this paper. 
QSS are continuous time systems where the input trajectories are piecewise constant functions and the state variable trajectories - being themselves piecewise linear functions - are converted into piecewise constant functions via a quantization function equipped with hysteresis. It is shown that QSS can be exactly represented and simulated by a discrete event model, within the framework of the DEVS-approach. Further, it is shown that QSS can be used to approximate continuous systems, thus allowing their discrete-event simulation in opposition to the classical discrete-time simulation. It is also shown that in an approximating QSS, some stability properties of the original system are conserved and the solutions of the QSS go to the solutions of the original system when the quantization goes to zero", "keyphrases": ["dynamical systems", "Quantized State Systems", "continuous time systems", "piecewise constant functions", "discrete event model", "discrete-event simulation"], "prmu": ["P", "P", "P", "P", "P", "P"]} +{"id": "1827", "title": "Gossip is synteny: Incomplete gossip and the syntenic distance between genomes", "abstract": "The syntenic distance between two genomes is given by the minimum number of fusions, fissions, and translocations required to transform one into the other, ignoring the order of genes within chromosomes. Computing this distance is NP-hard. In the present work, we give a tight connection between syntenic distance and the incomplete gossip problem, a novel generalization of the classical gossip problem. In this problem, there are n gossipers, each with a unique piece of initial information; they communicate by phone calls in which the two participants exchange all their information. The goal is to minimize the total number of phone calls necessary to inform each gossiper of his set of relevant gossip which he desires to learn. 
As an application of the connection between syntenic distance and incomplete gossip, we derive an O(2/sup O(n log n)/) algorithm to exactly compute the syntenic distance between two genomes with at most n chromosomes each. Our algorithm requires O(n/sup 2/+2/sup O(d log d)/) time when this distance is d, improving the O(n/sup 2/+2/sup O(d/sup 2/)/) running time of the best previous exact algorithm", "keyphrases": ["syntenic distance", "genomes", "NP-hard", "incomplete gossip problem", "comparative genomics", "running time", "chromosomes"], "prmu": ["P", "P", "P", "P", "M", "P", "P"]} +{"id": "1862", "title": "Global comparison of stages of growth based on critical success factors", "abstract": "With increasing globalization of business, the management of IT in international organizations is faced with the complex task of dealing with the difference between local and international IT needs. This study evaluates, and compares, the level of IT maturity and the critical success factors (CSFs) in selected geographic regions, namely, Norway, Australia/New Zealand, North America, Europe, Asia/Pacific, and India. 
The results show that significant differences in the IT management needs in these geographic regions exist, and that the IT management operating in these regions must balance the multiple critical success factors for achieving an optimal local-global mix for business success", "keyphrases": ["business globalization", "IT management", "international IT needs", "local IT needs", "IT maturity", "critical success factors", "Norway", "Australia", "New Zealand", "North America", "Europe", "Asia/Pacific", "India", "optimal local-global mix", "business success"], "prmu": ["R", "P", "P", "R", "P", "P", "P", "U", "M", "P", "P", "P", "P", "P", "P"]} +{"id": "1749", "title": "Advanced aerostatic stability analysis of cable-stayed bridges using finite-element method", "abstract": "Based on the concept of limit point instability, an advanced nonlinear finite-element method that can be used to analyze the aerostatic stability of cable-stayed bridges is proposed. Both geometric nonlinearity and three components of wind loads are considered in this method. The example bridge is the second Santou Bay cable-stayed bridge with a main span length of 518 m built in China. Aerostatic stability of the example bridge is investigated using linear and proposed methods. The effect of pitch moment coefficient on the aerostatic stability of the bridge has been studied. The results show that the aerostatic instability analyses of cable-stayed bridges based on the linear method considerably overestimate the wind-resisting capacity of cable-stayed bridges. The proposed method is highly accurate and efficient. Pitch moment coefficient has a major effect on the aerostatic stability of cable-stayed bridges. 
Finally, the aerostatic failure mechanism of cable-stayed bridges is explained by tracing the aerostatic instability path", "keyphrases": ["limit point instability", "advanced nonlinear finite element method", "advanced aerostatic stability analysis", "cable-stayed bridges", "geometric nonlinearity", "wind loads", "Santou Bay cable-stayed bridge", "China", "pitch moment coefficient", "aerostatic failure mechanism"], "prmu": ["P", "M", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1611", "title": "Data mining business intelligence for competitive advantage", "abstract": "Organizations have lately realized that just processing transactions and/or information faster and more efficiently no longer provides them with a competitive advantage vis-a-vis their competitors for achieving business excellence. Information technology (IT) tools that are oriented towards knowledge processing can provide the edge that organizations need to survive and thrive in the current era of fierce competition. Enterprises are no longer satisfied with business information system(s); they require business intelligence system(s). The increasing competitive pressures and the desire to leverage information technology techniques have led many organizations to explore the benefits of new emerging technology, data warehousing and data mining. The paper discusses data warehouses and data mining tools and applications", "keyphrases": ["business intelligence", "competitive advantage", "organizations", "information technology", "knowledge processing", "business information system", "data warehouses", "data mining"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1654", "title": "Numerical validation of solutions of complementarity problems: the nonlinear case", "abstract": "This paper proposes a validation method for solutions of nonlinear complementarity problems. The validation procedure performs a computational test. 
If the result of the test is positive, then it is guaranteed that a given multi-dimensional interval either includes a solution or excludes all solutions of the nonlinear complementarity problem", "keyphrases": ["numerical validation", "computational test", "nonlinear complementarity problem", "optimization"], "prmu": ["P", "P", "P", "U"]} +{"id": "17", "title": "Fault diagnosis and fault tolerant control of linear stochastic systems with unknown inputs", "abstract": "This paper presents an integrated robust fault detection and isolation (FDI) and fault tolerant control (FTC) scheme for a fault in actuators or sensors of linear stochastic systems subjected to unknown inputs (disturbances). As usual in this kind of works, it is assumed that single fault occurs at a time and the fault treated is of random bias type. The FDI module is constructed using banks of robust two-stage Kalman filters, which simultaneously estimate the state and the fault bias, and generate residual sets decoupled from unknown disturbances. All elements of residual sets are evaluated by using a hypothesis statistical test, and the fault is declared according to the prepared decision logic. The FTC module is activated based on the fault indicator, and additive compensation signal is computed using the fault bias estimate and combined to the nominal control law for compensating the fault's effect on the system. 
Simulation results for the simplified longitudinal flight control system with parameter variations, process and measurement noises demonstrate the effectiveness of the approach proposed", "keyphrases": ["fault detection", "fault isolation", "fault tolerant control", "linear systems", "stochastic systems", "two-stage Kalman filters", "state estimation", "longitudinal flight control system", "robust control", "discrete-time system"], "prmu": ["P", "R", "P", "R", "P", "P", "R", "P", "R", "M"]} +{"id": "1510", "title": "Estimation of the gradient of the solution of an adjoint diffusion equation by the Monte Carlo method", "abstract": "For the case of isotropic diffusion we consider the representation of the weighted concentration of trajectories and its space derivatives in the form of integrals (with some weights) of the solution to the corresponding boundary value problem and its directional derivative of a convective velocity. If the convective velocity at the domain boundary is degenerate and some other additional conditions are imposed this representation allows us to construct an efficient 'random walk by spheres and balls' algorithm. When these conditions are violated, transition to modelling the diffusion trajectories by the Euler scheme is realized, and the directional derivative of velocity is estimated by the dependent testing method, using the parallel modelling of two closely-spaced diffusion trajectories. We succeeded in justifying this method by statistically equivalent transition to modelling a single trajectory after the first step in the Euler scheme, using a suitable weight. This weight also admits direct differentiation with respect to the initial coordinate along a given direction. 
The resulting weight algorithm for calculating concentration derivatives is especially efficient if the initial point is in the subdomain in which the coefficients of the diffusion equation are constant", "keyphrases": ["isotropic diffusion", "weighted trajectory concentration", "space derivatives", "integrals", "boundary value problem", "directional derivative", "convective velocity", "domain boundary", "gradient estimation", "adjoint diffusion equation", "Monte Carlo method", "random walk by spheres and balls algorithm", "diffusion trajectories", "Euler scheme", "dependent testing method", "parallel modelling", "closely-spaced diffusion trajectories", "statistically equivalent transition", "weight", "direct differentiation", "initial coordinate", "concentration derivatives"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "R", "P", "P", "M", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1555", "title": "A note on multi-index polynomials of Dickson type and their applications in quantum optics", "abstract": "We discuss the properties of a new family of multi-index Lucas type polynomials, which are often encountered in problems of intracavity photon statistics. We develop an approach based on the integral representation method and show that this class of polynomials can be derived from recently introduced multi-index Hermite like polynomials", "keyphrases": ["Lucas type polynomials", "multi-index polynomials", "quantum optics", "intracavity photon statistics", "integral representation", "generating functions"], "prmu": ["P", "P", "P", "P", "P", "U"]} +{"id": "1694", "title": "Product development: using a 3D computer model to optimize the stability of the Rocket TM powered wheelchair", "abstract": "A three-dimensional (3D) lumped-parameter model of a powered wheelchair was created to aid the development of the Rocket prototype wheelchair and to help explore the effect of innovative design features on its stability. 
The model was developed using simulation software, specifically Working Model 3D. The accuracy of the model was determined by comparing both its static stability angles and dynamic behavior as it passed down a 4.8-cm (1.9\") road curb at a heading of 45 degrees with the performance of the actual wheelchair. The model's predictions of the static stability angles in the forward, rearward, and lateral directions were within 9.3, 7.1, and 3.8% of the measured values, respectively. The average absolute error in the predicted position of the wheelchair as it moved down the curb was 2.2 cm/m (0.9\" per 3'3\") traveled. The accuracy was limited by the inability to model soft bodies, the inherent difficulties in modeling a statically indeterminate system, and the computing time. Nevertheless, it was found to be useful in investigating the effect of eight design alterations on the lateral stability of the wheelchair. Stability was quantified by determining the static lateral stability angles and the maximum height of a road curb over which the wheelchair could successfully drive on a diagonal heading. The model predicted that the stability was more dependent on the configuration of the suspension system than on the dimensions and weight distribution of the wheelchair. 
Furthermore, for the situations and design alterations studied, predicted improvements in static stability were not correlated with improvements in dynamic stability", "keyphrases": ["3D computer model", "product development", "innovative design features", "suspension system configuration", "dynamic stability improvements", "average absolute error", "predicted position", "soft bodies modeling", "statically indeterminate system", "computing time", "design alterations effect", "diagonal heading", "weight distribution", "Rocket TM powered wheelchair", "4.8 cm"], "prmu": ["P", "P", "P", "R", "R", "P", "P", "R", "P", "P", "R", "P", "P", "P", "U"]} +{"id": "1568", "title": "Natural language from artificial life", "abstract": "This article aims to show that linguistics, in particular the study of the lexico-syntactic aspects of language, provides fertile ground for artificial life modeling. A survey of the models that have been developed over the last decade and a half is presented to demonstrate that ALife techniques have a lot to offer an explanatory theory of language. It is argued that this is because much of the structure of language is determined by the interaction of three complex adaptive systems: learning, culture, and biological evolution. Computational simulation, informed by theoretical linguistics, is an appropriate response to the challenge of explaining real linguistic data in terms of the processes that underpin human language", "keyphrases": ["natural language", "linguistics", "lexico-syntactic aspects", "ALife", "adaptive systems", "learning", "culture", "biological evolution", "computational simulation", "artificial life"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "178", "title": "A parallelized indexing method for large-scale case-based reasoning", "abstract": "Case-based reasoning (CBR) is a problem solving methodology commonly seen in artificial intelligence. 
It can correctly take advantage of the situations and methods in former cases to find out suitable solutions for new problems. CBR must accurately retrieve similar prior cases for getting a good performance. In the past, many researchers proposed useful technologies to handle this problem. However, the performance of retrieving similar cases may be greatly influenced by the number of cases. In this paper, the performance issue of large-scale CBR is discussed and a parallelized indexing architecture is then proposed for efficiently retrieving similar cases in large-scale CBR. Several algorithms for implementing the proposed architecture are also described. Some experiments are made and the results show the efficiency of the proposed method", "keyphrases": ["parallelized indexing method", "large-scale case-based reasoning", "problem solving methodology", "artificial intelligence", "bitwise indexing", "similar prior case retrieval", "performance", "experiments"], "prmu": ["P", "P", "P", "P", "M", "R", "P", "P"]} +{"id": "185", "title": "Property testers for dense Constraint Satisfaction programs on finite domains", "abstract": "Many NP-hard languages can be \"decided\" in subexponential time if the definition of \"decide\" is relaxed only slightly. Rubinfeld and Sudan introduced the notion of property testers, probabilistic algorithms that can decide, with high probability, if a function has a certain property or if it is far from any function having this property. Goldreich, Goldwasser, and Ron constructed property testers with constant query complexity for dense instances of a large class of graph problems. Since many graph problems can be viewed as special cases of the Constraint Satisfaction Problem on Boolean domains, it is natural to try to construct property testers for more general cases of the Constraint Satisfaction Problem. 
In this paper, we give explicit constructions of property testers using a constant number of queries for dense instances of Constraint Satisfaction Problems where the constraints have constant arity and the variables assume values in some domain of finite size", "keyphrases": ["NP-hard languages", "property testers", "probabilistic algorithms", "constant query complexity", "constraint satisfaction", "dense instances", "randomized sampling", "subexponential time", "graph problems", "Constraint Satisfaction Problem"], "prmu": ["P", "P", "P", "P", "P", "P", "U", "P", "P", "P"]} +{"id": "1907", "title": "Multiple comparison methods for means", "abstract": "Multiple comparison methods (MCMs) are used to investigate differences between pairs of population means or, more generally, between subsets of population means using sample data. Although several such methods are commonly available in statistical software packages, users may be poorly informed about the appropriate method(s) to use and/or the correct way to interpret the results. This paper classifies the MCMs and presents the important methods for each class. Both simulated and real data are used to compare the methods, and emphasis is placed on a correct application and interpretation. We include suggestions for choosing the best method. Mathematica programs developed by the authors are used to compare MCMs. 
By taking advantage of Mathematica's notebook structure, all interested students can use these programs to explore the subject more deeply", "keyphrases": ["multiple comparison procedures", "population means", "error rate", "single-step procedures", "step-down procedures", "sales management", "pack-age design"], "prmu": ["M", "P", "U", "U", "U", "U", "U"]}
This model, known as the paper industry value optimisation tool (PIVOT), is a large mixed integer program that finds an optimal allocation of supplier to mill, product to paper machine, and paper machine to customer, while at the same time modelling many of the supply chain details and nuances which are peculiar to FCPA. PIVOT has assisted FCPA in solving a number of strategic and tactical decision problems, and provided significant economic benefits for the company", "keyphrases": ["supply chain optimisation", "Fletcher Challenge Paper Australasia", "paper industry value optimisation tool", "PIVOT", "large mixed integer program", "optimal allocation", "strategic decision problems", "tactical decision problems", "economic benefits"], "prmu": ["P", "P", "P", "P", "P", "P", "R", "P", "P"]} +{"id": "1488", "title": "Social presence in telemedicine", "abstract": "We studied consultations between a doctor, emergency nurse practitioners (ENPs) and their patients in a minor accident and treatment service (MATS). In the conventional consultations, all three people were located at the main hospital. In the teleconsultations, the doctor was located in a hospital 6 km away from the MATS and used a videoconferencing link connected at 384 kbit/s. There were 30 patients in the conventional group and 30 in the telemedical group. The presenting problems were similar in the two groups. The mean duration of teleconsultations was 951 s and the mean duration of face-to-face consultations was 247 s. In doctor-nurse communication there was a higher rate of turn taking in teleconsultations than in face-to-face consultations; there were also more interruptions, more words and more `backchannels' (e.g. `mhm', `uh-huh') per teleconsultation. In doctor-patient communication there was a higher rate of turn taking, more words, more interruptions and more backchannels per teleconsultation. In patient-nurse communication there was. 
relatively little difference between the two modes of consulting the doctor. Telemedicine appeared to empower the patient to ask more questions of the doctor. It also seemed that the doctor took greater care in a teleconsultation to achieve coordination of beliefs with the patient than in a face-to-face consultation", "keyphrases": ["social presence", "telemedicine", "doctor", "emergency nurse practitioners", "patients", "minor accident and treatment service", "teleconsultations", "videoconferencing link", "face-to-face consultations", "doctor-nurse communication", "interruptions", "backchannels", "words", "turn taking", "patient-nurse communication", "belief coordination", "384 kbit/s", "951 s", "247 s"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "R"]} +{"id": "1774", "title": "A work journal [librarianship]", "abstract": "Keeping a work journal can be useful in exploring one's thoughts and feelings about work challenges and work decisions. It can help bring about greater fulfillment in one's work life by facilitating self-renewal, change, the search for new meaning, and job satisfaction. One example of a work journal which I kept in 1998 is considered. 
It touches on several issues of potential interest to midlife career librarians including the challenge of technology, returning to work at midlife after raising a family, further education, professional writing, and job exchange", "keyphrases": ["work decisions", "work challenges", "job satisfaction", "self-renewal", "work journal", "change", "midlife career librarians", "technology", "further education", "professional writing", "job exchange"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1731", "title": "Hit the road, Jack", "abstract": "Going freelance offers the potential of higher earnings, variety and independence - but also removes the benefits of permanent employment and can mean long distance travel and periods out of work. The author looks at the benefits and drawbacks - and how to get started as an IT contractor", "keyphrases": ["IT contractor", "freelance working"], "prmu": ["P", "R"]} +{"id": "1789", "title": "Dousing terrorist funding: mission possible? [banks]", "abstract": "The government is tightening its grip on terrorist money flows. But as the banking industry continues to expand its Patriot Act compliance activities, it is with the realization that a great deal of work remains to be done before the American financial system can become truly airtight. Identification instruments, especially drivers licenses, represent a significant weak spot", "keyphrases": ["banking", "Patriot Act", "terrorist funding", "identification"], "prmu": ["P", "P", "P", "P"]} +{"id": "1475", "title": "Relation between glare and driving performance", "abstract": "The present study investigated the effects of discomfort glare on driving behavior. Participants (old and young; US and Europeans) were exposed to a simulated low- beam light source mounted on the hood of an instrumented vehicle. Participants drove at night in actual traffic along a track consisting of urban, rural, and highway stretches. 
The results show that the relatively low glare source caused a significant drop in detecting simulated pedestrians along the roadside and made participants drive significantly slower on dark and winding roads. Older participants showed the largest drop in pedestrian detection performance and reduced their driving speed the most. The results indicate that the de Boer rating scale, the most commonly used rating scale for discomfort glare, is practically useless as a predictor of driving performance. Furthermore, the maximum US headlamp intensity (1380 cd per headlamp) appears to be an acceptable upper limit", "keyphrases": ["glare", "driving performance", "discomfort glare", "simulated low-beam light source", "road traffic", "urban road", "rural road", "highway", "deBoer rating scale"], "prmu": ["P", "P", "P", "M", "R", "R", "R", "P", "M"]} +{"id": "1608", "title": "A geometric process equivalent model for a multistate degenerative system", "abstract": "In this paper, a monotone process model for a one-component degenerative system with k+1 states (k failure states and one working state) is studied. We show that this model is equivalent to a geometric process (GP) model for a two-state one component system such that both systems have the same long-run average cost per unit time and the same optimal policy. 
Furthermore, an explicit expression for the determination of an optimal policy is derived", "keyphrases": ["multistate degenerative system", "geometric process equivalent model", "monotone process model", "one-component degenerative system", "failure states", "working state", "two-state one component system", "long-run average cost", "optimal policy", "replacement policy", "renewal reward process"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "M"]} +{"id": "1923", "title": "Predictive control of a high temperature-short time pasteurisation process", "abstract": "Modifications on the dynamic matrix control (DMC) algorithm are presented to deal with transfer functions with varying parameters in order to control a high temperature-short time pasteurisation process. To control processes with first order with pure time delay models whose parameters present an exogenous variable dependence, a new method of free response calculation, using multiple model information, is developed. Two methods, to cope with those nonlinear models that allow a generalised Hammerstein model description, are proposed. 
The proposed methods have been tested, both in simulation and in real cases, in comparison with PID and DMC classic controllers, showing important improvements on reference tracking and disturbance rejection", "keyphrases": ["high temperature-short time pasteurisation process", "predictive control", "dynamic matrix control algorithm", "transfer functions", "first order processes", "time delay models", "exogenous variable dependence", "free response calculation", "multiple model information", "nonlinear models", "generalised Hammerstein model description", "reference tracking", "disturbance rejection"], "prmu": ["P", "P", "R", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1509", "title": "Mathematical modelling of the work of the system of wells in a layer with the exponential law of permeability variation and the mobile liquid interface", "abstract": "We construct and study a two-dimensional model of the work of the system of wells in a layer with the mobile boundary between liquids of various viscosity. We use a 'plunger' displacement model of liquids. The boundaries of the filtration region of these liquids are modelled by curves of the Lyapunov class. Unlike familiar work, we solve two-dimensonal problems in an inhomogeneous layer when the mobile boundary and the boundaries of the filtration region are modelled by curves of the Lyapunov class. 
We show the practical convergence of the numerical solution of the problems studied", "keyphrases": ["2D model", "work", "well system", "mathematical modelling", "exponential law", "permeability variation", "mobile liquid interface", "mobile boundary", "viscosity", "plunger displacement model", "filtration region boundaries", "Lyapunov class curves", "inhomogeneous layer", "convergence", "numerical solution"], "prmu": ["M", "P", "R", "P", "P", "P", "P", "P", "P", "M", "R", "R", "P", "P", "P"]} +{"id": "1886", "title": "Non-asymptotic confidence ellipsoids for the least-squares estimate", "abstract": "We consider the finite sample properties of least-squares system identification, and derive non-asymptotic confidence ellipsoids for the estimate. The shape of the confidence ellipsoids is similar to the shape of the ellipsoids derived using asymptotic theory, but unlike asymptotic theory, they are valid for a finite number of data points. The probability that the estimate belongs to a certain ellipsoid has a natural dependence on the volume of the ellipsoid, the data generating mechanism, the model order and the number of data points available", "keyphrases": ["nonasymptotic confidence ellipsoids", "least-squares estimate", "finite sample properties", "least-squares system identification", "probability", "data generating mechanism", "model order", "data points"], "prmu": ["M", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1750", "title": "A dynamic method for weighted linear least squares problems", "abstract": "A new method for solving the weighted linear least squares problems with full rank is proposed. Based on the theory of Liapunov's stability, the method associates a dynamic system with a weighted linear least squares problem, whose solution we are interested in and integrates the former numerically by an A-stable numerical method. 
The numerical tests suggest that the new method is more than comparable with current conventional techniques based on the normal equations", "keyphrases": ["dynamic method", "weighted linear least squares problems", "Lyapunov stability", "A-stable numerical method"], "prmu": ["P", "P", "M", "P"]}
Various goodness-of-fit tests for generalized Pareto distributions based on progressively censored data statistics are discussed", "keyphrases": ["generalized Pareto distributions", "progressive censoring schemes", "goodness-of-fit tests", "progressive type-II censored order statistics", "ordinary order statistics"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1803", "title": "Linear complexity of polyphase power residue sequences", "abstract": "The well known family of binary Legendre or quadratic residue sequences can be generalised to the multiple-valued case by employing a polyphase representation. These p-phase sequences, with p prime, also have prime length L, and can be constructed from the index sequence of length L or, equivalently, from the cosets of pth power residues and non-residues modulo-L. The linear complexity of these polyphase sequences is derived and shown to fall into four classes depending on the value assigned to b/sub 0/, the initial digit of the sequence, and on whether p belongs to the set of pth power residues or not. The characteristic polynomials of the linear feedback shift registers that generate these sequences are also derived", "keyphrases": ["linear complexity", "polyphase power residue sequences", "binary Legendre sequences", "quadratic residue sequences", "multiple-valued case", "p-phase sequences", "polynomials", "linear feedback shift registers", "cryptographic applications", "key stream ciphers", "binary sequences"], "prmu": ["P", "P", "R", "P", "P", "P", "P", "P", "U", "U", "R"]} +{"id": "1491", "title": "Evaluation of videoconferenced grand rounds", "abstract": "We evaluated various aspects of grand rounds videoconferenced from a tertiary care hospital to a regional hospital in Nova Scotia. During a five-month study period, 29 rounds were broadcast (19 in medicine and 10 in cardiology). 
The total recorded attendance at the remote site was 103, comprising 70 specialists, nine family physicians and 24 other health-care professionals. We received 55 evaluations, a response rate of 53%. On a five-point Likert scale (on which higher scores indicated better quality), mean ratings by remote-site participants of the technical quality of the videoconference were 3.0-3.5, with the lowest ratings being for ability to hear the discussion (3.0) and to see visual aids (3.1). Mean ratings for content, presentation, discussion and educational value were 3.8 or higher. Of the 49 physicians who presented the rounds, we received evaluations from 41, a response rate of 84%. The presenters rated all aspects of the videoconference and interaction with remote sites at 3.8 or lower. The lowest ratings were for ability to see the remote sites (3.0) and the usefulness of the discussion (3.4). We received 278 evaluations from participants at the presenting site, an estimated response rate of about 55%. The results indicated no adverse opinions of the effect of videoconferencing (mean scores 3.1-3.3). The estimated costs of videoconferencing one grand round to one site and four sites were C$723 and C$1515, respectively. 
The study confirmed that videoconferenced rounds can provide satisfactory continuing medical education to community specialists, which is an especially important consideration as maintenance of certification becomes mandatory", "keyphrases": ["videoconferenced grand rounds", "tertiary care hospital", "regional hospital", "telemedicine", "cardiology", "health-care professionals", "five-point Likert scale", "remote sites", "continuing medical education", "certification"], "prmu": ["P", "P", "P", "U", "P", "P", "P", "P", "P", "P"]} +{"id": "1846", "title": "Semantic B2B integration: issues in ontology-based approaches", "abstract": "Solving queries to support e-commerce transactions can involve retrieving and integrating information from multiple information resources. Often, users don't care which resources are used to answer their query. In such situations, the ideal solution would be to hide from the user the details of the resources involved in solving a particular query. An example would be providing seamless access to a set of heterogeneous electronic product catalogues. There are many problems that must be addressed before such a solution can be provided. In this paper, we discuss a number of these problems, indicate how we have addressed these and go on to describe the proof-of-concept demonstration system we have developed", "keyphrases": ["e-commerce transactions", "queries", "information integration", "information retrieval", "multiple information resources", "heterogeneous electronic product catalogues", "ontology-based approaches", "semantic B2B integration"], "prmu": ["P", "P", "R", "R", "P", "P", "P", "P"]} +{"id": "1790", "title": "Copyright of electronic publishing", "abstract": "With the spreading of the Internet and the wide use of computers, electronic publishing is becoming an indispensable measure to gain knowledge and skills. Meanwhile, copyright is facing much more infringement than ever in this electronic environment. 
So, it is a key factor to effectively protect copyright of electronic publishing to foster the new publication fields. The paper analyzes the importance of copyright, the main causes for copyright infringement in electronic publishing, and presents viewpoints on the definition and application of fair use of a copyrighted work and thinking of some means to combat breach of copyright", "keyphrases": ["electronic publishing copyright", "Internet", "copyright infringement", "electronic environment", "copyright protection", "fair use", "copyrighted work"], "prmu": ["R", "P", "P", "P", "R", "P", "P"]} +{"id": "1534", "title": "Generic simulation approach for multi-axis machining. Part 1: modeling methodology", "abstract": "This paper presents a new methodology for analytically simulating multi-axis machining of complex sculptured surfaces. A generalized approach is developed for representing an arbitrary cutting edge design, and the local surface topology of a complex sculptured surface. A NURBS curve is used to represent the cutting edge profile. This approach offers the advantages of representing any arbitrary cutting edge design in a generic way, as well as providing standardized techniques for manipulating the location and orientation of the cutting edge. The local surface topology of the part is defined as those surfaces generated by previous tool paths in the vicinity of the current tool position. The local surface topology of the part is represented without using a computationally expensive CAD system. A systematic prediction technique is then developed to determine the instantaneous tool/part interaction during machining. The methodology employed here determines the cutting edge in-cut segments by determining the intersection between the NURBS curve representation of the cutting edge and the defined local surface topology. These in-cut segments are then utilized for predicting instantaneous chip load, static and dynamic cutting forces, and tool deflection. 
Part 1 of this paper details the modeling methodology and demonstrates the capabilities of the simulation for machining a complex surface", "keyphrases": ["multiple axis machining", "generic modeling", "tool path specification", "complex surface machining", "complex sculptured surfaces", "systematic prediction", "cutting edge profile", "surface topology", "NURBS curve"], "prmu": ["M", "R", "M", "R", "P", "P", "P", "P", "P"]} +{"id": "1571", "title": "The simulated emergence of distributed environmental control in evolving microcosms", "abstract": "This work continues investigation into Gaia theory (Lovelock, The ages of Gaia, Oxford University Press, 1995) from an artificial life perspective (Downing, Proceedings of the 7th International Conference on Artificial Life, p. 90-99, MIT Press, 2000), with the aim of assessing the general compatibility of emergent distributed environmental control with conventional natural selection. Our earlier system, GUILD (Downing and Zvirinsky, Artificial Life, 5, p.291-318, 1999), displayed emergent regulation of the chemical environment by a population of metabolizing agents, but the chemical model underlying those results was trivial, essentially admitting all possible reactions at a single energy cost. The new model, METAMIC, utilizes abstract chemistries that are both (a) constrained to a small set of legal reactions, and (b) grounded in basic fundamental relationships between energy, entropy, and biomass synthesis/breakdown. To explore the general phenomena of emergent homeostasis, we generate 100 different chemistries and use each as the basis for several METAMIC runs, as part of a Gaia hunt. This search discovers 20 chemistries that support microbial populations capable of regulating a physical environmental factor within their growth-optimal range, despite the extra metabolic cost. 
Case studies from the Gaia hunt illustrate a few simple mechanisms by which real biota might exploit the underlying chemistry to achieve some control over their physical environment. Although these results shed little light on the question of Gaia on Earth, they support the possibility of emergent environmental control at the microcosmic level", "keyphrases": ["simulated emergence", "evolving microcosms", "natural selection", "GUILD system", "metabolizing agents", "chemical model", "METAMIC model", "emergent homeostasis", "Gaia hunt", "genetic algorithms", "artificial chemistry", "artificial metabolisms", "Gaia theory", "artificial life", "emergent distributed environmental control"], "prmu": ["P", "P", "P", "R", "P", "P", "R", "P", "P", "U", "R", "R", "P", "P", "P"]} +{"id": "161", "title": "Electronic books: reports of their death have been exaggerated", "abstract": "E-books will survive, but not in the consumer market - at least not until reading devices become much cheaper and much better in quality (which is not likely to happen soon). Library Journal's review of major events of the year 2001 noted that two requirements for the success of E-books were development of a sustainable business model and development of better reading devices. The E-book revolution has therefore become more of an evolution. We can look forward to further developments and advances in the future", "keyphrases": ["electronic books", "E-books", "Library Journal"], "prmu": ["P", "P", "P"]} +{"id": "1635", "title": "Simple...But complex", "abstract": "FlexPro 5.0, from Weisang and Co., is one of those products which aim to serve an often ignored range of data users: those who, in FlexPro's words, are interested in documenting, analysing and archiving data in the simplest way possible. 
The online help system is clearly designed to promote the product in this market segment, with a very clear introduction from first principles and a hands-on tutorial, and the live project to which it was applied was selected with this in mind", "keyphrases": ["FlexPro 5.0", "data archiving", "data analysis", "data documentation", "online help system", "hands-on tutorial"], "prmu": ["P", "R", "M", "R", "P", "P"]} +{"id": "1670", "title": "An integrated optimization model for train crew management", "abstract": "Train crew management involves the development of a duty timetable for each of the drivers (crew) to cover a given train timetable in a rail transport organization. This duty timetable is spread over a certain period, known as the roster planning horizon. Train crew management may arise either from the planning stage, when the total number of crew and crew distributions are to be determined, or from the operating stage when the number of crew at each depot is known as input data. In this paper, we are interested in train crew management in the planning stage. In the literature, train crew management is decomposed into two stages: crew scheduling and crew rostering which are solved sequentially. We propose an integrated optimization model to solve both crew scheduling and crew rostering. The model enables us to generate either cyclic rosters or non-cyclic rosters. Numerical experiments are carried out over data sets arising from a practical application", "keyphrases": ["integrated optimization model", "train crew management", "duty timetable", "rail transport organization", "roster planning horizon", "crew scheduling", "crew rostering", "cyclic rosters", "noncyclic rosters", "integer programming"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "M", "U"]} +{"id": "1792", "title": "Database technology in digital libraries", "abstract": "Database technology advancements have provided many opportunities for libraries. 
These advancements can bring the world closer together through information accessibility. Digital library projects have been established worldwide to, ultimately, fulfil the needs of end users through more efficiency and convenience. Resource sharing will continue to be the trend for libraries. Changes often create issues which need to be addressed. Issues relating to database technology and digital libraries are reviewed. Some of the major challenges in digital libraries and managerial issues are identified as well", "keyphrases": ["database technology", "digital libraries", "information accessibility", "digital library projects", "end users", "resource sharing", "managerial issues", "data quality", "interoperability", "metadata", "user interface", "query processing"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "U", "U", "U", "M", "U"]} +{"id": "1801", "title": "Least load dispatching algorithm for parallel Web server nodes", "abstract": "A least load dispatching algorithm for distributing requests to parallel Web server nodes is described. In this algorithm, the load offered to a node by a request is estimated based on the expected transfer time of the corresponding reply through the Internet. This loading information is then used by the algorithm to identify the least load node of the Web site. By using this algorithm, each request will always be sent for service at the earliest possible time. Performance comparison using NASA and ClarkNet access logs between the proposed algorithm and commonly used dispatching algorithms is performed. 
The results show that the proposed algorithm gives 10% higher throughput than that of the commonly used random and round-robin dispatching algorithms", "keyphrases": ["least load dispatching algorithm", "parallel Web server nodes", "Internet", "transfer time", "NASA access logs", "ClarkNet access logs", "throughput", "round-robin dispatching algorithms", "random dispatching algorithms", "World Wide Web server"], "prmu": ["P", "P", "P", "P", "R", "P", "P", "P", "R", "M"]} +{"id": "1493", "title": "Research into telehealth applications in speech-language pathology", "abstract": "A literature review was conducted to investigate the extent to which telehealth has been researched within the domain of speech-language pathology and the outcomes of this research. A total of 13 studies were identified. Three early studies demonstrated that telehealth was feasible, although there was no discussion of the cost-effectiveness of this process in terms of patient outcomes. The majority of the subsequent studies indicated positive or encouraging outcomes resulting from telehealth. However, there were a number of shortcomings in the research, including a lack of cost-benefit information, failure to evaluate the technology itself, an absence of studies of the educational and informational aspects of telehealth in relation to speech-language pathology, and the use of telehealth in a limited range of communication disorders. 
Future research into the application of telehealth to speech-language pathology services must adopt a scientific approach, and have a well defined development and evaluation framework that addresses the effectiveness of the technique, patient outcomes and satisfaction, and the cost-benefit relationship", "keyphrases": ["telehealth applications", "speech-language pathology", "literature review", "telemedicine", "cost-effectiveness", "patient outcomes", "cost-benefit analysis", "communication disorders", "patient satisfaction"], "prmu": ["P", "P", "P", "U", "P", "P", "M", "P", "R"]} +{"id": "1844", "title": "A multi-agent system infrastructure for software component marketplace: an ontological perspective", "abstract": "In this paper, we introduce a multi-agent system architecture and an implemented prototype for a software component marketplace. We emphasize the ontological perspective by discussing ontology modeling for the component marketplace, UML extensions for ontology modeling, and the idea of ontology transfer which makes the multi-agent system adapt itself to dynamically changing ontologies", "keyphrases": ["multi-agent system architecture", "software component marketplace", "ontology modeling", "UML extensions", "ontology transfer", "dynamically changing ontologies", "adaptation"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} +{"id": "1637", "title": "What's best practice for open access?", "abstract": "The business of publishing journals is in transition. Nobody knows exactly how it will work in the future, but everybody knows that the electronic publishing revolution will ensure it won't work as it does now. This knowledge has provoked a growing sense of nervous anticipation among those concerned, some edgy and threatened by potential changes to their business, others excited by the prospect of change and opportunity. 
The paper discusses the open publishing model for dissemination of research", "keyphrases": ["open access", "journal publishing", "electronic publishing", "business", "open publishing model", "research dissemination"], "prmu": ["P", "R", "P", "P", "P", "R"]} +{"id": "1672", "title": "Two issues in setting call centre staffing levels", "abstract": "Motivated by a problem facing the Police Communication Centre in Auckland, New Zealand, we consider the setting of staffing levels in a call centre with priority customers. The choice of staffing level over any particular time period (e.g., Monday from 8 am-9 am) relies on accurate arrival rate information. The usual method for identifying the arrival rate based on historical data can, in some cases, lead to considerable errors in performance estimates for a given staffing level. We explain why, identify three potential causes of the difficulty, and describe a method for detecting and addressing such a problem", "keyphrases": ["call centre staffing levels", "police communication centre", "Auckland", "New Zealand", "priority customers", "arrival rate information", "performance estimates", "forecast error", "nonstationarity", "conditional Poisson process"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "M", "U", "U"]} +{"id": "1536", "title": "Connection management for QoS service on the Web", "abstract": "The current Web service model treats all requests equivalently, both while being processed by servers and while being transmitted over the network. For some uses, such as multiple priority schemes, different levels of service are desirable. We propose application-level TCP connection management mechanisms for Web servers to provide two different levels of Web service, high and low service, by setting different time-outs for inactive TCP connections. We evaluated the performance of the mechanism under heavy and light loading conditions on the Web server. 
Our experiments show that, though heavy traffic saturates the network, high level class performance is improved by as much as 25-28%. Therefore, this mechanism can effectively provide QoS guaranteed services even in the absence of operating system and network supports", "keyphrases": ["connection management", "Web service model", "Internet", "TCP connections", "time-outs", "quality of service", "telecommunication traffic", "client server system", "Web transaction"], "prmu": ["P", "P", "U", "P", "P", "M", "M", "M", "M"]} +{"id": "163", "title": "Boolean operators and the naive end-user: moving to AND", "abstract": "Since so few end-users make use of Boolean searching, it is obvious that any effective solution needs to take this reality into account. The most important aspect of a technical solution should be that it does not require any effort on the part of users. What is clearly needed is for search engine designers and programmers to take account of the information-seeking behavior of Internet users. Users must be able to enter a series of words at random and have those words automatically treated as a carefully constructed Boolean AND search statement", "keyphrases": ["Boolean operators", "AND operator", "Boolean searching", "search engine design", "information-seeking behavior", "Internet"], "prmu": ["P", "R", "P", "P", "P", "P"]} +{"id": "1921", "title": "An ACL for a dynamic system of agents", "abstract": "In this article we present the design of an ACL for a dynamic system of agents. The ACL includes a set of conversation performatives extended with operations to register, create, and terminate agents. The main design goal at the agent-level is to provide only knowledge-level primitives that are well integrated with the dynamic nature of the system. 
This goal has been achieved by defining an anonymous interaction protocol which enables agents to request and supply knowledge without considering symbol-level issues concerning management of agent names, routing, and agent reachability. This anonymous interaction protocol exploits a distributed facilitator schema which is hidden at the agent-level and provides mechanisms for registering capabilities of agents and delivering requests according to the competence of agents. We present a formal specification of the ACL and of the underlying architecture, exploiting an algebra of actors, and illustrate it with the help of a graphical notation. This approach provides the basis for discussing dynamic primitives in ACL and for studying properties of dynamic multi agent systems, for example concerning the behavior of agents and the correctness of their conversation policies", "keyphrases": ["ACL", "dynamic system of agents", "system of agents", "agents", "Agent Communication Languages", "dynamic system", "distributed facilitator", "actors", "anonymous interaction protocol"], "prmu": ["P", "P", "P", "P", "M", "P", "P", "P", "P"]} +{"id": "1879", "title": "On the distribution of Lachlan nonsplitting bases", "abstract": "We say that a computably enumerable (c.e.) degree b is a Lachlan nonsplitting base (LNB), if there is a computably enumerable degree a such that a>b, and for any c.e. degrees w, vor=2, and the function max({x/sub 1/,...,x/sub n/} intersection A) is partial recursive, it is easily seen that A is recursive. In this paper, we weaken this hypothesis in various ways (and similarly for \"min\" in place of \"max\") and investigate what effect this has on the complexity of A. We discover a sharp contrast between retraceable and co-retraceable sets, and we characterize sets which are the union of a recursive set and a co-r.e., retraceable set. Most of our proofs are noneffective. 
Several open questions are raised", "keyphrases": ["min limiters", "max limiters", "complexity", "retraceable sets", "recursive set"], "prmu": ["P", "R", "P", "P", "P"]} +{"id": "1716", "title": "The vibration reliability of poppet and contoured actuator valves", "abstract": "The problem of selecting the shape of the actuator valve (the final control valve) itself is discussed; the solution to this problem will permit appreciable dynamic loads to be eliminated from the moving elements of the steam distribution system of steam turbines under all operating conditions", "keyphrases": ["actuator valve shape selection", "contoured actuator valves", "poppet actuator valves", "dynamic loads elimination", "moving elements", "steam distribution system", "steam turbines", "vibration reliability"], "prmu": ["R", "P", "R", "R", "P", "P", "P", "P"]} +{"id": "1753", "title": "Risk theory with a nonlinear dividend barrier", "abstract": "In the framework of classical risk theory we investigate a surplus process in the presence of a nonlinear dividend barrier and derive equations for two characteristics of such a process, the probability of survival and the expected sum of discounted dividend payments. Number-theoretic solution techniques are developed for approximating these quantities and numerical illustrations are given for exponential claim sizes and a parabolic dividend barrier", "keyphrases": ["risk theory", "nonlinear dividend barrier", "surplus process", "probability of survival", "discounted dividend payments", "number-theoretic solution", "numerical illustrations", "exponential claim sizes", "parabolic dividend barrier"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1885", "title": "Analysis of nonlinear time-delay systems using modules over non-commutative rings", "abstract": "The theory of non-commutative rings is introduced to provide a basis for the study of nonlinear control systems with time delays. 
The left Ore ring of non-commutative polynomials defined over the field of a meromorphic function is suggested as the framework for such a study. This approach is then generalized to a broader class of nonlinear systems with delays that are called generalized Roesser systems. Finally, the theory is applied to analyze nonlinear time-delay systems. A weak observability is defined and characterized, generalizing the well-known linear result. Properties of closed submodules are then developed to obtain a result on the accessibility of such systems", "keyphrases": ["nonlinear time-delay systems", "modules", "noncommutative rings", "nonlinear control systems", "left Ore ring", "noncommutative polynomials", "meromorphic function", "generalized Roesser systems", "weak observability"], "prmu": ["P", "P", "M", "P", "P", "M", "P", "P", "P"]} +{"id": "1920", "title": "To commit or not to commit: modeling agent conversations for action", "abstract": "Conversations are sequences of messages exchanged among interacting agents. For conversations to be meaningful, agents ought to follow commonly known specifications limiting the types of messages that can be exchanged at any point in the conversation. These specifications are usually implemented using conversation policies (which are rules of inference) or conversation protocols (which are predefined conversation templates). In this article we present a semantic model for specifying conversations using conversation policies. This model is based on the principles that the negotiation and uptake of shared social commitments entail the adoption of obligations to action, which indicate the actions that agents have agreed to perform. In the same way, obligations are retracted based on the negotiation to discharge their corresponding shared social commitments. 
Based on these principles, conversations are specified as interaction specifications that model the ideal sequencing of agent participations negotiating the execution of actions in a joint activity. These specifications not only specify the adoption and discharge of shared commitments and obligations during an activity, but also indicate the commitments and obligations that are required (as preconditions) or that outlive a joint activity (as postconditions). We model the Contract Net Protocol as an example of the specification of conversations in a joint activity", "keyphrases": ["interacting agents", "specifications", "rules of inference", "conversation protocols", "autonomous agents", "social commitments", "speech acts", "software agents", "conversation templates"], "prmu": ["P", "P", "P", "P", "M", "P", "U", "M", "P"]} +{"id": "1673", "title": "Mission planning for regional surveillance", "abstract": "The regional surveillance problem discussed involves formulating a flight route for an aircraft to scan a given geographical region. Aerial surveillance is conducted using a synthetic aperture radar device mounted on the aircraft to compose a complete, high-resolution image of the region. Two models for determining an optimised flight route are described, the first employing integer programming and the second, genetic algorithms. 
A comparison of the solution optimality in terms of the total distance travelled, and model efficiency of the two techniques in terms of their required CPU times, is made in order to identify the conditions under which it is appropriate to apply each model", "keyphrases": ["mission planning", "regional surveillance", "flight route", "geographical region scanning", "aerial surveillance", "synthetic aperture radar device", "high-resolution image", "optimised flight route", "integer programming", "genetic algorithms", "solution optimality", "total distance travelled"], "prmu": ["P", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1636", "title": "SPARC ignites scholarly publishing", "abstract": "During the past several years, initiatives which bring together librarians, researchers, university administrators and independent publishers have re-invigorated the scholarly publishing marketplace. These initiatives take advantage of electronic technology and show great potential for restoring science to scientists. The author outlines SPARC (the Scholarly Publishing and Academic Resources Coalition), an initiative to make scientific journals more accessible", "keyphrases": ["electronic publishing", "initiative", "scientific journal access", "SPARC", "Scholarly Publishing and Academic Resources Coalition"], "prmu": ["R", "P", "R", "P", "P"]} +{"id": "1572", "title": "Ant colony optimization and stochastic gradient descent", "abstract": "We study the relationship between the two techniques known as ant colony optimization (ACO) and stochastic gradient descent. More precisely, we show that some empirical ACO algorithms approximate stochastic gradient descent in the space of pheromones, and we propose an implementation of stochastic gradient descent that belongs to the family of ACO algorithms. 
We then use this insight to explore the mutual contributions of the two techniques", "keyphrases": ["ant colony optimization", "stochastic gradient descent", "empirical ACO algorithms", "pheromones", "combinatorial optimization", "heuristic", "reinforcement learning", "social insects", "swarm intelligence", "artificial life", "local search algorithms"], "prmu": ["P", "P", "P", "P", "M", "U", "U", "U", "U", "U", "M"]} +{"id": "1537", "title": "Technology on social issues of videoconferencing on the Internet: a survey", "abstract": "Constant advances in audio/video compression, the development of the multicast protocol as well as fast improvement in computing devices (e.g. higher speed, larger memory) have set forth the opportunity to have resource demanding videoconferencing (VC) sessions on the Internet. Multicast is supported by the multicast backbone (Mbone), which is a special portion of the Internet where this protocol is being deployed. Mbone VC tools are steadily emerging and the user population is growing fast. VC is a fascinating application that has the potential to greatly impact the way we remotely communicate and work. Yet, the adoption of VC is not as fast as one could have predicted. Hence, it is important to examine the factors that affect a widespread adoption of VC. This paper examines the enabling technology and the social issues. It discusses the achievements and identifies the future challenges. 
It suggests an integration of many emerging multimedia tools into VC in order to enhance its versatility for more effectiveness", "keyphrases": ["videoconferencing", "Internet", "multicast protocol", "multicast backbone", "Mbone", "multimedia", "social issues", "data compression"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "M"]} +{"id": "162", "title": "International news sites in English", "abstract": "Web access to news sites all over the world allows us the opportunity to have an electronic news stand readily available and stocked with a variety of foreign (to us) news sites. A large number of currently available foreign sites are English-language publications or English language versions of non-North American sites. These sites are quite varied in terms of quality, coverage, and style. Finding them can present a challenge. Using them effectively requires critical-thinking skills that are a part of media awareness or digital literacy", "keyphrases": ["Web access", "international news sites", "English-language publications", "non North American sites", "critical-thinking skills", "media awareness", "digital literacy"], "prmu": ["P", "P", "P", "M", "P", "P", "P"]} +{"id": "1793", "title": "The paradigm of viral communication", "abstract": "The IIW Institute of Information Management (www.IIW.de) is dealing with commercial applications of digital technologies, such as the Internet, digital printing, and many more. A study which has been carried out by the institute, identifies viral messages as a new paradigm of communication, mostly found in the area of Direct Marketing, and - who wonders - mainly within the USA. Viral messages underlie certain principles: (1) prospects and customers of the idea are offered a technology platform providing a possibility to send a message to a majority of persons; (2) there is an emotional or pecuniary incentive to participate. 
Ideally, niches of needs and market vacua are filled with funny ideas; (3) also, the recipients are facing emotional or pecuniary incentives to contact a majority of further recipients - this induces a snowball effect and the message is spread virally; and (4) the customer is activated as an \"ambassador\" of the piece of information, for instance promoting a product or a company. It is evident that there has been a long lasting history of what we call \"word-of-mouth\" ever since, however bundles of digital technologies empower the viral communication paradigm", "keyphrases": ["viral communication paradigm", "commercial applications", "viral messages", "e-mails", "Internet", "direct marketing", "business", "computer virus"], "prmu": ["P", "P", "P", "U", "P", "P", "U", "U"]} +{"id": "1845", "title": "Business data management for business-to-business electronic commerce", "abstract": "Business-to-business electronic commerce (B2B EC) opens up new possibilities for trade. For example, new business partners from around the globe can be found, their offers can be compared, even complex negotiations can be conducted electronically, and a contract can be drawn up and fulfilled via an electronic marketplace. However, sophisticated data management is required to provide such facilities. In this paper, the results of a multi-national project on creating a business-to-business electronic marketplace for small and medium-sized enterprises are presented. Tools for information discovery, protocol-based negotiations, and monitored contract enactment are provided and based on a business data repository. The repository integrates heterogeneous business data with business communication. 
Specific problems such as multilingual nature, data ownership, and traceability of contracts and related negotiations are addressed and it is shown that the present approach provides efficient business data management for B2B EC", "keyphrases": ["business-to-business electronic commerce", "business data management", "electronic marketplace", "small and medium-sized enterprises", "multi-national project", "information discovery", "protocol-based negotiations", "monitored contract enactment", "business data repository", "heterogeneous business data", "business communication", "data ownership", "multilingual system", "traceability"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "P"]} +{"id": "1800", "title": "Multi-output regression using a locally regularised orthogonal least-squares algorithm", "abstract": "The paper considers data modelling using multi-output regression models. A locally regularised orthogonal least-squares (LROLS) algorithm is proposed for constructing sparse multi-output regression models that generalise well. 
By associating each regressor in the regression model with an individual regularisation parameter, the ability of the multi-output orthogonal least-squares (OLS) model selection to produce a parsimonious model with a good generalisation performance is greatly enhanced", "keyphrases": ["multi-output regression models", "locally regularised orthogonal least-squares algorithm", "data modelling", "sparse multi-output regression models", "parsimonious model", "nonlinear system modelling", "LROLS algorithm"], "prmu": ["P", "P", "P", "P", "P", "M", "R"]} +{"id": "1492", "title": "A systematic review of the efficacy of telemedicine for making diagnostic and management decisions", "abstract": "We conducted a systematic review of the literature to evaluate the efficacy of telemedicine for making diagnostic and management decisions in three classes of application: office/hospital-based, store-and-forward, and home-based telemedicine. We searched the MEDLINE, EMBASE, CINAHL and HealthSTAR databases and printed resources, and interviewed investigators in the field. We excluded studies where the service did not historically require face-to-face encounters (e.g. radiology or pathology diagnosis). A total of 58 articles met the inclusion criteria. The articles were summarized and graded for the quality and direction of the evidence. There were very few high-quality studies. The strongest evidence for the efficacy of telemedicine for diagnostic and management decisions came from the specialties of psychiatry and dermatology. There was also reasonable evidence that general medical history and physical examinations performed via telemedicine had relatively good sensitivity and specificity. Other specialties in which some evidence for efficacy existed were cardiology and certain areas of ophthalmology. 
Despite the widespread use of telemedicine in most major medical specialties, there is strong evidence in only a few of them that the diagnostic and management decisions provided by telemedicine are comparable to face-to-face care", "keyphrases": ["telemedicine", "medical diagnosis", "management decision making", "literature review", "MEDLINE", "EMBASE", "CINAHL", "HealthSTAR", "psychiatry", "dermatology", "cardiology", "ophthalmology"], "prmu": ["P", "R", "R", "R", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1556", "title": "Regularity of some 'incomplete' Pal-type interpolation problems", "abstract": "In this paper the regularity of nine Pal-type interpolation problems is proved. In the literature interpolation on the zeros of the pair W/sub n//sup ( alpha )/(z) = (z + alpha )/sup n/ + (1 + alpha z)/sup n/, v/sub n//sup ( alpha )/(z) = (z + alpha )/sup n/ - (1 + alpha z)/sup n/ with 0 < alpha < 1 has been studied. Here the nodes form a subset of these sets of zeros", "keyphrases": ["Pal-type interpolation problems", "zeros"], "prmu": ["P", "P"]} +{"id": "1513", "title": "Solution of the reconstruction problem of a source function in the coagulation-fragmentation equation", "abstract": "We study the problem of reconstructing a source function in the kinetic coagulation-fragmentation equation. The study is based on optimal control methods, the solvability theory of operator equations, and the use of iteration algorithms", "keyphrases": ["source function reconstruction", "kinetic coagulation-fragmentation equation", "optimal control methods", "solvability", "operator equations", "iteration algorithms"], "prmu": ["R", "P", "P", "P", "P", "P"]} +{"id": "1657", "title": "Breaking the myths of rewards: an exploratory study of attitudes about knowledge sharing", "abstract": "Many CEO and managers understand the importance of knowledge sharing among their employees and are eager to introduce the knowledge management paradigm in their organizations. 
However, little is known about the determinants of the individual's knowledge sharing behavior. The purpose of this study is to develop an understanding of the factors affecting the individual's knowledge sharing behavior in the organizational context. The research model includes various constructs based on social exchange theory, self-efficacy, and theory of reasoned action. Research results from the field survey of 467 employees of four large, public organizations show that expected associations and contribution are the major determinants of the individual's attitude toward knowledge sharing. Expected rewards, believed by many to be the most important motivating factor for knowledge sharing, are not significantly related to the attitude toward knowledge sharing. As expected, positive attitude toward knowledge sharing is found to lead to positive intention to share knowledge and, finally, to actual knowledge sharing behaviors", "keyphrases": ["knowledge sharing", "knowledge management", "social exchange theory", "self-efficacy", "theory of reasoned action", "public organizations", "rewards", "strategic management"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "M"]}
+{"id": "1861", "title": "Technology in distance education: a global perspective to alternative delivery mechanisms", "abstract": "Technology is providing a positive impact on delivery mechanisms employed in distance education at the university level. Some institutions are incorporating distance education as a way to extend the classroom. Other institutions are investigating new delivery mechanisms, which support a revised perspective on education. These latter institutions are revising their processes for interacting with students, and taking a more \"learner centered\" approach to the delivery of education. This article discusses the impact of technology on the delivery mechanisms employed in distance education. 
A framework is proposed here, which presents a description of alternative modes of generic delivery mechanisms. It is suggested that those institutions that adopt a delivery mechanism employing an asynchronous mode can gain the most benefit from technology. This approach seems to represent the only truly innovative use of technology in distance education. The approach creates a student-oriented environment while maintaining high levels of interaction, both of which are factors that contribute to student satisfaction with their overall educational experience", "keyphrases": ["distance education", "educational technology", "university education", "learner centered approach", "student satisfaction", "global perspective", "asynchronous mode"], "prmu": ["P", "R", "R", "R", "P", "P", "P"]}
+{"id": "1824", "title": "Parallel operation of capacity-limited three-phase four-wire active power filters", "abstract": "Three-phase four-wire active power filters (APFs) are presented that can be paralleled to enlarge the system capacity and improve reliability. The APF employs the PWM four-leg voltage-source inverter. A decoupling control approach for the leg connected to the neutral line is proposed such that the switching of all legs has no interaction. Functions of the proposed APF include compensation of reactive power, harmonic current, unbalanced power and zero-sequence current of the load. The objective is to achieve unity power factor, balanced line current and zero neutral-line current. Compensation of all components is capacity-limited, co-operating with the cascaded load current sensing scheme. Multiple APFs can be paralleled to share the load power without requiring any control interconnection. In addition to providing the theoretic bases and detailed design of the APFs, two 6 kVA APFs are implemented. 
The effectiveness of the proposed method is validated with experimental results", "keyphrases": ["capacity-limited three-phase four-wire active power filters", "parallel operation", "PWM four-leg voltage-source inverter", "decoupling control approach", "leg switching", "control design", "reactive power compensation", "harmonic current compensation", "unbalanced power compensation", "zero-sequence load current compensation", "unity power factor", "balanced line current", "zero neutral-line current", "load power sharing", "control performance", "6 kVA"], "prmu": ["P", "P", "P", "P", "R", "R", "R", "R", "R", "R", "P", "P", "P", "R", "M", "P"]} +{"id": "1476", "title": "The perceived utility of human and automated aids in a visual detection task", "abstract": "Although increases in the use of automation have occurred across society, research has found that human operators often underutilize (disuse) and overly rely on (misuse) automated aids (Parasuraman-Riley (1997)). Nearly 275 Cameron University students participated in 1 of 3 experiments performed to examine the effects of perceived utility (Dzindolet et al. (2001)) on automation use in a visual detection task and to compare reliance on automated aids with reliance on humans. Results revealed a bias for human operators to rely on themselves. Although self-report data indicate a bias toward automated aids over human aids, performance data revealed that participants were more likely to disuse automated aids than to disuse human aids. This discrepancy was accounted for by assuming human operators have a \"perfect automation\" schema. 
Actual or potential applications of this research include the design of future automated decision aids and training procedures for operators relying on such aids", "keyphrases": ["automated aids", "visual detection task", "human operators", "automated decision aids", "social process", "automation"], "prmu": ["P", "P", "P", "P", "U", "P"]} +{"id": "1732", "title": "Community spirit", "abstract": "IT companies that contribute volunteers, resources or funding to charities and local groups not only make a real difference to their communities but also add value to their businesses. So says a new coalition of IT industry bodies formed to raise awareness of the options for community involvement, promote the business case, and publicise examples of best practice. The BCS, Intellect (formed from the merger of the Computing Services and Software Association and the Federation of the Electronics Industry) and the Worshipful Company of Information Technologists plan to run advisory seminars and provide guidelines on how companies of all sizes can transform their local communities using their specialist IT skills and resources while reaping business benefits", "keyphrases": ["IT companies", "volunteer staff", "resource contribution", "charity projects", "community projects", "staff development", "business benefits", "best practice"], "prmu": ["P", "M", "R", "M", "M", "U", "P", "P"]} +{"id": "1777", "title": "Midlife career choices: how are they different from other career choices?", "abstract": "It was 1963 when Candy Start began working in libraries. Libraries seemed to be a refuge from change, a dependable environment devoted primarily to preservation. She was mistaken. Technological changes in every decade of her experience have affected how and where she used her MLS. Far from a static refuge, libraries have proven to be spaceships loaded with precious cargo hurtling into the unknown. The historian in the author says that perhaps libraries have always been like this. 
This paper looks at a midlife decision point and the choice that this librarian made to move from a point of lessening productivity and interest to one of increasing challenge and contribution. It is a personal narrative of midlife experience from one librarian's point of view. Since writing this article, Candy's career has followed more changes. After selling the WINGS TM system, she has taken her experiences and vision to another library vendor, Gaylord Information Systems, where she serves as a senior product strategist", "keyphrases": ["midlife career choices", "libraries", "technological changes", "productivity"], "prmu": ["P", "P", "P", "P"]} +{"id": "1819", "title": "Structural interpretation of matched pole-zero discretisation", "abstract": "Deals with matched pole-zero discretisation, which has been used in practice for hand calculations in the digital redesign of continuous-time systems but available only in the transfer-function form. Since this form is inconvenient for characterising the time-domain properties of sampled-data loops and for computerising the design of such systems, a state-space formulation is developed. Under the new interpretation, the matched pole-zero model is shown to be structurally identical to a hold-equivalent discrete-time model, where the generalised hold takes integral part, thus unifying the most widely used discretisation approaches. An algorithm for obtaining the generalised hold function is presented. The hold-equivalent structure of the matched pole-zero model clarifies several discrete-time system properties, such as controllability and observability, and their preservation or loss with a matched pole-zero discretisation. 
With the proposed formulation, the matched pole-zero, hold-equivalent, and mapping models can now all be constructed with a single schematic model", "keyphrases": ["structural interpretation", "matched pole-zero discretisation", "continuous-time systems", "time-domain properties", "sampled-data loops", "state-space formulation", "hold-equivalent discrete-time model", "controllability", "observability", "closed-loop system", "digital simulations"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "M"]} +{"id": "186", "title": "The diameter of a long-range percolation graph", "abstract": "We consider the following long-range percolation model: an undirected graph with the node set {0, 1, . . . , N}/sup d/, has edges (x, y) selected with probability approximately= beta /||x - y||/sup s/ if ||x - y|| > 1, and with probability 1 if ||x - y|| = 1, for some parameters beta , s > 0. This model was introduced by who obtained bounds on the diameter of this graph for the one-dimensional case d = 1 and for various values of s, but left cases s = 1, 2 open. We show that, with high probability, the diameter of this graph is Theta (log N/log log N) when s = d, and, for some constants 0 < eta /sub 1/ < eta /sub 2/ < 1, it is at most N/sup eta 2/ when s = 2d, and is at least N/sup eta 1/ when d = 1, s = 2, beta < 1 or when s > 2d. We also provide a simple proof that the diameter is at most log/sup O(1)/ N with high probability, when d < s < 2d, established previously in Benjamini and Berger (2001)", "keyphrases": ["long-range percolation model", "undirected graph", "probability", "percolation", "positive probability", "networks", "random graph"], "prmu": ["P", "P", "P", "P", "M", "U", "M"]} +{"id": "1904", "title": "Component support in PLT scheme", "abstract": "PLT Scheme (DrScheme and MzScheme) supports the Component Object Model (COM) standard with two pieces of software. The first piece is MzCOM, a COM class that makes a Scheme evaluator available to COM clients. 
With MzCOM, programmers can embed Scheme code in programs written in mainstream languages such as C++ or Visual BASIC. Some applications can also be used as MzCOM clients. The other piece of component-support software is MysterX, which makes COM classes available to PLT Scheme programs. When needed, MysterX uses a programmable Web browser to display COM objects. We describe the technical issues encountered in building these two systems and sketch some applications", "keyphrases": ["PLT Scheme", "Component Object Model", "MzCOM", "reuse", "Web browser"], "prmu": ["P", "P", "P", "U", "P"]} +{"id": "1596", "title": "Wavelet collocation methods for a first kind boundary integral equation in acoustic scattering", "abstract": "In this paper we consider a wavelet algorithm for the piecewise constant collocation method applied to the boundary element solution of a first kind integral equation arising in acoustic scattering. The conventional stiffness matrix is transformed into the corresponding matrix with respect to wavelet bases, and it is approximated by a compressed matrix. Finally, the stiffness matrix is multiplied by diagonal preconditioners such that the resulting matrix of the system of linear equations is well conditioned and sparse. Using this matrix, the boundary integral equation can be solved effectively", "keyphrases": ["first kind integral operators", "piecewise constant collocation", "wavelet algorithm", "boundary element solution", "boundary integral equation", "wavelet transform", "computational complexity", "acoustic scattering", "stiffness matrix", "linear equations"], "prmu": ["M", "P", "P", "P", "P", "R", "U", "P", "P", "P"]} +{"id": "1697", "title": "Exact frequency-domain reconstruction for thermoacoustic tomography. II. Cylindrical geometry", "abstract": "For pt. I see ibid., vol. 21, no. 7, p. 823-8 (2002). Microwave-induced thermoacoustic tomography (TAT) in a cylindrical configuration is developed to image biological tissue. 
Thermoacoustic signals are acquired by scanning a flat ultrasonic transducer. Using a new expansion of a spherical wave in cylindrical coordinates, we apply the Fourier and Hankel transforms to TAT and obtain an exact frequency-domain reconstruction method. The effect of discrete spatial sampling on image quality is analyzed. An aliasing-proof reconstruction method is proposed. Numerical and experimental results are included", "keyphrases": ["medical diagnostic imaging", "frequency-domain reconstruction", "flat ultrasonic transducer", "thermoacoustic tomography", "cylindrical geometry", "discrete spatial sampling effect", "ultrasound imaging", "spherical wave expansion", "aliasing-proof reconstruction method", "Hankel transform"], "prmu": ["M", "P", "P", "P", "P", "R", "M", "R", "P", "P"]} +{"id": "1489", "title": "An eight-year study of Internet-based remote medical counselling", "abstract": "We carried out a prospective study of an Internet-based remote counselling service. A total of 15,456 Internet users visited the Web site over eight years. From these, 1500 users were randomly selected for analysis. Medical counselling had been granted to 901 of the people requesting it (60%). One hundred and sixty-four physicians formed project groups to process the requests and responded using email. The distribution of patients using the service was similar to the availability of the Internet: 78% were from the European Union, North America and Australia. Sixty-seven per cent of the patients lived in urban areas and the remainder were residents of remote rural areas with limited local medical coverage. Sixty-five per cent of the requests were about problems of internal medicine and 30% of the requests concerned surgical issues. The remaining 5% of the patients sought information about recent developments, such as molecular medicine or aviation medicine. During the project, our portal became inaccessible five times, and counselling was not possible on 44 days. 
There was no hacking of the Web site. Internet-based medical counselling is a helpful addition to conventional practice", "keyphrases": ["Internet-based remote medical counselling", "Internet users", "Web site", "email", "urban areas", "remote rural areas", "surgical issues", "telemedicine", "medical education", "portal"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "U", "M", "P"]} +{"id": "1730", "title": "Meeting of minds", "abstract": "Technical specialists need to think about their role in IT projects and how they communicate with end-users and other participants to ensure they contribute fully as team members. It is especially important to communicate and document trade-offs that may have to be made, including the rationale behind them, so that if requirements change, the impact and decisions can be readily communicated to the stakeholders", "keyphrases": ["technical specialists", "IT projects", "communication", "end-users"], "prmu": ["P", "P", "P", "P"]} +{"id": "1775", "title": "Are we there yet?: facing the never-ending speed and change of technology in midlife", "abstract": "This essay is a personal reflection on entering librarianship in middle age at a time when the profession, like society in general, is experiencing rapidly accelerating change. Much of this change is due to the increased use of computers and information technologies in the library setting. These aids in the production, collection, storage, retrieval, and dissemination of the collective information, knowledge, and sometimes wisdom of the past and the contemporary world can exhilarate or burden depending on one's worldview, the organization, and the flexibility of the workplace. This writer finds herself working in a library where everyone is expected continually to explore and use new ways of working and providing library service to a campus and a wider community. 
No time is spent in reflecting on what was, but all efforts are to anticipate and prepare for what will be", "keyphrases": ["librarianship", "middle age", "changing technology", "computers", "information technologies", "dissemination", "retrieval", "storage", "collection"], "prmu": ["P", "P", "R", "P", "P", "P", "P", "P", "P"]} +{"id": "1788", "title": "Resolving Web user on the fly", "abstract": "Identity authentication systems and procedures are rapidly becoming central issues in the practice and study of information systems development and security. Requirements for Web transaction security (WTS) include strong authentication of a user, non-repudiation and encryption of all traffic. In this paper, we present an effective mechanism involving two different channels, which addresses the prime concerns involved in the security of electronic commerce transactions (ECT) viz. user authentication and non-repudiation. Although the product is primarily targeted to provide a fillip to transactions carried out over the Web, this product can also be effectively used for non-Internet transactions that are carried out where user authentication is required", "keyphrases": ["identity authentication systems", "information systems development", "information systems security", "Web transaction security", "nonrepudiation", "encryption", "traffic", "electronic commerce transactions"], "prmu": ["P", "P", "R", "P", "U", "P", "P", "P"]} +{"id": "1474", "title": "Contrast sensitivity in a dynamic environment: effects of target conditions and visual impairment", "abstract": "Contrast sensitivity was determined as a function of target velocity (0 degrees -120 degrees /s) over a variety of viewing conditions. In Experiment 1, measurements of dynamic contrast sensitivity were determined for observers as a function of target velocity for letter stimuli. 
Significant main effects were found for target velocity, target size, and target duration, but significant interactions among the variables indicated especially pronounced adverse effects of increasing target velocity for small targets and brief durations. In Experiment 2, the effects of simulated cataracts were determined. Although the simulated impairment had no effect on traditional acuity scores, dynamic contrast sensitivity was markedly reduced. Results are discussed in terms of dynamic contrast sensitivity as a useful composite measure of visual functioning that may provide a better overall picture of an individual's visual functioning than does traditional static acuity, dynamic acuity, or contrast sensitivity alone. The measure of dynamic contrast sensitivity may increase understanding of the practical effects of various conditions, such as aging or disease, on the visual system, or it may allow improved prediction of individuals' performance in visually dynamic situations", "keyphrases": ["contrast sensitivity", "dynamic environment", "target conditions", "visual impairment", "dynamic contrast sensitivity", "target velocity", "target size", "target duration", "acuity scores", "aging", "disease"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1695", "title": "Medical image computing at the Institute of Mathematics and Computer Science in Medicine, University Hospital Hamburg-Eppendorf", "abstract": "The author reviews the history of medical image computing at his institute, summarizes the achievements, sketches some of the difficulties encountered, and draws conclusions that might be of interest especially to people new to the field. The origin and history section provides a chronology of this work, emphasizing the milestones reached during the past three decades. 
In accordance with the author's group's focus on imaging, the paper is accompanied by many pictures, some of which, he thinks, are of historical value", "keyphrases": ["Institute of Mathematics and Computer Science in Medicine", "University Hospital Hamburg-Eppendorf", "medical image computing history", "historical value", "difficulties encountered", "medical diagnostic imaging", "work chronology"], "prmu": ["P", "P", "R", "P", "P", "M", "R"]} +{"id": "1569", "title": "An interactive self-replicator implemented in hardware", "abstract": "Self-replicating loops presented to date are essentially worlds unto themselves, inaccessible to the observer once the replication process is launched. We present the design of an interactive self-replicating loop of arbitrary size, wherein the user can physically control the loop's replication and induce its destruction. After introducing the BioWall, a reconfigurable electronic wall for bio-inspired applications, we describe the design of our novel loop and delineate its hardware implementation in the wall", "keyphrases": ["interactive self-replicator", "interactive self-replicating loop", "BioWall", "reconfigurable electronic wall", "bio-inspired applications", "hardware implementation", "self-replication", "field programmable gate array", "cellular automata", "reconfigurable computing", "artificial life"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "U", "U", "M", "U"]} +{"id": "179", "title": "Document-based workflow modeling: a case-based reasoning approach", "abstract": "A workflow model is useful for business process analysis. A well-built workflow can help a company streamline its internal processes by reducing overhead. The results of workflow modeling need to be managed as information assets in a systematic fashion. Reusing these results is likely to enhance the quality of the modeling. 
Therefore, this paper proposes a document-based workflow modeling mechanism, which employs a case-based reasoning (CBR) technique for the effective reuse of design outputs. A repository is proposed to support this CBR process. A real-life case is illustrated to demonstrate the usefulness of our approach", "keyphrases": ["document-based workflow modeling", "case-based reasoning", "business process analysis", "company", "information assets", "design output reuse"], "prmu": ["P", "P", "P", "P", "P", "R"]} +{"id": "184", "title": "On the expected value of the minimum assignment", "abstract": "The minimum k-assignment of an m*n matrix X is the minimum sum of k entries of X, no two of which belong to the same row or column. Coppersmith and Sorkin conjectured that if X is generated by choosing each entry independently from the exponential distribution with mean 1, then the expected value of its minimum k-assignment is given by an explicit formula, which has been proven only in a few cases. In this paper we describe our efforts to prove the Coppersmith-Sorkin conjecture by considering the more general situation where the entries x/sub ij/ of X are chosen independently from different distributions. In particular, we require that x/sub ij/ be chosen from the exponential distribution with mean 1/r/sub i/c/sub j/. We conjecture an explicit formula for the expected value of the minimum k-assignment of such X and give evidence for this formula", "keyphrases": ["minimum k-assignment", "m * n matrix", "exponential distribution", "rational function", "bipartite graph"], "prmu": ["P", "P", "P", "U", "U"]} +{"id": "1906", "title": "Integrated process control using an in situ sensor for etch", "abstract": "The migration to tighter geometries and more complex process sequence integration schemes requires having the ability to compensate for upstream deviations from target specifications. 
Doing so ensures that downstream process sequences operate on work-in-progress that is well within control. Because point-of-use visibility of work-in-progress quality has become of paramount concern in the industry's drive to reduce scrap and improve yield, controlling trench depth has assumed greater importance. An integrated, interferometric-based rate monitor for etch-to-depth and spacer etch applications has been developed for controlling this parameter. This article demonstrates that the integrated rate monitor, using polarization and digital signal processing, enhances control of etch-to-depth processes and can also be implemented as a predictive endpoint in a wafer manufacturing environment for dual damascene trench etch and spacer etch applications", "keyphrases": ["interferometric in situ etch sensor", "integrated process control", "polarization", "digital signal processing", "wafer manufacturing environment", "process predictive endpoint", "dual damascene trench etch", "spacer etch applications", "IC geometry", "complex process sequence integration schemes", "upstream deviation compensation", "target specifications", "downstream process sequences", "point-of-use visibility", "work-in-progress quality", "scrap reduction", "yield improvement", "trench depth control", "interferometry", "integrated etch rate monitor"], "prmu": ["R", "P", "P", "P", "P", "R", "P", "P", "M", "P", "R", "P", "M", "P", "P", "M", "R", "R", "U", "R"]}
+{"id": "1594", "title": "Training multilayer perceptrons via minimization of sum of ridge functions", "abstract": "Motivated by the problem of training multilayer perceptrons in neural networks, we consider the problem of minimizing E(x)= Sigma /sub i=1//sup n/ f/sub i/( xi /sub i/.x), where xi /sub i/ in R/sup S/, 1or= 0} are investigated, where || . ||/sub p/ is the usual vector norm in C/sup n/ resp. R/sup n/, for p epsilon [1, infinity]. 
Moreover, formulae for the first three right derivatives D/sub +//sup k/||s(t)||/sub p/, k = 1, 2, 3 are determined. These formulae are applied to vibration problems by computing the best upper bounds on ||s(t)||/sub p/ in certain classes of bounds. These results cannot be obtained by the methods used so far. The systematic use of the differential calculus for vector norms, as done here for the first time, could lead to major advances also in other branches of mathematics and other sciences", "keyphrases": ["differential calculus", "vector functions", "mapping", "vibration problems", "vector norms"], "prmu": ["P", "P", "P", "P", "P"]}
+{"id": "1511", "title": "Efficient algorithms for stiff elliptic problems with large parameters", "abstract": "We consider a finite element approximation and iteration algorithms for solving stiff elliptic boundary value problems with large parameters in front of a higher derivative. The convergence rate of the algorithms is independent of the spread in coefficients and a discretization parameter", "keyphrases": ["finite element approximation", "iteration algorithms", "stiff elliptic boundary value problems", "large parameters", "higher derivative", "efficient algorithms", "convergence rate"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]}
+{"id": "1863", "title": "Information systems project failure: a comparative study of two countries", "abstract": "Many organizations, regardless of size, engage in at least one, and often many information system projects each year. Many of these projects consume massive amounts of resources, and may cost as little as a few thousand dollars to ten, and even hundreds of millions of dollars. Needless to say, the investment of time and resources into these ventures is of significant concern to chief information officers (CIOs), executive staff members, project managers, and others in leadership positions. 
This paper describes the results of a survey performed between Australia and the United States regarding factors leading to IS project failure. The findings suggest that, among other things, end user involvement and executive management leadership are key indicators influencing IS project failure", "keyphrases": ["information systems project failure", "Australia", "United States", "end user involvement", "executive management leadership"], "prmu": ["P", "P", "P", "P", "P"]} +{"id": "1826", "title": "Modeling shape and topology of low-resolution density maps of biological macromolecules", "abstract": "We develop an efficient way of representing the geometry and topology of volumetric datasets of biological structures from medium to low resolution, aiming at storing and querying them in a database framework. We make use of a new vector quantization algorithm to select the points within the macromolecule that best approximate the probability density function of the original volume data. Connectivity among points is obtained with the use of the alpha shapes theory. 
This novel data representation has a number of interesting characteristics, such as (1) it allows us to automatically segment and quantify a number of important structural features from low-resolution maps, such as cavities and channels, opening the possibility of querying large collections of maps on the basis of these quantitative structural features; (2) it provides a compact representation in terms of size; (3) it contains a subset of three-dimensional points that optimally quantify the densities of medium resolution data; and (4) a general model of the geometry and topology of the macromolecule (as opposite to a spatially unrelated bunch of voxels) is easily obtained by the use of the alpha shapes theory", "keyphrases": ["geometry", "topology", "volumetric datasets", "biological structures", "database framework", "vector quantization algorithm", "low-resolution density maps", "biological macromolecules", "modeling", "probability density function", "data representation", "structural features", "cavities", "channels", "connectivity", "compact representation", "three-dimensional points", "medium resolution data", "general model", "original volume data", "alpha shapes theory"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1748", "title": "On a general constitutive description for the inelastic and failure behavior of fibrous laminates. I. Lamina theory", "abstract": "It is well known that a structural design with isotropic materials can only be accomplished based on a stress failure criterion. This is, however, generally not true with laminated composites. Only when the laminate is subjected to an in-plane load, can the ultimate failure of the laminate correspond to its last-ply failure, and hence a stress failure criterion may be sufficient to detect the maximum load that can be sustained by the laminate. 
Even in such a case, the load shared by each lamina in the laminate cannot be correctly determined if the lamina instantaneous stiffness matrix is inaccurately provided, since the lamina is always statically indeterminate in the laminate. If, however, the laminate is subjected to a lateral load, its ultimate failure occurs before last-ply failure and use of the stress failure criterion is no longer sufficient; an additional critical deflection or curvature condition must also be employed. This necessitates development of an efficient constitutive relationship for laminated composites in order that the laminate strains/deflections up to ultimate failure can be accurately calculated. A general constitutive description for the thermomechanical response of a fibrous laminate up to ultimate failure with applications to various fibrous laminates is presented in the two papers. The constitutive relationship is obtained by combining classical lamination theory with a recently developed bridging micromechanics model, through a layer-by-layer analysis. 
This paper focuses on lamina analysis", "keyphrases": ["general constitutive description", "inelastic behavior", "failure behavior", "fibrous laminates", "lamina theory", "structural design", "isotropic materials", "stress failure criterion", "in-plane load", "instantaneous stiffness matrix", "lateral load", "last-ply failure", "critical deflection condition", "critical curvature condition", "composites", "laminate strains", "laminate deflections", "thermomechanical response", "layer-by-layer analysis", "micromechanics model", "multidirectional tape laminae", "woven fabric composites", "braided fabric composites", "knitted fabric reinforced composites", "elastoplasticity", "elastic-viscoplasticity"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "P", "M", "R", "P", "P", "P", "M", "M", "M", "M", "U", "U"]} +{"id": "1570", "title": "Self-reproduction in three-dimensional reversible cellular space", "abstract": "Due to inevitable power dissipation, it is said that nano-scaled computing devices should perform their computing processes in a reversible manner. This will be a large problem in constructing three-dimensional nano-scaled functional objects. Reversible cellular automata (RCA) are used for modeling physical phenomena such as power dissipation, by studying the dissipation of garbage signals. We construct a three-dimensional self-inspective self-reproducing reversible cellular automaton by extending the two-dimensional version SR/sub 8/. 
It can self-reproduce various patterns in three-dimensional reversible cellular space without dissipating garbage signals", "keyphrases": ["self-reproduction", "nano-scaled computing devices", "power dissipation", "3D self-inspective self-reproducing cellular automata", "reversible cellular automata", "artificial life", "three-dimensional reversible cellular space"], "prmu": ["P", "P", "P", "M", "P", "U", "P"]} +{"id": "1535", "title": "Hot controllers", "abstract": "Over the last few years, the semiconductor industry has put much emphasis on ways to improve the accuracy of thermal mass flow controllers (TMFCs). Although issues involving TMFC mounting orientation and pressure effects have received much attention, little has been done to address the effect of changes in ambient temperature or process gas temperature. Scientists and engineers at Qualiflow have succeeded to solve the problem using a temperature correction algorithm for digital TMFCs. Using an in situ environmental temperature compensation technique, we calculated correction factors for the temperature effect and obtained satisfactory results with both the traditional sensor and the new, improved thin-film sensors", "keyphrases": ["semiconductor manufacturing", "process gas flow", "thermal mass flow controller", "temperature correction algorithm", "in situ environmental temperature compensation"], "prmu": ["M", "R", "P", "P", "P"]} +{"id": "160", "title": "Taming the paper tiger [paperwork organization]", "abstract": "Generally acknowledged as a critical problem for many information professionals, the massive flow of documents, paper trails, and information needs efficient and dependable approaches for processing and storing and finding items and information", "keyphrases": ["paperwork organization", "information professionals", "information processing", "information storage", "information retrieval"], "prmu": ["P", "P", "R", "M", "M"]} +{"id": "1671", "title": "Cane railway scheduling via constraint 
logic programming: labelling order and constraints in a real-life application", "abstract": "In Australia, cane transport is the largest unit cost in the manufacturing of raw sugar, making up around 35% of the total manufacturing costs. Producing efficient schedules for the cane railways can result in significant cost savings. The paper presents a study using constraint logic programming (CLP) to solve the cane transport scheduling problem. Tailored heuristic labelling order and constraints strategies are proposed and encouraging results of application to several test problems and one real-life case are presented. The preliminary results demonstrate that CLP can be used as an effective tool for solving the cane transport scheduling problem, with a potential decrease in development costs of the scheduling system. It can also be used as an efficient tool for rescheduling tasks which the existing cane transport scheduling system cannot perform well", "keyphrases": ["cane railway scheduling", "constraint logic programming", "cane transport", "raw sugar", "total manufacturing costs", "cost savings", "heuristic labelling order", "constraints strategies"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1634", "title": "Maple 8 keeps everyone happy", "abstract": "The author is impressed with the upgrade to the mathematics package Maple 8, finding it genuinely useful to scientists and educators. The developments Waterloo Maple class as revolutionary include a student calculus package, and Maplets. The first provides a high-level command set for calculus exploration and plotting (removing the need to work with, say, plot primitives). The second is a package for hand-coding custom graphical user interfaces (GUIs) using elements such as check boxes, radio buttons, slider bars and pull-down menus. 
When called, a Maplet launches a runtime Java environment that pops up a window, analogous to a Java applet, to perform a programmed routine, if required passing the result back to the Maple worksheet", "keyphrases": ["Maple 8 mathematics package", "student calculus package", "high-level command set", "calculus exploration", "calculus plotting", "GUIs", "Maplet", "runtime Java environment"], "prmu": ["R", "P", "P", "P", "R", "P", "P", "P"]} +{"id": "1729", "title": "Maintaining e-commerce", "abstract": "E-commerce over the Web has created a relatively new type of information system. So it is hardly surprising that little attention has been given to the maintenance of such systems, and even less to attempting to develop them with future maintenance in mind. But there are various ways e-commerce systems can be developed to reduce future maintenance", "keyphrases": ["e-commerce systems maintenance", "Web systems"], "prmu": ["R", "R"]} +{"id": "1847", "title": "Conceptual modeling and specification generation for B2B business processes based on ebXML", "abstract": "In order to support dynamic setup of business processes among independent organizations, a formal standard schema for describing the business processes is basically required. The ebXML framework provides such a specification schema called BPSS (Business Process Specification Schema) which is available in two standalone representations: a UML version, and an XML version. The former, however, is not intended for the direct creation of business process specifications, but for defining specification elements and their relationships required for creating an ebXML-compliant business process specification. For this reason, it is very important to support conceptual modeling that is well organized and directly matched with major modeling concepts. This paper deals with how to represent and manage B2B business processes using UML-compliant diagrams. 
The major challenge is to organize UML diagrams in a natural way that is well suited to the business process meta-model and then to transform the diagrams into an XML version. This paper demonstrates the usefulness of conceptually modeling business processes by prototyping a business process editor tool called ebDesigner", "keyphrases": ["B2B business processes", "ebXML", "conceptual modeling", "specification generation", "formal standard schema", "Business Process Specification Schema", "UML-compliant diagrams", "meta model", "ebDesigner", "business process editor"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "M", "P", "P"]} +{"id": "1802", "title": "Novel TCP congestion control scheme and its performance evaluation", "abstract": "A novel self-tuning proportional and derivative (ST-PD) control based TCP congestion control scheme is proposed. The new scheme approaches the congestion control problem from a control-theoretical perspective and overcomes several important limitations associated with existing TCP congestion control schemes, which are heuristic based. In the proposed scheme, a PD controller is employed to keep the buffer occupancy of the bottleneck node on the connection path at an ideal operating level, and it adjusts the TCP window accordingly. The control gains of the PD controller are tuned online by a fuzzy logic controller based on the perceived bandwidth-delay product of the TCP connection. This scheme gives ST-PD TCP several advantages over current TCP implementations. 
These include rapid response to bandwidth variations, insensitivity to buffer sizes, and significant improvement of TCP throughput over lossy links by decoupling congestion control and error control functions of TCP", "keyphrases": ["TCP congestion control scheme", "performance evaluation", "self-tuning proportional-derivative control", "control-theoretical perspective", "PD controller", "buffer occupancy", "bottleneck node", "connection path", "fuzzy logic controller", "bandwidth-delay product", "lossy links"], "prmu": ["P", "P", "M", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1490", "title": "Client satisfaction in a feasibility study comparing face-to-face interviews with telepsychiatry", "abstract": "We carried out a pilot study comparing satisfaction levels between psychiatric patients seen face to face (FTF) and those seen via videoconference. Patients who consented were randomly assigned to one of two groups. One group received services in person (FTF from the visiting psychiatrist) while the other was seen using videoconferencing at 128 kbit/s. One psychiatrist provided all the FTF and videoconferencing assessment and follow-up visits. A total of 24 subjects were recruited. Three of the subjects (13%) did not attend their appointments and two subjects in each group were lost to follow-up. Thus there were nine in the FTF group and eight in the videoconferencing group. The two groups were similar in most respects. Patient satisfaction with the services was assessed using the Client Satisfaction Questionnaire (CSQ-8), completed four months after the initial consultation. The mean scores were 25.3 in the FTF group and 21.6 in the videoconferencing group. Although there was a trend in favour of the FTF service, the difference was not significant. Patient satisfaction is only one component of evaluation. 
The efficacy of telepsychiatry must also be measured relative to that of conventional, FTF care before policy makers can decide how extensively telepsychiatry should be implemented", "keyphrases": ["client satisfaction", "face-to-face interviews", "telepsychiatry", "psychiatric patient satisfaction", "human factors", "videoconference", "Client Satisfaction Questionnaire", "telemedicine", "128 kbit/s"], "prmu": ["P", "P", "P", "R", "U", "P", "P", "U", "P"]} +{"id": "1791", "title": "The pedagogy of on-line learning: a report from the University of the Highlands and Islands Millennium Institute", "abstract": "Authoritative sources concerned with computer-aided learning, resource-based learning and on-line learning and teaching are generally agreed that, in addition to subject matter expertise and technical support, the quality of the learning materials and the learning experiences of students are critically dependent on the application of pedagogically sound theories of learning and teaching and principles of course design. The University of the Highlands and Islands Project (UHIMI) is developing \"on-line learning\" on a large scale. These developments have been accompanied by a comprehensive programme of staff development. A major emphasis of the programme is concerned with ensuring that course developers and tutors are pedagogically aware. 
This paper reviews (i) what is meant by \"on-line learning\" in the UHIMI context, (ii) the theories of learning and teaching and principles of course design that inform the staff development programme, and (iii) progress to date", "keyphrases": ["online learning", "pedagogy", "computer-aided learning", "resource-based learning", "teaching", "technical support", "educational course design", "distance education", "Internet", "University of the Highlands and Islands Project", "staff development"], "prmu": ["M", "P", "P", "P", "P", "P", "M", "U", "U", "P", "P"]} +{"id": "1887", "title": "Doubly invariant equilibria of linear discrete-time games", "abstract": "The notion of doubly invariant (DI) equilibrium is introduced. The concept extends controlled and robustly controlled invariance notions to the context of two-person dynamic games. Each player tries to keep the state in a region of state space independently of the actions of the rival player. The paper gives existence conditions, criteria and algorithms for the determination of DI equilibria of linear dynamic games in discrete time. Two examples illustrate the results. The first one is in the area of fault-tolerant controller synthesis. The second is an application to macroeconomics", "keyphrases": ["doubly invariant equilibria", "linear discrete-time games", "robustly controlled invariance", "two-person dynamic games", "state space", "existence conditions", "fault-tolerant controller synthesis", "macroeconomics"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1714", "title": "Hordes: a multicast based protocol for anonymity", "abstract": "With widespread acceptance of the Internet as a public medium for communication and information retrieval, there has been rising concern that the personal privacy of users can be eroded by cooperating network entities. A technical solution to maintaining privacy is to provide anonymity. 
We present a protocol for initiator anonymity called Hordes, which uses forwarding mechanisms similar to those used in previous protocols for sending data, but is the first protocol to make use of multicast routing to anonymously receive data. We show this results in shorter transmission latencies and requires less work of the protocol participants, in terms of the messages processed. We also present a comparison of the security and anonymity of Hordes with previous protocols, using the first quantitative definition of anonymity and unlinkability. Our analysis shows that Hordes provides anonymity to a degree similar to that of Crowds and Onion Routing, but also that Hordes has numerous performance advantages", "keyphrases": ["Hordes", "protocol", "Internet", "personal privacy", "cooperating network entities", "initiator anonymity", "forwarding mechanisms", "multicast routing", "transmission latencies", "unlinkability", "Crowds", "Onion Routing", "performance"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1751", "title": "An adaptive time step procedure for a parabolic problem with blow-up", "abstract": "In this paper we introduce and analyze a fully discrete approximation for a parabolic problem with a nonlinear boundary condition which implies that the solutions blow up in finite time. We use standard linear elements with mass lumping for the space variable. For the time discretization we write the problem in an equivalent form which is obtained by introducing an appropriate time re-scaling and then we use explicit Runge-Kutta methods for this equivalent problem. In order to motivate our procedure we present it first in the case of a simple ordinary differential equation and show how the blow-up time is approximated in this case. We obtain necessary and sufficient conditions for the blow-up of the numerical solution and prove that the numerical blow-up time converges to the continuous one. 
We also study, for the explicit Euler approximation, the localization of blow-up points for the numerical scheme", "keyphrases": ["adaptive time step procedure", "parabolic problem", "fully discrete approximation", "nonlinear boundary condition", "standard linear elements", "Runge-Kutta methods", "explicit Euler approximation"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} +{"id": "1609", "title": "Modeling undesirable factors in efficiency evaluation", "abstract": "Data envelopment analysis (DEA) measures the relative efficiency of decision making units (DMUs) with multiple performance factors which are grouped into outputs and inputs. Once the efficient frontier is determined, inefficient DMUs can improve their performance to reach the efficient frontier by either increasing their current output levels or decreasing their current input levels. However, both desirable (good) and undesirable (bad) factors may be present. For example, if inefficiency exists in production processes where final products are manufactured with a production of wastes and pollutants, the outputs of wastes and pollutants are undesirable and should be reduced to improve the performance. Using the classification invariance property, we show that the standard DEA model can be used to improve the performance via increasing the desirable outputs and decreasing the undesirable outputs. The method can also be applied to situations when some inputs need to be increased to improve the performance. 
The linearity and convexity of DEA are preserved through our proposal", "keyphrases": ["data envelopment analysis", "decision making units", "multiple performance factors", "efficient frontier", "current output levels", "current input levels", "production processes", "final product manufacture", "wastes", "pollutants", "classification invariance property", "desirable outputs", "undesirable outputs", "linear programming", "efficiency evaluation", "undesirable factor modeling"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "P", "P", "P", "M", "P", "R"]} +{"id": "1922", "title": "Trends in agent communication language", "abstract": "Agent technology is an exciting and important new way to create complex software systems. Agents blend many of the traditional properties of AI programs - knowledge-level reasoning, flexibility, proactiveness, goal-directedness, and so forth - with insights gained from distributed software engineering, machine learning, negotiation and teamwork theory, and the social sciences. An important part of the agent approach is the principle that agents (like humans) can function more effectively in groups that are characterized by cooperation and division of labor. Agent programs are designed to autonomously collaborate with each other in order to satisfy both their internal goals and the shared external demands generated by virtue of their participation in agent societies. This type of collaboration depends on a sophisticated system of inter-agent communication. The assumption that inter-agent communication is best handled through the explicit use of an agent communication language (ACL) underlies each of the articles in this special issue. 
In this introductory article, we will supply a brief background and introduction to the main topics in agent communication", "keyphrases": ["agent technology", "AI programs", "agent communication language", "inter-agent communication", "agent societies", "KQML", "semantics", "conversations", "distributed software engineering", "machine learning", "negotiation", "teamwork", "social sciences"], "prmu": ["P", "P", "P", "P", "P", "U", "U", "U", "P", "P", "P", "P", "P"]} +{"id": "1508", "title": "Rats, robots, and rescue", "abstract": "In early May, media inquiries started arriving at my office at the Center for Robot-Assisted Search and Rescue (www.crasar.org). Because I'm CRASAR's director, I thought the press was calling to follow up on the recent humanitarian award given to the center's founder, John Blitch, for successfully using small, backpackable robots at the World Trade Center disaster. Instead, I found they were asking me to comment on the \"roborats\" study in the 2 May 2002 Nature. In this study, rats with medial force brain implants underwent operant conditioning to force them into a form of guided behavior, one aspect of which was thought useful for search and rescue. The article's closing comment suggested that a guided rat could serve as both a mobile robot and a biological sensor. Although a roboticist by training, I'm committed to any technology that will help save lives while reducing the risk to rescuers. But rats?", "keyphrases": ["mobile robot", "biological sensor", "guided rat", "robot-assisted search and rescue"], "prmu": ["P", "P", "P", "P"]} \ No newline at end of file