id (string, length 1–4)
title (string, length 13–200)
abstract (string, length 67–2.93k)
keyphrases (sequence)
prmu (sequence)
968
Stabilization of global invariant sets for chaotic systems: an energy based control approach
This paper presents a new control approach for steering trajectories of three-dimensional nonlinear chaotic systems towards stable stationary states or time-periodic orbits. The proposed method consists mainly of a sliding mode-based control design that is extended by an explicit consideration of system energy as a basis for both controller design and system stabilization. The control objective is then to regulate the energy with respect to a shaped nominal representation implicitly related to system trajectories. In this paper, we establish some theoretical results to introduce the control design approach referred to as energy based sliding mode control. Then, some capabilities of the proposed approach are illustrated through examples related to the chaotic circuit of Chua
[ "global invariant sets", "three-dimensional nonlinear chaotic systems", "stable stationary states", "time-periodic orbits", "sliding mode-based control", "energy based sliding mode control", "Chua's circuit" ]
[ "P", "P", "P", "P", "P", "P", "M" ]
1216
Knowledge flow management for distributed team software development
Cognitive cooperation is often neglected in current team software development processes. This issue becomes more important than ever when team members are globally distributed. This paper presents a notion of knowledge flow and the related management mechanism for realizing an ordered knowledge sharing and cognitive cooperation in a geographically distributed team software development process. The knowledge flow can carry and accumulate knowledge as it passes from one team member to another. The coordination between the knowledge flow process and the workflow process of a development team provides a new way to improve traditional team software development processes. A knowledge grid platform has been implemented to support the knowledge flow management across the Internet
[ "knowledge flow management", "distributed team software development", "cognitive cooperation", "ordered knowledge sharing", "workflow process", "knowledge grid platform", "Internet", "knowledge flow representation", "software development management", "cooperative work" ]
[ "P", "P", "P", "P", "P", "P", "P", "M", "R", "M" ]
1253
Application of XML for neural network exchange
This article introduces a framework for the interchange of trained neural network models. An XML-based language (Neural Network Markup Language) for neural network model description is offered. It allows one to write down all the components of a neural network model which are necessary for its reproduction. We propose to use XML notation for the full description of neural models, including the data dictionary, properties of the training sample, preprocessing methods, details of network structure and parameters, and methods for network output interpretation
[ "XML", "neural network exchange", "neural network markup language", "data dictionary", "preprocessing methods", "network structure", "network output interpretation" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
606
Taiwan power company phases into AM/FM
To face the challenges and impact of the inevitable trend toward privatization and deregulation, the Taiwan Power Co. (TPC) devised short- and long-term strategic computerization development plans. These development efforts created a master plan that included building an Automated Mapping and Facilities Management (AM/FM) system for the Taipei City District Office (TCDO). This project included a pilot project followed by evaluation before the rollout to the complete service territory of TCDO. The pilot project took three years to install, commission and, via the evaluation process, reach the conclusion that AM/FM was technologically feasible
[ "Taiwan Power Company", "AM/FM", "privatization", "deregulation", "Automated Mapping and Facilities Management", "Taipei City District Office", "pilot project", "complete service territory" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
79
An efficient and stable ray tracing algorithm for parametric surfaces
In this paper, we propose an efficient and stable algorithm for finding ray-surface intersections. Newton's method and Bezier clipping are adapted to form the core of our algorithm. Ray coherence is used to find starting points for Newton iteration. We introduce an obstruction detection technique to verify whether an intersection point found using Newton's method is the closest. When Newton's method fails to achieve convergence, we use Bezier clipping substitution to find the intersection points. This combination achieves a significant improvement in tracing primary rays. A similar approach successfully improves the performance of tracing secondary rays
[ "parametric surfaces", "ray-surface intersections", "Bezier clipping", "ray coherence", "Newton iteration", "obstruction detection technique", "convergence", "efficient stable ray tracing algorithm", "Newton method", "primary ray tracing", "secondary ray tracing" ]
[ "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "R" ]
643
Time-resolved contrast-enhanced imaging with isotropic resolution and broad coverage using an undersampled 3D projection trajectory
Time-resolved contrast-enhanced 3D MR angiography (MRA) methods have gained in popularity but are still limited by the tradeoff between spatial and temporal resolution. A method is presented that greatly reduces this tradeoff by employing undersampled 3D projection reconstruction trajectories. The variable density k-space sampling intrinsic to this sequence is combined with temporal k-space interpolation to provide time frames as short as 4 s. This time resolution reduces the need for exact contrast timing while also providing dynamic information. Spatial resolution is determined primarily by the projection readout resolution and is thus isotropic across the FOV, which is also isotropic. Although undersampling the outer regions of k-space introduces aliased energy into the image, which may compromise resolution, this is not a limiting factor in high-contrast applications such as MRA. Results from phantom and volunteer studies are presented demonstrating isotropic resolution, broad coverage with an isotropic field of view (FOV), minimal projection reconstruction artifacts, and temporal information. In one application, a single breath-hold exam covering the entire pulmonary vasculature generates high-resolution, isotropic imaging volumes depicting the bolus passage
[ "time-resolved contrast-enhanced imaging", "isotropic resolution", "broad coverage", "undersampled 3D projection trajectory", "variable density k-space sampling", "temporal k-space interpolation", "isotropic field of view", "pulmonary vasculature", "bolus passage", "3D MRI angiography", "abdomen", "thorax", "image artifacts", "breath-hold imaging" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "U", "U", "R", "R" ]
1426
Groove Networks. Matching technology with human needs
"If what has been occurring in information technology during the past decade or so can be classified as the 'Information Age,' then going forward, I believe it's going to be viewed more as the 'connection age,'" says Ray Ozzie, CEO and chairman of Groove Networks, the Beverly, Massachusetts company that produces collaboration software. "We're all going to be thinking more about the connections between people and the connections between companies," Ozzie says. "Our mission has two parts: to help businesses achieve a greater 'return on connection' from their relationships with customers, vendors, and partners; and to help individuals strengthen online connections with the people with whom they interact."
[ "Groove Networks", "businesses", "online connections", "server products", "organizational perspective", "personal perspective", "online collaboration", "knowledge work", "collaborative technologies", "inking technology" ]
[ "P", "P", "P", "U", "U", "U", "R", "U", "R", "M" ]
830
The real story behind Calpoint [telecom]
A former Qwest executive sheds light on the carrier's controversial deal with Calpoint. Discusses why Calpoint gets a monthly check from Qwest, regardless of whether it provides services
[ "Calpoint", "Qwest", "telecom carrier" ]
[ "P", "P", "R" ]
875
Women of color in computing
It is well known that there is a need to increase the number of women in the area of computing, that is, in computer science and computer engineering. If we consider women of color, that is, women of under-represented ethnicities, we find the numbers are very dismal. The goal of this article is to bring to light the unique issues of women of color based upon the personal experience of one African-American woman who has been in the field of computing for over 20 years (including the years of higher education)
[ "women of color", "computer science", "computer engineering", "higher education", "ethnic minority", "society", "gender issues" ]
[ "P", "P", "P", "P", "M", "U", "M" ]
888
Storage functionals and Lyapunov functions for passive dynamical systems
For nonlinear time-invariant input-output dynamical systems the passivity conditions are obtained under some restrictions. The conditions imply storage functions satisfying a dissipation inequality. A class of storage functions allowing unique reconstruction of a passive dynamical system is defined. These results are illustrated by an example of a linear system with fading memory. A class of linear relaxation systems without direct input-output interaction, important for practical applications, is considered. A necessary condition for dynamical systems to be of the relaxation type is obtained for this class. The condition is connected with the existence of a unique quadratic Lyapunov function satisfying the complete monotonicity condition. This unique Lyapunov function corresponds to a "standard" thermodynamic potential in a compact family of potentials in nonequilibrium thermodynamics. The results obtained can be useful in automatic control, the mechanics of viscoelastic materials, and various applications in physics and system theory
[ "storage functionals", "passive dynamical systems", "nonlinear time-invariant input-output dynamical systems", "passivity conditions", "dissipation inequality", "linear system", "fading memory", "necessary condition", "unique quadratic Lyapunov function", "complete monotonicity condition", "thermodynamic potential", "nonequilibrium thermodynamics", "automatic control", "mechanics", "viscoelastic materials" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1072
Quantum-state information retrieval in a Rydberg-atom data register
We analyze a quantum search protocol to retrieve phase information from a Rydberg-atom data register using a subpicosecond half-cycle electric field pulse. Calculations show that the half-cycle pulse can perform the phase retrieval only within a range of peak field values. By varying the phases of the constituent orbitals of the Rydberg wave packet register, we demonstrate coherent control of the phase retrieval process. By specially programming the phases of the orbitals comprising the initial wave packet, we show that it is possible to use the search method as a way to synthesize single energy eigenstates
[ "quantum-state information retrieval", "Rydberg-atom data register", "quantum search protocol", "phase information", "subpicosecond half-cycle electric field pulse", "half-cycle pulse", "phase retrieval", "peak field values", "constituent orbitals", "Rydberg wave packet register", "coherent control", "initial wave packet", "search method", "single energy eigenstates" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1037
A stochastic averaging approach for feedback control design of nonlinear systems under random excitations
This paper presents a method for designing and quantifying the performance of feedback stochastic controls for nonlinear systems. The design makes use of the method of stochastic averaging to reduce the dimension of the state space and to derive the Ito stochastic differential equation for the response amplitude process. The moment equation of the amplitude process closed by the Rayleigh approximation is used as a means to characterize the transient performance of the feedback control. The steady state and transient response of the amplitude process are used as the design criteria for choosing the feedback control gains. Numerical examples are studied to demonstrate the performance of the control
[ "stochastic averaging", "feedback control", "nonlinear systems", "random excitations", "feedback stochastic controls", "Ito stochastic differential equation", "Rayleigh approximation", "steady state", "transient response" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
848
Women in computing around the world
This paper describes the participation of women in computing in more than 30 countries, by focussing on participation at undergraduate level. A brief discussion covers how societal and cultural factors may affect women's participation. Statistics from many different sources are presented for comparison. Generally, participation is low: most countries fall in the 10-40% range, with a few below 10% and a few above 40%
[ "women", "cultural factors", "statistics", "undergraduate computing", "societal factors" ]
[ "P", "P", "P", "R", "R" ]
763
A quantum full adder for a scalable nuclear spin quantum computer
We demonstrate a strategy for implementing a quantum full adder in a spin chain quantum computer. As an example, we simulate a quantum full adder in a chain containing 201 spins. Our simulations also demonstrate how one can minimize errors generated by non-resonant effects
[ "quantum full adder", "scalable nuclear spin quantum computer", "nonresonant effects", "error minimization" ]
[ "P", "P", "M", "R" ]
726
New wrinkle on the Web? Hmm. [banking]
The financial sector produced its share of technology hype during the new economy years. You can't blame folks if the next next thing, a wave of Internet-related innovation called Web services, is being met with healthy skepticism. Many gurus are placing their bets on Web services to drive the next chapter of finance technology, dramatically upgrading disappointing automated customer management strategies by electronically breaking down barriers between products, firms and customers, and perhaps creating a whole new line of business in the process. But it's not a magic wand. It doesn't change the need for a bank to reorganize and streamline its operations
[ "bank", "Web services" ]
[ "P", "P" ]
1373
Agent-based product-support logistics system using XML and RDF
The capability of the timely provision of maintenance services and service parts is critical to the competitiveness of industrial systems. To enhance the timely operations in a product-support logistics chain, business partners (equipment manufacturers, parts distributors, customers) may have to collaborate for the efficient exchange of relevant information. We propose the architecture of an agent-based product-support logistics system. Emphasis is placed on the problems of sharing and exchanging information through agent communication. We adopt the Resource Description Framework (RDF) schema for information modelling in product-support logistics domain. The eXtensible Markup Language (XML) serialization generates messages for agent communication. The use of XML and RDF enables software agents to understand the contents of messages correctly and consistently. We demonstrate the feasibility of our agent architecture using a scenario in logistical support processes. We believe that the approach can provide a promising way to the automation of business processes in product-support logistics through seamless communication among the partners
[ "agent-based product-support logistics system", "XML", "RDF", "maintenance services", "service parts", "industrial systems", "information modelling", "eXtensible Markup Language", "software agents", "Resource Description Framework schema" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
1336
On abelian branched coverings of the sphere
We obtain an enumeration formula for the number of weak equivalence classes of the branched (A * B)-covering of the sphere with m branch points, when A and B are finite abelian groups with (|A|, |B|) = 1. From this, we can deduce an explicit formula for enumerating the weak equivalence classes of pseudofree spherical (Zp * Zq)-actions on a given surface, when p and q are distinct primes
[ "Abelian branched coverings", "enumeration formula", "weak equivalence classes", "finite abelian groups", "explicit formula", "pseudofree spherical" ]
[ "P", "P", "P", "P", "P", "P" ]
116
Frontier between separability and quantum entanglement in a many spin system
We discuss the critical point x_c separating the quantum entangled and separable states in two series of N spins S in the simple mixed state characterized by the matrix operator rho = x|phi><phi| + ((1-x)/D^N) I_{D^N}, where x in [0, 1], D = 2S + 1, I_{D^N} is the D^N * D^N unity matrix and |phi> is a special entangled state. The cases x = 0 and x = 1 correspond respectively to fully random spins and to a fully entangled state. In the first of these series we consider special states |phi> invariant under charge conjugation, which generalize the N = 2, spin S = 1/2 Einstein-Podolsky-Rosen state, and in the second we consider generalizations of the Werner (1989) density matrices. The evaluation of the critical point x_c was done through bounds coming from the partial transposition method of Peres (1996) and the conditional nonextensive entropy criterion. Our results suggest the conjecture that whenever the bounds coming from both methods coincide, the resulting x_c is exact. The results we present are relevant for the discussion of quantum computing, teleportation and cryptography
[ "separability", "quantum entanglement", "many spin system", "critical point", "separable states", "matrix operator", "unity matrix", "entangled state", "random spin", "charge conjugation", "Einstein-Podolsky-Rosen state", "partial transposition method", "nonextensive entropy criterion", "quantum computing", "teleportation", "cryptography", "Werner density matrices" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
955
From the DOS dog days to e-filing [law firms]
The poster child for a successful e-filing venture is the Case Management and Electronic Case File system now rolling through the district and bankruptcy courts. A project of the Administrative Office of the United States Courts, CM/ECF is a loud proponent of the benefits of the PDF approach and it has a full head of steam. Present plans are for all federal courts to implement CM/ECF by 2005. That means a radical shift in methodology and tools for a lot of lawyers. It also means that you should get cozy with Acrobat real soon
[ "e-filing", "Case Management and Electronic Case File system", "United States Courts", "PDF", "Adobe Acrobat" ]
[ "P", "P", "P", "P", "M" ]
910
Control of a heavy-duty robotic excavator using time delay control with integral sliding surface
The control of a robotic excavator is difficult from the standpoint of the following problems: parameter variations in mechanical structures, various nonlinearities in hydraulic actuators, and disturbance due to contact with the ground. In addition, the larger a robotic excavator becomes, the greater the length and mass of its links, and the more the parameters of a heavy-duty excavator vary. A time-delay control with switching action (TDCSA) using an integral sliding surface is proposed in this paper for the control of a 21-ton robotic excavator. Through analysis and experiments, we show that using an integral sliding surface for the switching action of TDCSA is better than using a PD-type sliding surface. The proposed controller is applied to straight-line motions of a 21-ton robotic excavator at a speed level at which skillful operators work. Experiments, which were designed for surfaces with various inclinations and over broad ranges of joint motions, show that the proposed controller exhibits good performance
[ "robotic excavator", "integral sliding surface", "time-delay control", "robust control", "motion control", "trajectory control", "dynamics", "tracking", "pressure control" ]
[ "P", "P", "P", "M", "R", "M", "U", "U", "M" ]
582
Optimal estimation of a finite sample of a discrete chaotic process
The synthesis of optimal algorithms for estimating discrete chaotic processes specified by a finite sample is considered; various possible approaches are discussed. Expressions determining the potential accuracy in estimating a single value of the chaotic process are derived. An example of the application of the general equations obtained is given
[ "optimal estimation", "finite sample", "discrete chaotic process", "optimal algorithm synthesis", "space-time filtering" ]
[ "P", "P", "P", "R", "U" ]
1192
Construction of two-sided bounds for initial-boundary value problems
This paper extends the bounding operator approach developed for boundary value problems to the case of initial-boundary value problems (IBVPs). Following the general principle of bounding operators, enclosing methods for partial differential equations are discussed. In particular, continuous discretization methods with an appropriate error-bound-controlled shift, as well as monotone extensions of Rothe's method for parabolic problems, are investigated
[ "two-sided bounds", "initial-boundary value problems", "bounding operator approach", "bounding operators", "partial differential equations", "parabolic problems" ]
[ "P", "P", "P", "P", "P", "P" ]
683
Knowledge management
The article defines knowledge management, discusses its role, and describes its functions. It also explains the principles of knowledge management, enumerates the strategies involved in knowledge management, and traces its history in brief. The focus is on its interdisciplinary nature. The steps involved in knowledge management i.e. identifying, collecting and capturing, selecting, organizing and storing, sharing, applying, and creating, are explained. The pattern of knowledge management initiatives is also considered
[ "knowledge management" ]
[ "P" ]
1293
Truss topology optimization by a modified genetic algorithm
This paper describes the use of a stochastic search procedure based on genetic algorithms for developing near-optimal topologies of load-bearing truss structures. In most existing publications, the truss topology is expressed as a combination of members. These methods, however, have the disadvantage that the resulting topology may include needless members or members which overlap others. In addition to these problems, the generated structures are not necessarily structurally stable. A new method, which resolves these problems by expressing the truss topology as a combination of triangles, is proposed in this paper. Details of the proposed methodology are presented, as well as results of numerical examples that clearly show the effectiveness and efficiency of the method
[ "truss topology optimization", "modified genetic algorithm", "stochastic search procedure", "near-optimal topologies", "load-bearing truss structures", "triangles" ]
[ "P", "P", "P", "P", "P", "P" ]
1422
Taxonomy's role in content management
A taxonomy is simply a way of classifying things. Still, there is a rapidly growing list of vendors offering taxonomy software and related applications. They promise many benefits, especially to enterprise customers: Content management will be more efficient. Corporate portals will be enhanced by easily created Yahoo!-like directories of internal information. And the end-user experience will be dramatically improved by more successful content retrieval and more effective knowledge discovery. But today's taxonomy products represent emerging technologies. They are not out-of-the-box solutions. And even the most automated systems require some manual assistance from people who know how to classify content
[ "content management", "taxonomy software", "enterprise customers", "corporate portals", "internal information", "effective knowledge discovery", "taxonomy applications" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
834
Commerce Department plan eases 3G spectrum crunch
The federal government made its first move last week toward cleaning up a spectrum allocation system that was in shambles just a year ago and had some spectrum-starved wireless carriers fearing they wouldn't be able to compete in third-generation services. The move, however, is far from complete and leaves numerous details unsettled
[ "3G spectrum", "federal government", "spectrum allocation system", "wireless carriers" ]
[ "P", "P", "P", "P" ]
871
Priming the pipeline [women in computer science careers]
In 1997 The Backyard Project, a pilot program of the Garnett Foundation, was instituted to encourage high school girls to explore careers in the computer industry. At that time, the Garnett Foundation commissioned the Global Strategy Group to execute a survey of 652 college-bound high school students (grades 9 through 12), to help discover directions that The Backyard Project might take to try to move toward the mission of the pilot program. It conducted the study by telephone between March 25 and April 8, 1997 in the Silicon Valley, Boston, and Austin metropolitan areas. It conducted all interviews using a random digit dialing methodology, derived from a file of American households with high incidences of adolescent children. The top six answers from girls to the survey question "why are girls less likely to pursue computer science careers?" in order of perceived importance by the girls were: not enough role models; women have other interests; didn't know about the industry; limited opportunity; negative media; and too nerdy. These responses are discussed
[ "The Backyard Project", "high school girls", "college-bound high school students", "computer industry careers" ]
[ "P", "P", "P", "R" ]
929
Closed loop finite-element modeling of active constrained layer damping in the time domain analysis
A three-dimensional finite-element closed-loop model has been developed to predict the effects of active-passive damping on a vibrating structure. The Golla-Hughes-McTavish method is employed to capture the viscoelastic material behavior in a time domain analysis. The parametric study includes the different control gains as well as geometric parameters related to the active constrained layer damping (ACLD) treatment. Comparisons are made among several ACLD models, the passive constrained model and the active damping model. The results obtained here reiterate that ACLD is somewhat better for vibration suppression than either the purely passive or the active system and provides higher structural damping with less control gain when compared to the purely active system. Since the ACLD performance can be reduced by the viscoelastic layer, the design of the ACLD model must be given a careful consideration in order to optimize the effect of passive damping
[ "active constrained layer damping", "time domain analysis", "three-dimensional finite-element closed-loop model", "Golla-Hughes-McTavish method", "viscoelastic material", "ACLD models", "passive constrained model", "active damping model", "vibration suppression", "structural damping", "viscoelastic layer", "passive damping" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1212
TCRM: diagnosing tuple inconsistency for granulized datasets
Many approaches to granularization have been presented for knowledge discovery. However, the inconsistent tuples that exist in granulized datasets are hardly ever revealed. We developed a model, the tuple consistency recognition model (TCRM), to help efficiently detect inconsistent tuples in datasets that are granulized. The main outputs of the developed model include the explored inconsistent tuples and the consumed processing time. We further conducted an empirical test in which eighteen continuous real-life datasets, granulized by the equal width interval technique embedding the S-plus histogram binning algorithm (SHBA) and the largest binning size algorithm (LBSA), were diagnosed. Remarkable results: almost 40% of the granulized datasets contain inconsistent tuples, and 22% have a proportion of inconsistent tuples exceeding 20%
[ "TCRM", "tuple inconsistency", "granulized datasets", "granularization", "knowledge discovery", "tuple consistency recognition model", "processing time", "equal width interval technique", "S-plus histogram binning algorithm", "largest binning size algorithm", "relational database", "large database", "SQL" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U", "U", "U" ]
1257
Definition of a similarity measure between cases based on auto/cross-fuzzy thesauri
A similarity measure between cases is needed in order to evaluate the degree of similarity when using past similar cases in order to resolve current problems. In similar case retrieval, multiple indices are set up in order to characterize the queries and individual cases, then terms are given as values to each. The similarity measure between cases commonly used is defined using the rate at which the values provided from the corresponding indices match. In practice, however, values cannot be expected to be mutually exclusive. As a result, a natural expansion of this approach is to have relationships in which mutually similar meanings are reflected in the similarity measure between cases. In this paper the authors consider an auto-fuzzy thesaurus which gives the relationship for values between corresponding indices and a cross-fuzzy thesaurus which gives the relationship for values between mutually distinct indices, and then define a similarity measure between cases which considers the relationship of index values based on these thesauri. This definition satisfies the characteristics required for the operation of case-based retrieval even when one value is not necessarily given in the index. Finally, using a test similar case retrieval system, the authors perform a comparative analysis of the proposed similarity measure between cases and a conventional approach
[ "similar case retrieval", "corresponding indices", "auto-fuzzy thesaurus", "cross-fuzzy thesaurus", "mutually distinct indices", "case-based retrieval", "case similarity measure", "relationship indices", "decision making support system", "problem solving" ]
[ "P", "P", "P", "P", "P", "P", "R", "R", "M", "M" ]
602
Image fusion between (18)FDG-PET and MRI/CT for radiotherapy planning of oropharyngeal and nasopharyngeal carcinomas
Accurate diagnosis of tumor extent is important in three-dimensional conformal radiotherapy. This study reports the use of image fusion between (18)F-fluoro-2-deoxy-D-glucose positron emission tomography ((18)FDG-PET) and magnetic resonance imaging/computed tomography (MRI/CT) for better target delineation in radiotherapy planning of head-and-neck cancers. The subjects consisted of 12 patients with oropharyngeal carcinoma and 9 patients with nasopharyngeal carcinoma (NPC) who were treated with radical radiotherapy between July 1999 and February 2001. Image fusion between (18)FDG-PET and MRI/CT was performed using an automatic multimodality image registration algorithm, which used the brain as an internal reference for registration. Gross tumor volume (GTV) was determined based on clinical examination and (18)FDG uptake on the fusion images. Clinical target volume (CTV) was determined following the usual pattern of lymph node spread for each disease entity along with the clinical presentation of each patient. Except for 3 cases with superficial tumors, all the other primary tumors were detected by (18)FDG-PET. The GTV volumes for primary tumors were not changed by image fusion in 19 cases (89%), increased by 49% in one NPC, and decreased by 45% in another NPC. Normal tissue sparing was more easily performed based on clearer GTV and CTV determination on the fusion images. In particular, parotid sparing became possible in 15 patients (71%) whose upper neck areas near the parotid glands were tumor-free by (18)FDG-PET. Within a mean follow-up period of 18 months, no recurrence occurred in the areas defined as CTV, which was treated prophylactically, except for 1 patient who experienced nodal recurrence in the CTV and simultaneous primary site recurrence. In conclusion, this preliminary study showed that image fusion between (18)FDG-PET and MRI/CT was useful in GTV and CTV determination in conformal RT, thus sparing normal tissues
[ "image fusion", "/sup 18/FDG-PET", "MRI/CT", "radiotherapy planning", "nasopharyngeal carcinomas", "oropharyngeal carcinomas", "superficial tumors", "primary tumors", "normal tissues sparing", "parotid glands", "simultaneous primary site recurrence", "F" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U" ]
647
Experimental design methodology and data analysis technique applied to optimise an organic synthesis
The study was aimed at maximising the yield of a Michaelis-Becker dibromoalkane monophosphorylation reaction. In order to save time and money, we first applied a full factorial experimental design to search for the optimum conditions while performing a small number of experiments. We then used the principal component analysis (PCA) technique to evidence two uncontrolled factors. Lastly, a special experimental design that took into account all the influential factors allowed us to determine the maximum-yield experimental conditions. This study also evidenced the complementary nature of experimental design methodology and data analysis techniques
[ "data analysis technique", "organic synthesis", "Michaelis-Becker dibromoalkane monophosphorylation reaction", "full factorial experimental design", "optimum conditions", "principal component analysis", "uncontrolled factors", "maximum-yield experimental conditions" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
80
Evaluating the performance of a distributed database of repetitive elements in complete genomes
The original version of the Repeat Sequence Database (RSDB) was created based on centralized database systems (CDBSs). RSDB presently includes an enormous amount of data, with the amount of biological data increasing rapidly. Distributed RSDB (DRSDB) is developed to yield better performance. This study proposes many approaches to data distribution and experimentally determines the best approach to obtain good performance of our database. Experimental results indicate that DRSDB performs well for particular types of query
[ "repetitive elements", "complete genomes", "biological data", "data distribution", "queries", "distributed Repeat Sequence Database", "performance evaluation" ]
[ "P", "P", "P", "P", "P", "R", "R" ]
1113
Word spotting based on a posterior measure of keyword confidence
In this paper, an approach to keyword confidence estimation is developed that effectively combines acoustic layer scores and syllable-based statistical language model (LM) scores. An a posteriori (AP) confidence measure and its forward-backward calculating algorithm are deduced. A zero false alarm (ZFA) assumption is proposed for evaluating relative confidence measures in a word spotting task. In a word spotting experiment with a vocabulary of 240 keywords, the keyword accuracy under the AP measure is above 94%, which closely approaches its theoretical upper limit. In addition, a syllable lattice Hidden Markov Model (SLHMM) is formulated and a unified view of confidence estimation, word spotting, optimal path search, and N-best syllable re-scoring is presented. The proposed AP measure can be easily applied to various speech recognition systems as well
[ "word spotting", "a posterior measure", "keyword confidence", "confidence estimation", "acoustic layer scores", "forward-backward calculating algorithm", "relative confidence measures", "word spotting task", "syllable lattice hidden Markov model", "optimal path search", "N-best syllable re-scoring", "speech recognition systems", "syllable-based statistical language model scores", "a posteriori confidence measure", "zero false alarm assumption" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
1156
Favorable noise uniformity properties of Fourier-based interpolation and reconstruction approaches in single-slice helical computed tomography
Volumes reconstructed by standard methods from single-slice helical computed tomography (CT) data have been shown to have noise levels that are highly nonuniform relative to those in conventional CT. These noise nonuniformities can affect low-contrast object detectability and have also been identified as the cause of the zebra artifacts that plague maximum intensity projection (MIP) images of such volumes. While these spatially variant noise levels have their root in the peculiarities of the helical scan geometry, there is also a strong dependence on the interpolation and reconstruction algorithms employed. In this paper, we seek to develop image reconstruction strategies that eliminate or reduce, at its source, the nonuniformity of noise levels in helical CT relative to that in conventional CT. We pursue two approaches, independently and in concert. We argue, and verify, that Fourier-based longitudinal interpolation approaches lead to more uniform noise ratios than do the standard 360LI and 180LI approaches. We also demonstrate that a Fourier-based fan-to-parallel rebinning algorithm, used as an alternative to fanbeam filtered backprojection for slice reconstruction, also leads to more uniform noise ratios, even when making use of the 180LI and 360LI interpolation approaches
[ "noise uniformity properties", "Fourier-based interpolation", "reconstruction approaches", "single-slice helical computed tomography", "conventional CT", "low-contrast object detectability", "zebra artifacts", "more uniform noise ratios", "Fourier-based fan-to-parallel rebinning algorithm", "medical diagnostic imaging", "maximum intensity projection images", "helical span geometry" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "R", "M" ]
991
Estimation of blocking probabilities in cellular networks with dynamic channel assignment
Blocking probabilities in cellular mobile communication networks using dynamic channel assignment are hard to compute for realistic sized systems. This computational difficulty is due to the structure of the state space, which imposes strong coupling constraints amongst components of the occupancy vector. Approximate tractable models have been proposed, which have product form stationary state distributions. However, for real channel assignment schemes, the product form is a poor approximation and it is necessary to simulate the actual occupancy process in order to estimate the blocking probabilities. Meaningful estimates of the blocking probability typically require an enormous amount of CPU time for simulation, since blocking events are usually rare. Advanced simulation approaches use importance sampling (IS) to overcome this problem. We study two regimes under which blocking is a rare event: low-load and high cell capacity. Our simulations use the standard clock (SC) method. For low load, we propose a change of measure that we call static ISSC, which has bounded relative error. For high capacity, we use a change of measure that depends on the current state of the network occupancy. This is the dynamic ISSC method. We prove that this method yields zero variance estimators for single clique models, and we empirically show the advantages of this method over naive simulation for networks of moderate size and traffic loads
[ "dynamic channel assignment", "cellular mobile communication networks", "strong coupling constraints", "occupancy vector", "approximate tractable models", "product form stationary state distributions", "simulation", "CPU time", "importance sampling", "low-load", "high cell capacity", "bounded relative error", "dynamic ISSC method", "zero variance estimators", "single clique models", "blocking probability estimation", "standard clock method", "static ISSC method", "quality of service", "network traffic load" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "M", "R" ]
546
Real-time quasi-2-D inversion of array resistivity logging data using neural network
We present a quasi-2-D real-time inversion algorithm for a modern galvanic array tool via dimensional reduction and neural network simulation. Using reciprocity and superposition, we apply a numerical focusing technique to the unfocused data. The numerically focused data are much less subject to 2-D and layering effects and can be approximated as from a cylindrical 1-D Earth. We then perform 1-D inversion on the focused data to provide approximate information about the 2-D resistivity structure. A neural network is used to perform forward modeling in the 1-D inversion, which is several hundred times faster than conventional numerical forward solutions. Testing our inversion algorithm on both synthetic and field data shows that this fast inversion algorithm is useful for providing formation resistivity information at a well site
[ "real-time quasi-2-D inversion", "array resistivity logging data", "neural network", "real-time inversion algorithm", "galvanic array tool", "dimensional reduction", "reciprocity", "superposition", "numerical focusing technique", "unfocused data", "focused data", "1-D inversion", "forward modeling", "formation resistivity", "well site" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
687
Image reconstruction of simulated specimens using convolution back projection
This paper reports the reconstruction of cross-sections of composite structures. The convolution back projection (CBP) algorithm has been used to capture the attenuation field over the specimen. Five different test cases have been taken up for evaluation. These cases represent varying degrees of complexity. In addition, the role of filters on the nature of the reconstruction errors has also been discussed. Numerical results obtained in the study reveal that CBP algorithm is a useful tool for qualitative as well as quantitative assessment of composite regions encountered in engineering applications
[ "image reconstruction", "simulated specimens", "convolution back projection", "composite structures", "attenuation field", "filters", "reconstruction errors", "CBP algorithm", "composite regions", "engineering applications", "computerised tomography" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U" ]
1297
Stochastic optimization of acoustic response - a numerical and experimental comparison
The objective of the work presented is to compare results from numerical optimization with experimental data and to highlight and discuss the differences between two fundamentally different optimization methods. The problem domain is minimization of acoustic emission and the structure used in the work is a closed cylinder with forced vibration of one end. The optimization method used in this paper is simulated annealing (SA), a stochastic method. The results are compared with those from a gradient-based method used on the same structure in an earlier paper (Tinnsten, 2000)
[ "stochastic optimization", "acoustic response", "numerical optimization", "structure", "closed cylinder", "forced vibration", "simulated annealing", "gradient-based method", "acoustic emission minimization" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
112
Revisiting Hardy's paradox: Counterfactual statements, real measurements, entanglement and weak values
Hardy's (1992) paradox is revisited. Usually the paradox is dismissed on grounds of counterfactuality, i.e., because the paradoxical effects appear only when one considers results of experiments which do not actually take place. We suggest a new set of measurements in connection with Hardy's scheme, and show that when they are actually performed, they yield strange and surprising outcomes. More generally, we claim that counterfactual paradoxes point to a deeper structure inherent to quantum mechanics
[ "counterfactual statements", "real measurements", "entanglement", "weak values", "paradoxical effects", "quantum mechanics", "Hardy paradox", "gedanken-experiments" ]
[ "P", "P", "P", "P", "P", "P", "R", "U" ]
951
How to drive strategic innovation [law firms]
Innovation. It has everything to do with organization and attitude. Marginal improvement isn't enough anymore. Convert your problem-solving skills into a new value for the entire firm. 10 initiatives
[ "strategic innovation", "law firms", "management", "change", "clients", "experiments" ]
[ "P", "P", "U", "U", "U", "U" ]
914
A knowledge management framework for the support of decision making in humanitarian assistance/disaster relief
The major challenge in current humanitarian assistance/disaster relief (HA/DR) efforts is that diverse information and knowledge are widely distributed and owned by different organizations. These resources are not efficiently organized and utilized during HA/DR operations. We present a knowledge management framework that integrates multiple information technologies to collect, analyze, and manage information and knowledge for supporting decision making in HA/DR. The framework will help identify the information needs, be aware of a disaster situation, and provide decision-makers with useful relief recommendations based on past experience. A comprehensive, consistent and authoritative knowledge base within the framework will facilitate knowledge sharing and reuse. This framework can also be applied to other similar real-time decision-making environments, such as crisis management and emergency medical assistance
[ "knowledge management framework", "humanitarian assistance", "disaster relief", "organizations", "information technology", "information needs", "knowledge sharing", "real-time decision-making environments", "crisis management", "emergency medical assistance", "decision support system", "knowledge reuse", "case-based reasoning" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "R", "U" ]
586
A strategy for a payoff-switching differential game based on fuzzy reasoning
In this paper, a new concept of a payoff-switching differential game is introduced. In this new game, any one player at any time may have several choices of payoffs for the future. Moreover, the payoff-switching process, including the time of payoff switching and the outcome payoff, of any one player is unknown to the other. Indeed, the overall payoff, which is a sequence of several payoffs, is unknown until the game ends. An algorithm for determining a reasoning strategy based on fuzzy reasoning is proposed. In this algorithm, fuzzy theory is used to estimate the behavior of one player during a past time interval. By deriving two fuzzy matrices GSM, game similarity matrix, and VGSM, variation of GSM, the behavior of the player can be quantified. Two weighting vectors are selected to weight the relative importance of the player's behavior at each past time instant. Finally a simple fuzzy inference rule is adopted to generate a linear reasoning strategy. The advantage of this algorithm is that it provides a flexible way for differential game specialists to convert their knowledge into a "reasonable" strategy. A practical example of guarding three territories is given to illustrate our main ideas
[ "payoff-switching differential game", "differential game", "fuzzy reasoning", "payoff switching", "outcome payoff", "reasoning strategy", "fuzzy matrices", "game similarity matrix", "weighting vectors", "fuzzy inference" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1196
Multiple shooting using a dichotomically stable integrator for solving differential-algebraic equations
In previous work by the first author, it has been established that a dichotomically stable discretization is needed when solving a stiff boundary-value problem in ordinary differential equations (ODEs), when sharp boundary layers may occur at each end of the interval. A dichotomically stable implicit Runge-Kutta method, using the 3-stage, fourth-order, Lobatto IIIA formulae, has been implemented in a variable step-size initial-value integrator, which could be used in a multiple-shooting approach. In the case of index-one differential-algebraic equations (DAEs) the use of the Lobatto IIIA formulae has an advantage, over a comparable Gaussian method, that the order is the same for both differential and algebraic variables, and there is no need to treat them separately. The ODE integrator has been adapted for the solution of index-one DAEs, and the resulting integrator (SYMDAE) has been inserted into the multiple-shooting code (MSHDAE) previously developed by R. Lamour for differential-algebraic boundary-value problems. The standard version of MSHDAE uses a BDF integrator, which is not dichotomically stable, and for some stiff test problems this fails to integrate across the interval of interest, while the dichotomically stable integrator SYMDAE encounters no difficulty. Indeed, for such problems, the modified version of MSHDAE produces an accurate solution, and within limits imposed by computer word length, the efficiency of the solution process improves with increasing stiffness. For some nonstiff problems, the solution is also entirely satisfactory
[ "multiple shooting", "dichotomically stable integrator", "differential-algebraic equations", "stiff boundary-value problem", "ordinary differential equations", "implicit Runge-Kutta method", "Lobatto IIIA formulae", "initial-value integrator" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
809
Edison's direct current influenced "Broadway" show lighting
During the early decades of the 20th century, midtown Manhattan in New York City developed an extensive underground direct current (DC) power distribution system. This was a result of the original introduction of direct current by Thomas Edison's pioneering Pearl Street Station in 1882. The availability of DC power in the theater district led to the perpetuation of an archaic form of stage lighting control through nearly three-quarters of the 20th century. This control device was known as a "resistance dimmer." It was essentially a series-connected rheostat, but it was wound with a special resistance "taper" so as to provide a uniform change in the apparent light output of typical incandescent lamps throughout the travel of its manually operated arm. The development and use of DC powered stage lighting is discussed in this article
[ "Manhattan", "New York City", "theater district", "stage lighting control", "resistance dimmer", "series-connected rheostat", "apparent light output", "incandescent lamps", "DC powered stage lighting", "Broadway show lighting", "underground direct current power distribution system", "Thomas Edison's Pearl Street Station", "resistance taper" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "R" ]
767
Quantum computation for physical modeling
One of the most famous American physicists of the twentieth century, Richard Feynman, in 1982 was the first to propose using a quantum mechanical computing device to efficiently simulate quantum mechanical many-body dynamics, a task that is exponentially complex in the number of particles treated and is completely intractable by any classical computing means for large systems of many particles. In the two decades following his work, remarkable progress has been made both theoretically and experimentally in the new field of quantum computation
[ "quantum computation", "physical modeling", "quantum mechanical computing", "quantum mechanical many-body dynamics" ]
[ "P", "P", "P", "P" ]
722
Updating systems for monitoring and controlling power equipment on the basis of the firmware system SARGON
The economic difficulties experienced by the power industry of Russia have considerably retarded the speed of commissioning new capacities and reconstructing equipment in service. The increasing deterioration of the equipment at power stations makes the problem of its updating very acute. The main efforts of organizations working in the power industry are now focused on updating all kinds of equipment installed at power installations. The necessary condition for the efficient operation of power equipment is to carry out serious modernization of systems for monitoring and control (SMC) of technological processes. The specialists at ZAO NVT-Avtomatika have developed efficient technology for updating the SMC on the basis of the firmware system SARGON which ensures the fast introduction of high-quality systems of automation with a minimal payback time of the capital outlay. This paper discusses the updating of equipment using SARGON
[ "power industry", "Russia", "ZAO NVT-Avtomatika", "SARGON firmware system", "monitoring systems", "control systems", "power equipment monitoring", "power equipment control" ]
[ "P", "P", "P", "R", "R", "R", "R", "R" ]
1377
Open hypermedia for product support
As industrial systems become increasingly more complex, the maintenance and operating information increases both in volume and complexity. With the current pressures on manufacturing, the management of information resources has become a critical issue. In particular, ensuring that personnel can access current information quickly and effectively when undertaking a specific task. This paper discusses some of the issues involved in, and the benefits of using, open hypermedia to manage and deliver a diverse range of information. While the paper concentrates on the problems specifically associated with manufacturing organizations, the problems are generic across other business sectors such as healthcare, defence and finance. The open hypermedia approach to information management and delivery allows a multimedia resource base to be used for a range of applications and it permits a user to have controlled access to the required information in an easily accessible and structured manner. Recent advancement in hypermedia also permits just-in-time support in the most appropriate format for all users. Our approach is illustrated by the discussion of a case study in which an open hypermedia system delivers maintenance and process information to factory-floor users to support the maintenance and operation of a very large manufacturing cell
[ "open hypermedia", "product support", "maintenance", "operating information", "information resources", "just-in-time support" ]
[ "P", "P", "P", "P", "P", "P" ]
1332
Personal cards for on-line purchases
Buying presents over the Web has advantages for a busy person: lots of choices, 24-hour accessibility, quick delivery, and you don't even have to wrap the gift. But many people like to select a card or write a personal note to go with their presents, and the options for doing that have been limited. Two companies have seen this limitation as an opportunity: 4YourSoul.com and CardintheBox.com
[ "personal cards", "4YourSoul.com", "CardintheBox.com", "personalized printing", "online purchases" ]
[ "P", "P", "P", "M", "M" ]
1076
Delayed-choice entanglement swapping with vacuum-one-photon quantum states
We report the experimental realization of a recently discovered quantum-information protocol by Peres implying an apparent nonlocal quantum mechanical retrodiction effect. The demonstration is carried out by a quantum optical method by which each singlet entangled state is physically implemented by a two-dimensional subspace of Fock states of a mode of the electromagnetic field, specifically the space spanned by the vacuum and the one-photon state, along lines suggested recently by E. Knill et al. [Nature (London) 409, 46 (2001)] and by M. Duan et al. [ibid. 414, 413 (2001)]
[ "delayed-choice entanglement", "vacuum-one-photon quantum states", "quantum-information", "nonlocal quantum mechanical retrodiction effect", "quantum optical method", "singlet entangled state", "two-dimensional subspace", "Fock states", "one-photon state", "state entanglement", "electromagnetic field mode", "vacuum state" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
1033
Optical two-step modified signed-digit addition based on binary logic gates
A new modified signed-digit (MSD) addition algorithm based on binary logic gates is proposed for parallel computing. It is shown that by encoding each of the input MSD digits and flag digits into a pair of binary bits, the number of addition steps can be reduced to two. The flag digit is introduced to characterize the next low order pair (NLOP) of the input digits in order to suppress carry propagation. The rules for two-step addition of binary coded MSD (BCMSD) numbers are formulated that can be implemented using optical shadow-casting logic system
[ "optical two-step modified signed-digit addition", "binary logic gates", "parallel computing", "input MSD digits", "flag digits", "binary bits", "addition steps", "low order pair", "two-step addition", "binary coded MSD", "optical shadow-casting logic system", "modified signed-digit addition algorithm", "carry propagation suppression" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R" ]
64
Speech enhancement using a mixture-maximum model
We present a spectral domain, speech enhancement algorithm. The new algorithm is based on a mixture model for the short time spectrum of the clean speech signal, and on a maximum assumption in the production of the noisy speech spectrum. In the past this model was used in the context of noise robust speech recognition. In this paper we show that this model is also effective for improving the quality of speech signals corrupted by additive noise. The computational requirements of the algorithm can be significantly reduced, essentially without paying performance penalties, by incorporating a dual codebook scheme with tied variances. Experiments, using recorded speech signals and actual noise sources, show that in spite of its low computational requirements, the algorithm shows improved performance compared to alternative speech enhancement algorithms
[ "mixture-maximum model", "spectral domain", "speech enhancement algorithm", "mixture model", "short time spectrum", "clean speech signal", "noisy speech spectrum", "noise robust speech recognition", "additive noise", "performance penalties", "dual codebook", "tied variances", "recorded speech signals", "noise sources", "low computational requirements", "speech signal quality", "Gaussian mixture model", "MIXMAX model", "speech intelligibility" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M", "M", "M" ]
136
Design of 1-D and 2-D variable fractional delay allpass filters using weighted least-squares method
In this paper, a weighted least-squares method is presented to design one-dimensional and two-dimensional variable fractional delay allpass filters. First, each coefficient of the variable allpass filter is expressed as the polynomial of the fractional delay parameter. Then, the nonlinear phase error is approximated by a weighted equation error such that the cost function can be converted into a quadratic form. Next, by minimizing the weighted equation error, the optimal polynomial coefficients can be obtained iteratively by solving a set of linear simultaneous equations at each iteration. Finally, the design examples are demonstrated to illustrate the effectiveness of the proposed approach
[ "variable fractional delay allpass filters", "weighted least-squares method", "fractional delay parameter", "weighted equation error", "cost function", "optimal polynomial coefficients", "linear simultaneous equations", "1D allpass filters", "2D allpass filters", "nonlinear phase error approximation" ]
[ "P", "P", "P", "P", "P", "P", "P", "M", "M", "R" ]
975
Algebraic conditions for high-order convergent deferred correction schemes based on Runge-Kutta-Nystrom methods for second order boundary value problems
In [T. Van Hecke, M. Van Daele, J. Comp. Appl. Math., vol. 132, p. 107-125, (2001)] the investigation of high-order convergence of deferred correction schemes for the numerical solution of second order nonlinear two-point boundary value problems not containing the first derivative, is made. The derivation of the algebraic conditions to raise the increase of order by the deferred correction scheme was based on Taylor series expansions. In this paper we describe a more elegant way by means of P-series to obtain these necessary conditions and generalize this idea to equations of the form y" = f (t, y, y')
[ "algebraic conditions", "high-order convergent deferred correction schemes", "deferred correction schemes", "Runge-Kutta-Nystrom methods", "second order boundary value problems", "second order nonlinear two-point boundary value problems", "Taylor series expansions" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
930
NARX-based technique for the modelling of magneto-rheological damping devices
This paper presents a methodology for identifying variable-structure nonlinear models of magneto-rheological dampers (MRD) and similar devices. Its peculiarity with respect to the mainstream literature is that it is especially conceived for obtaining models that are structurally simple, easy to estimate and well suited for model-based control. This goal is pursued by adopting linear-in-the-parameters NARX models, for which an identification method is developed based on the minimization of the simulation error. This method is capable of selecting the model structure together with the parameters, thus it does not require a priori structural information. A set of validation tests is reported, with the aim of demonstrating the technique's efficiency by comparing it to a widely accepted MRD modelling approach
[ "modelling", "identification", "model-based control", "NARX models", "minimization", "simulation error", "validation", "MRD modelling", "magnetorheological damping" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "M" ]
988
A new merging algorithm for constructing suffix trees for integer alphabets
A new approach for constructing a suffix tree T/sub s/ for a given string S is to construct recursively a suffix tree T/sub o/ for odd positions, construct a suffix tree T/sub e/ for even positions from T/sub o/ and then merge T/sub o/ and T/sub e/ into T/sub s/. To construct suffix trees for integer alphabets in linear time had been a major open problem on index data structures. Farach used this approach and gave the first linear-time algorithm for integer alphabets. The hardest part of Farach's algorithm is the merging step. In this paper we present a new and simpler merging algorithm based on a coupled BFS (breadth-first search). Our merging algorithm is more intuitive than Farach's coupled DFS (depth-first search) merging, and thus it can be easily extended to other applications
[ "merging algorithm", "suffix trees", "integer alphabets", "linear time", "index data structures", "coupled BFS", "breadth-first search", "recursive construction" ]
[ "P", "P", "P", "P", "P", "P", "P", "R" ]
99
Radianz and Savvis look to expand service in wake of telecom scandals [finance]
With confidence in network providers waning, Radianz and Savvis try to prove their stability. Savvis and Radianz, which both specialize in providing the data-extranet components of telecommunication infrastructures, may see more networking doors open at investment banks, brokerage houses, exchanges and alternative-trading systems
[ "Radianz", "Savvis", "network providers", "data-extranet", "telecommunication infrastructures", "investment banks", "brokerage houses", "exchanges", "alternative-trading systems" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
895
Algorithms for improving the quality of R-trees
A novel approach to working with R-tree structures for the spatial indexing of extended objects is considered. It consists of the initial global construction of an efficient R-tree structure and the subsequent operation with it using conventional dynamic algorithms. A global strategy for constructing an R-tree reduced to a problem of dividing a set of rectangular objects into K parts with minimum mutual overlay is suggested. Base, box, and "Divide and Conquer" algorithms are suggested. The results of experimental modeling of the execution of various algorithms are discussed
[ "R-trees", "spatial indexing", "extended objects", "dynamic algorithms", "rectangular objects", "minimum mutual overlay", "graphical search", "computational geometry" ]
[ "P", "P", "P", "P", "P", "P", "U", "U" ]
1052
Developing a high-performance web server in Concurrent Haskell
Server applications, and in particular network-based server applications, place a unique combination of demands on a programming language: lightweight concurrency, high I/O throughput, and fault tolerance are all important. This paper describes a prototype Web server written in Concurrent Haskell (with extensions), and presents two useful results: firstly, a conforming server could be written with minimal effort, leading to an implementation in less than 1500 lines of code, and secondly the naive implementation produced reasonable performance. Furthermore, making minor modifications to a few time-critical components improved performance to a level acceptable for anything but the most heavily loaded Web servers
[ "high-performance Web server", "Concurrent Haskell", "network-based server applications", "lightweight concurrency", "high I/O throughput", "fault tolerance", "conforming server", "time-critical components" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
1017
Searching a scalable approach to cerebellar based control
Decades of research into the structure and function of the cerebellum have led to a clear understanding of many of its cells, as well as how learning might take place. Furthermore, there are many theories on what signals the cerebellum operates on, and how it works in concert with other parts of the nervous system. Nevertheless, the application of computational cerebellar models to the control of robot dynamics remains in its infant state. To date, few applications have been realized. The currently emerging family of light-weight robots poses a new challenge to robot control: due to their complex dynamics, traditional methods that depend on a full analysis of the system dynamics are no longer applicable, since the joints influence each other's dynamics during movement. Can artificial cerebellar models compete here?
[ "scalable approach", "cerebellar based control", "nervous system", "computational cerebellar models", "light-weight robots", "robot control" ]
[ "P", "P", "P", "P", "P", "P" ]
743
Local satellite
Consumer based mobile satellite phone services went from boom to burn up in twelve months despite original forecasts predicting 10 million to 40 million users by 2005. Julian Bright wonders what prospects the technology has now and if going regional might be one answer
[ "mobile satellite phone services" ]
[ "P" ]
706
Enhancing the reliability of modular medium-voltage drives
A method to increase the reliability of modular medium-voltage induction motor drives is discussed, by providing means to bypass a failed module. The impact on reliability is shown. A control, which maximizes the output voltage available after bypass, is described, and experimental results are given
[ "modular medium-voltage induction motor drives", "reliability enhancement", "failed module bypass", "available output voltage control" ]
[ "P", "R", "R", "R" ]
1353
Generalized spatio-chromatic diffusion
A framework for diffusion of color images is presented. The method is based on the theory of thermodynamics of irreversible transformations which provides a suitable basis for designing correlations between the different color channels. More precisely, we derive an equation for color evolution which comprises a purely spatial diffusive term and a nonlinear term that depends on the interactions among color channels over space. We apply the proposed equation to images represented in several color spaces, such as RGB, CIELAB, Opponent colors, and IHS
[ "generalized spatio-chromatic diffusion", "diffusion", "color images", "thermodynamics", "irreversible transformations", "color channels", "color evolution", "spatial diffusive term", "nonlinear term", "RGB", "CIELAB", "Opponent colors", "IHS", "vector-valued diffusion", "scale-space" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "U" ]
1316
Understanding Internet traffic streams: dragonflies and tortoises
We present the concept of network traffic streams and the ways they aggregate into flows through Internet links. We describe a method of measuring the size and lifetime of Internet streams, and use this method to characterize traffic distributions at two different sites. We find that although most streams (about 45 percent of them) are dragonflies, lasting less than 2 seconds, a significant number of streams have lifetimes of hours to days, and can carry a high proportion (50-60 percent) of the total bytes on a given link. We define tortoises as streams that last longer than 15 minutes. We point out that streams can be classified not only by lifetime (dragonflies and tortoises) but also by size (mice and elephants), and note that stream size and lifetime are independent dimensions. We submit that ISPs need to be aware of the distribution of Internet stream sizes, and the impact of the difference in behavior between short and long streams. In particular, any forwarding cache mechanisms in Internet routers must be able to cope with a high volume of short streams. In addition ISPs should realize that long-running streams can contribute a significant fraction of their packet and byte volumes, something they may not have allowed for when using traditional "flat rate user bandwidth consumption" approaches to provisioning and engineering
[ "Internet traffic streams", "dragonflies", "tortoises", "network traffic streams", "traffic distributions", "mice", "elephants", "ISP", "forwarding cache mechanisms", "Internet routers", "long-running streams", "Internet stream size measurement", "Internet stream lifetime measurement", "packet volume", "byte volume", "traffic provisioning", "traffic engineering" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "R", "R", "R" ]
868
Two quantum analogues of Fisher information from a large deviation viewpoint of quantum estimation
We discuss two quantum analogues of the Fisher information, the symmetric logarithmic derivative Fisher information and Kubo-Mori-Bogoljubov Fisher information from a large deviation viewpoint of quantum estimation and prove that the former gives the true bound and the latter gives the bound of consistent superefficient estimators. As another comparison, it is shown that the difference between them is characterized by the change of the order of limits
[ "quantum analogues", "large deviation viewpoint", "quantum estimation", "symmetric logarithmic derivative Fisher information", "Kubo-Mori-Bogoljubov Fisher information", "consistent superefficient estimators", "statistical inference" ]
[ "P", "P", "P", "P", "P", "P", "U" ]
1092
Ride quality evaluation of an actively-controlled stretcher for an ambulance
This study considers the subjective evaluation of ride quality during ambulance transportation using an actively-controlled stretcher (ACS). The ride quality of a conventional stretcher and an assistant driver's seat is also compared. Braking during ambulance transportation generates negative foot-to-head acceleration in patients and causes blood pressure to rise in the patient's head. The ACS absorbs the foot-to-head acceleration by changing the angle of the stretcher, thus reducing the blood pressure variation. However, the ride quality of the ACS should be investigated further because the movement of the ACS may cause motion sickness and nausea. Experiments on ambulance transportation, including rapid acceleration and deceleration, are performed to evaluate the effect of differences in the posture of the transported subject on ride quality; the semantic differential method and factor analysis are used in the investigations. Subjects are transported using a conventional stretcher with head forward, a conventional stretcher with head backward, the ACS, and an assistant driver's seat for comparison with transportation using a stretcher. Experimental results show that the ACS gives the most comfortable transportation when using a stretcher. Moreover, the reduction of the negative foot-to-head acceleration at frequencies below 0.2 Hz and the small variation of the foot-to-head acceleration result in more comfortable transportation. Conventional transportation with the head forward gives the worst ride quality, although the vibration characteristics of the conventional stretcher seem to be superior to those of the ACS
[ "ride quality evaluation", "actively-controlled stretcher", "ambulance", "subjective evaluation", "ambulance transportation", "conventional stretcher", "braking", "negative foot-to-head acceleration", "blood pressure variation", "motion sickness", "nausea", "rapid acceleration", "transported subject", "semantic differential method", "factor analysis", "head forward", "head backward", "comfortable transportation", "vibration", "assistant driver seat", "patient head", "stretcher angle", "rapid deceleration", "posture differences" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "R", "R" ]
1443
C and C++: a case for compatibility
Modern C and C++ are sibling languages descended from Classic C. In many people's minds, they are (wrongly, but understandably) fused into the mythical C/C++ programming language. There is no C/C++ language, but there is a C/C++ community. Previously the author described some of the incompatibilities that complicate the work of developers within that C/C++ community. In this article, he discusses some of the underlying myths that help perpetuate these incompatibilities. He also shows why more compatibility (ideally, full compatibility) is in the best interest of the C/C++ community. In the next paper, he presents some examples of how the incompatibilities in C and C++ might be resolved
[ "C++ language", "incompatibilities", "C language", "object-oriented programming", "class hierarchies", "low-level programming", "C++ libraries" ]
[ "P", "P", "R", "M", "U", "M", "M" ]
1406
Bluetooth bites back
It is now more than four years since we started to hear about Bluetooth, and from the user's point of view very little seems to have happened since then. Paul Haddlesey looks at the progress, and the role Bluetooth may eventually play in your firm's communications strategy
[ "Bluetooth", "communications strategy", "wireless connection", "mobile" ]
[ "P", "P", "U", "U" ]
810
Oracle's Suite grows up
Once a low-cost Web offering, Oracle's Small Business Suite now carries a price tag to justify VAR interest
[ "Oracle Small Business Suite", "NetLedger", "accounting", "resellers" ]
[ "R", "U", "U", "U" ]
855
Support communities for women in computing
This article highlights the many activities provided by the support communities available for women in computing. Thousands of women actively participate in these programs and they receive many benefits including networking and professional support. In addition, the organizations and associations help promote the accomplishments of women computer scientists and disseminate valuable information. This article surveys some of these organizations and concludes with a list of suggestions for how faculty members can incorporate the benefits of these organizations in their own institutions
[ "support communities", "women", "computing", "networking", "professional support", "faculty members", "information dissemination" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
1393
ERP systems implementation: Best practices in Canadian government organizations
ERP (Enterprise resource planning) systems implementation is a complex exercise in organizational innovation and change management. Government organizations are increasing their adoption of these systems for various benefits such as integrated real-time information, better administration, and result-based management. Government organizations, due to their social obligations, higher legislative and public accountability, and unique culture face many specific challenges in the transition to enterprise systems. This motivated the authors to explore the key considerations and typical activities in government organizations adopting ERP systems. The article adopts the innovation process theory framework as well as the (Markus & Tanis, 2000) model as a basis to delineate the ERP adoption process. Although, each adopting organization has a distinct set of objectives for its systems, the study found many similarities in motivations, concerns, and strategies across organizations
[ "ERP systems implementation", "best practices", "Canadian government organizations", "enterprise resource planning", "integrated real-time information", "administration", "result-based management", "social obligations", "public accountability", "innovation process theory framework", "higher legislative accountability" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
783
The network society as seen from Italy
Italy was behind the European average in Internet development for many years, but a new trend, which has brought considerable change, emerged at the end of 1998 and showed its effects in 2000 and the following years. Now Italy is one of the top ten countries worldwide in Internet hostcount and the fourth largest in Europe. The density of Internet activity in Italy in proportion to the population is still below the average in the European Union, but is growing faster than Germany, the UK and France, and faster than the worldwide or European average. From the point of view of media control there are several problems. Italy has democratic institutions and freedom of speech, but there is an alarming concentration in the control of mainstream media (especially broadcast). There are no officially declared restrictions in the use of the Internet, but several legal and regulatory decisions reveal a desire to limit freedom of opinion and dialogue and/or gain centralized control of the Net
[ "network society", "Italy", "European average", "Europe", "Internet development", "Internet hostcount", "Internet activity", "European Union", "Germany", "UK", "France", "media control", "democratic institutions", "freedom of speech", "mainstream media", "regulatory decisions", "centralized control", "worldwide average", "broadcast media", "legal decisions" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
1137
On deciding stability of constrained homogeneous random walks and queueing systems
We investigate stability of scheduling policies in queueing systems. To this day no algorithmic characterization exists for checking stability of a given policy in a given queueing system. In this paper we introduce a certain generalized priority policy and prove that the stability of this policy is algorithmically undecidable. We also prove that stability of a homogeneous random walk in L/sub +//sup d/ is undecidable. Finally, we show that the problem of computing a fluid limit of a queueing system or of a constrained homogeneous random walk is undecidable. To the best of our knowledge these are the first undecidability results in the area of stability of queueing systems and random walks in L/sub +//sup d/. We conjecture that stability of common policies like First-In-First-Out and priority policy is also an undecidable problem
[ "constrained homogeneous random walks", "queueing systems", "generalized priority policy", "priority policy", "undecidability results", "undecidable problem", "scheduling policy stability", "homogeneous random walk stability", "fluid limit computation", "first-in-first-out policy" ]
[ "P", "P", "P", "P", "P", "P", "R", "R", "R", "R" ]
1172
Marble cutting with single point cutting tool and diamond segments
An investigation has been undertaken into the frame sawing with diamond blades. The kinematic behaviour of the frame sawing process is discussed. Under different cutting conditions, cutting and indenting-cutting tests are carried out by single point cutting tools and single diamond segments. The results indicate that the depth of cut per diamond grit increases as the blades move forward. Only a few grits per segment can remove the material in the cutting process. When the direction of the stroke changes, the cutting forces do not decrease to zero because of the residual plastic deformation beneath the diamond grits. The plastic deformation and fracture chipping of material are the dominant removal processes, which can be explained by the fracture theory of brittle material indentation
[ "marble cutting", "single point cutting tool", "diamond segments", "frame sawing", "kinematic behaviour", "indenting-cutting tests", "residual plastic deformation", "fracture chipping", "removal processes", "fracture theory", "brittle material indentation", "cutting tests" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
562
The Advanced Encryption Standard - implementation and transition to a new cryptographic benchmark
Cryptography is the science of coding information to create unintelligible ciphers that conceal or hide messages. The process that achieves this goal is commonly referred to as encryption. Although encryption processes of various forms have been employed for centuries to protect the exchange of messages, the advent of the information age has underscored the importance of strong cryptography as a process to secure data exchanged through electronic means, and has accentuated the demand for products offering these services. This article describes the process that has led to the development of the latest cryptographic benchmark; the Advanced Encryption Standard (AES). The article briefly examines the requirements set forth for its development, defines how the new standard is implemented, and describes how government, business, and industry can transition to AES with minimum impact to operations
[ "Advanced Encryption Standard", "cryptographic benchmark", "coding", "unintelligible ciphers", "data exchange", "AES", "government", "business", "industry" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
1236
Compatibility comparison and performance evaluation for Japanese HPF compilers using scientific applications
The lack of compatibility of High-Performance Fortran (HPF) between vendor implementations has been disheartening scientific application users and hindering the development of portable programs. Thus parallel computing is still unpopular in the computational science community, even though parallel programming is common to the computer science community. As users would like to run the same source code on parallel machines with different architectures as fast as possible, we have investigated the compatibility of source codes for Japanese HPF compilers (NEC, Fujitsu and Hitachi) with two real-world applications: a 3D fluid code and a 2D particle code. We have found that the source-level compatibility between Japanese HPF compilers is almost preserved, but more effort will be needed to sustain complete compatibility. We have also evaluated parallel performance and found that HPF can achieve good performance for the 3D fluid code with almost the same source code. For the 2D particle code, good results have also been obtained with a small number of processors, but some changes in the original source code and the addition of interface blocks are required
[ "HPF", "compilers", "High-Performance Fortran", "portable programs", "parallel programming", "parallel performance", "source compatibility" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
1273
Towards an ontology of approximate reason
This article introduces structural aspects in an ontology of approximate reason. The basic assumption in this ontology is that approximate reason is a capability of an agent. Agents are designed to classify information granules derived from sensors that respond to stimuli in the environment of an agent or received from other agents. Classification of information granules is carried out in the context of parameterized approximation spaces and a calculus of granules. Judgment in agents is a faculty of thinking about (classifying) the particular relative to decision rules derived from data. Judgment in agents is reflective, but not in the classical philosophical sense (e.g., the notion of judgment in Kant). In an agent, a reflective judgment itself is an assertion that a particular decision rule derived from data is applicable to an object (input). That is, a reflective judgment by an agent is an assertion that a particular vector of attribute (sensor) values matches to some degree the conditions for a particular rule. In effect, this form of judgment is an assertion that a vector of sensor values reflects a known property of data expressed by a decision rule. Since the reasoning underlying a reflective judgment is inductive and surjective (not based on a priori conditions or universals), this form of judgment is reflective, but not in the sense of Kant. Unlike Kant, a reflective judgment is surjective in the sense that it maps experimental attribute values onto the most closely matching descriptors (conditions) in a derived rule. Again, unlike Kant's notion of judgment, a reflective judgment is not the result of searching for a universal that pertains to a particular set of values of descriptors. Rather, a reflective judgment by an agent is a form of recognition that a particular vector of sensor values pertains to a particular rule in some degree. This recognition takes the form of an assertion that a particular descriptor vector is associated with a particular decision rule. These considerations can be repeated for other forms of classifiers besides those defined by decision rules
[ "ontology", "approximate reason", "information granules", "granules", "parameterized approximation spaces", "decision rules", "reflective judgment", "pattern recognition", "rough sets" ]
[ "P", "P", "P", "P", "P", "P", "P", "M", "M" ]
626
Approximate confidence intervals for one proportion and difference of two proportions
Constructing a confidence interval for a binomial proportion or the difference of two proportions is a routine exercise in daily data analysis. The best-known method is the Wald interval based on the asymptotic normal approximation to the distribution of the observed sample proportion, though it is known to have bad performance for small to medium sample sizes. Agresti et al. (1998, 2000) proposed an Adding-4 method: 4 pseudo-observations are added with 2 successes and 2 failures and then the resulting (pseudo-)sample proportion is used. The method is simple and performs extremely well. Here we propose an approximate method based on a t-approximation that takes account of the uncertainty in estimating the variance of the observed (pseudo-)sample proportion. It follows the same line of using a t-test, rather than z-test, in testing the mean of a normal distribution with an unknown variance. For some circumstances our proposed method has a higher coverage probability than the Adding-4 method
[ "approximate confidence intervals", "difference of two proportions", "binomial proportion", "data analysis", "t-approximation", "uncertainty", "t-test", "normal distribution", "coverage probability", "variance estimation", "pseudo-sample proportion" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "M" ]
59
Efficient tracking of the cross-correlation coefficient
In many (audio) processing algorithms, involving manipulation of discrete-time signals, the performance can vary strongly over the repertoire that is used. This may be the case when the signals from the various channels are allowed to be strongly positively or negatively correlated. We propose and analyze a general formula for tracking the (time-dependent) correlation between two signals. Some special cases of this formula lead to classical results known from the literature, others are new. This formula is recursive in nature, and uses only the instantaneous values of the two signals, in a low-cost and low-complexity manner; in particular, there is no need to take square roots or to carry out divisions. Furthermore, this formula can be modified with respect to the occurrence of the two signals so as to further decrease the complexity, and increase ease of implementation. The latter modification comes at the expense that not the actual correlation is tracked, but, rather, a somewhat deformed version of it. To overcome this problem, we propose, for a number of instances of the tracking formula, a simple warping operation on the deformed correlation. Now we obtain, at least for sinusoidal signals, the correct value of the correlation coefficient. Special attention is paid to the convergence behavior of the algorithm for stationary signals and the dynamic behavior if there is a transition to another stationary state; the latter is considered to be important for studying the tracking abilities for nonstationary signals. We illustrate the tracking algorithm by using it for stereo music fragments, obtained from a number of digital audio recordings
[ "efficient tracking", "cross-correlation coefficient", "discrete-time signals", "warping operation", "deformed correlation", "sinusoidal signals", "convergence behavior", "stationary signals", "dynamic behavior", "stationary state", "nonstationary signals", "tracking algorithm", "stereo music fragments", "digital audio recording", "audio processing algorithms", "time-dependent correlation", "recursive formula" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R" ]
663
The road ahead [supply chains]
Executive supply chain managers, says David Metcalfe of Forrester Research, need the skills and precision of Mongolian archers on horseback. They must be able to hit their target, in this case customer demand, while moving at great speed. But what is wrong with the supply chains companies have in place already? According to Metcalfe, current manufacturing models are too inflexible. A recent survey conducted by Forrester Research supports this claim. It found that 42% of respondents could not transfer production from one plant to another in the event of a glitch in the supply chain. A further 32% said it would be possible, but extremely costly
[ "supply chains", "Forrester Research", "manufacturing", "survey", "business networks" ]
[ "P", "P", "P", "P", "U" ]
948
Pairwise thermal entanglement in the n-qubit (n <or= 5) Heisenberg XX chain
We have calculated the concurrence of the pairwise thermal entanglement for the four-qubit and five-qubit Heisenberg XX chain. It is found that there is a great difference between the even-qubit and the odd-qubit chain with respect to the critical temperature and the existence of entanglement for the case where the qubit number n is no more than 5
[ "pairwise thermal entanglement", "five-qubit Heisenberg XX chain", "odd-qubit chain", "critical temperature", "four-qubit Heisenberg XX chain", "even-qubit chain" ]
[ "P", "P", "P", "P", "R", "R" ]
1391
Government budget and accounting information policy and practice in Taiwan
The principal government budget and accounting information policies in Taiwan are founded on the ability to provide integrated, consistent, and timely information for government managers to make more rational decisions concerning national resource allocation and evaluation. A specific accounting organization system has been designed for this purpose. This paper analyzes information policies and practices according to the relevant laws and regulations, identifies issues regarding the policies, and presents strategies to resolve the issues
[ "Government budget", "accounting information policy", "Taiwan", "government managers", "rational decisions", "national resource allocation", "national resource evaluation", "Generally Accepted Accounting Principles" ]
[ "P", "P", "P", "P", "P", "P", "R", "M" ]
781
ICANN and Internet governance: leveraging technical coordination to realize global public policy
The Internet Corporation for Assigned Names and Numbers (ICANN) was created in 1998 to perform technical coordination of the Internet. ICANN also lays the foundations for governance, creating capabilities for promulgating and enforcing global regulations on Internet use. ICANN leverages the capabilities in the Internet domain name system (DNS) to implement four mechanisms of governance: authority, law, sanctions, and jurisdictions. These governance-related features are embodied in seemingly technical features of ICANN's institutional design. Recognition of ICANN's governance mechanisms allows us to better understand the Internet's emerging regulatory regime
[ "ICANN", "Internet governance", "technical coordination", "global public policy", "Internet Corporation for Assigned Names and Numbers", "global regulations", "Internet use", "Internet domain name system", "governance-related features", "institutional design", "regulatory regime", "Internet DNS" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R" ]
1028
Novel approach to super-resolution pits readout
We proposed a novel method to realize the readout of super-resolution pits by using a super-resolution reflective film to replace the reflective layer of the conventional ROM. At the same time, by using Sb as the super-resolution reflective layer and SiN as a dielectric layer, the super-resolution pits with diameters of 380 nm were read out by a setup whose laser wavelength is 632.8 nm and numerical aperture is 0.40. In addition, the influence of the Sb thin film thickness on the readout signal was investigated, the results showed that the optimum Sb thin film thickness is 28 to 30 nm, and the maximum CNR is 38 to 40 dB
[ "super-resolution pits readout", "super-resolution reflective film", "380 nm", "632.8 nm", "numerical aperture", "Sb thin film thickness", "readout signal", "28 to 30 nm", "maximum CNR", "Sb super-resolution reflective layer", "SiN dielectric layer", "Sb-SiN" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "U" ]
1090
On the contractivity of implicit-explicit linear multistep methods
This paper is concerned with the class of implicit-explicit linear multistep methods for the numerical solution of initial value problems for ordinary differential equations which are composed of stiff and nonstiff parts. We study the contractivity of such methods, with regard to linear autonomous systems of ordinary differential equations and a (scaled) Euclidean norm. In addition, we derive a strong stability result based on the stability regions of these methods
[ "contractivity", "implicit-explicit linear multistep methods", "numerical solution", "initial value problems", "ordinary differential equations", "linear autonomous systems", "Euclidean norm", "stability result" ]
[ "P", "P", "P", "P", "P", "P", "P", "P" ]
1441
Handles and exception safety, Part 1. A simple handle class
Every C++ program that uses inheritance must manage memory somehow. The most obvious way to do so is directly, but programmers who create complicated data structures often have trouble figuring out what parts of those data structures are safe to delete when. The classical method of dealing with such complexity is to hide it in a class. Such classes are typically called handles; the idea is to attach a handle object to another object that contains the actual data. The simplest form of a handle, which we have discussed in this article, is one in which each handle object corresponds to a single object from the inheritance hierarchy. Such handles are straightforward to use and to implement and tend to be intrinsically exception safe in almost all respects. The one exception hazard in such a class is typically the assignment operator. Assignment operators often test for self-assignment to avoid aliasing problems. As Herb Sutter has observed (2000), programs that need such tests are almost always exception unsafe. By rewriting the assignment operator, we ensure that we do not do anything irrevocable until the possibility of throwing an exception has passed. This strategy ensures that if an exception occurs while our assignment operator is executing, we do not corrupt the rest of our system
[ "handles", "exception", "C++ program", "inheritance hierarchy", "assignment operator", "self-assignment", "aliasing problems" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
1404
Creating the right mail model
If you know your post room is not as efficiently organised as it might be, but you are not sure how best to go about making improvements, then consider this advice from John Edgar of consultant MCS
[ "mail", "post room", "consultant", "MCS" ]
[ "P", "P", "P", "P" ]
812
eLeaders make the Web work
Some companies are making the most of back-office/Web integration. Here are some winners
[ "back-office/Web integration", "e-commerce", "Visual Integrator", "Accpac eTransact" ]
[ "P", "U", "M", "U" ]
857
Leveraging an alternative source of computer scientists: reentry programs
Much has been written about the leaky pipeline of women in computer science (CS), with the percentage of women decreasing as one moves from lower levels, such as college, to higher levels, culminating in full professorship. While significant attention has focused on keeping women from leaving the pipeline, there is also an opportunity to bring women into the pipeline through non-traditional programs, instead of requiring that everyone enter at the undergraduate level. Both Mills College, a small liberal arts institution for women, and UC Berkeley, a large research university, established programs in the 1980s to increase the number of women in computer science by tapping non-traditional students. Both programs share the core value of accommodating older students lacking technical backgrounds. The two programs have produced similar results: graduate degrees earned in computer science by students who would not have qualified without these programs, professional employment in the computer science field by women and minorities, and a recognition that this population represents a rich source of talent for our nation
[ "reentry programs", "women", "computer science", "Mills College", "UC Berkeley", "students", "graduate degrees", "professional employment", "minorities" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
739
Disposable mobiles
After many delays, the reusable, recyclable, disposable mobile phone is finally going on sale in the US. But with a business model largely dependent on niche markets, Elizabeth Biddlecombe asks if these simplified handsets will be good enough to survive a brutal market
[ "reusable", "recyclable", "disposable mobile phone", "simplified handsets" ]
[ "P", "P", "P", "P" ]
1329
PageFlex + MediaRich = PageRich
Layout and graphics innovators collaborate on fully variable combination. Pageflex and Equilibrium have melded their respective EDIT and MediaRich technologies to make a variable-data composition engine with a Web interface. Though a first-generation effort, it shows substantial promise
[ "PageFlex", "MediaRich", "PageRich", "layout", "graphics", "composition", "software houses" ]
[ "P", "P", "P", "P", "P", "P", "U" ]
109
An entanglement measure based on the capacity of dense coding
An asymptotic entanglement measure for any bipartite states is derived in the light of the dense coding capacity optimized with respect to local quantum operations and classical communications. General properties and some examples with explicit forms of this entanglement measure are investigated
[ "entanglement measure", "asymptotic entanglement measure", "bipartite states", "dense coding capacity", "optimization", "local quantum operations", "classical communications" ]
[ "P", "P", "P", "P", "P", "P", "P" ]
1234
Achieving performance under OpenMP on ccNUMA and software distributed shared memory systems
OpenMP is emerging as a viable high-level programming model for shared memory parallel systems. It was conceived to enable easy, portable application development on this range of systems, and it has also been implemented on cache-coherent Non-Uniform Memory Access (ccNUMA) architectures. Unfortunately, it is hard to obtain high performance on the latter architecture, particularly when large numbers of threads are involved. In this paper, we discuss the difficulties faced when writing OpenMP programs for ccNUMA systems, and explain how the vendors have attempted to overcome them. We focus on one such system, the SGI Origin 2000, and perform a variety of experiments designed to illustrate the impact of the vendor's efforts. We compare codes written in a standard, loop-level parallel style under OpenMP with alternative versions written in a Single Program Multiple Data (SPMD) fashion, also realized via OpenMP, and show that the latter consistently provides superior performance. A carefully chosen set of language extensions can help us translate programs from the former style to the latter (or to compile directly, but in a similar manner). Syntax for these extensions can be borrowed from HPF, and some aspects of HPF compiler technology can help the translation process. It is our expectation that an extended language, if well compiled, would improve the attractiveness of OpenMP as a language for high-performance computation on an important class of modern architectures
[ "OpenMP", "programming model", "shared memory parallel systems", "cache-coherent Non-Uniform Memory Access", "Single Program Multiple Data", "HPF", "parallel programming" ]
[ "P", "P", "P", "P", "P", "P", "R" ]
1271
Verification of non-functional properties of a composable architecture with Petri nets
In this paper, we introduce our concept of composability and present the MSS architecture as an example of a composable architecture. MSS claims to be composable with respect to timing properties. We discuss how to model and prove properties in such an architecture with time-extended Petri nets. As a result, the first step of a proof of composability is presented, as well as a new kind of Petri net which is more suitable for modeling architectures like MSS
[ "composable architecture", "Petri nets", "MSS architecture", "timing properties", "proof of composability", "non-functional properties verification" ]
[ "P", "P", "P", "P", "P", "R" ]
624
A hybrid ML-EM algorithm for calculation of maximum likelihood estimates in semiparametric shared frailty models
This paper describes a generalised hybrid ML-EM algorithm for the calculation of maximum likelihood estimates in semiparametric shared frailty models, the Cox proportional hazard models with hazard functions multiplied by a (parametric) frailty random variable. This hybrid method is much faster than the standard EM method and faster than the standard direct maximum likelihood method (ML, Newton-Raphson) for large samples. We have previously applied this method to semiparametric shared gamma frailty models, and verified by simulations the asymptotic and small sample statistical properties of the frailty variance estimates. Let theta_0 be the true value of the frailty variance parameter. Then the asymptotic distribution is normal for theta_0 > 0, while it is a 50-50 mixture between a point mass at zero and a normal random variable on the positive axis for theta_0 = 0. For small samples, simulations suggest that the frailty variance estimates are approximately distributed as an x-(100-x)% mixture, 0 <= x <= 50, between a point mass at zero and a normal random variable on the positive axis even for theta_0 > 0. We apply this method and verify by simulations these statistical results for semiparametric shared log-normal frailty models. We also apply the semiparametric shared gamma and log-normal frailty models to Busselton Health Study coronary heart disease data
[ "hybrid ML-EM algorithm", "maximum likelihood estimates", "Cox proportional hazard models", "hazard functions", "simulations", "frailty variance estimates", "asymptotic distribution", "normal random variable", "semiparametric shared log-normal frailty models", "Busselton Health Study", "coronary heart disease data", "data analysis", "normal distribution" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "R" ]
661
All change [agile business]
What does it take for an organisation to become an agile business? Its employees probably need to adhere to new procurement policies, work more closely with colleagues in other departments, meet more exacting sales targets, and offer higher standards of customer service and support. In short, they need to change the way they work. Implementing technologies to support agile business models and underpin new practices is a complex task in itself. But getting employees to adopt new practices is far harder, a task that requires careful handling, says Barry O'Connell, general manager of business-to-employee (B2E) solutions at systems vendor Hewlett-Packard (HP)
[ "agile business", "corporate transformation", "organisational change" ]
[ "P", "U", "R" ]
1135
A combinatorial, graph-based solution method for a class of continuous-time optimal control problems
The paper addresses a class of continuous-time optimal control problems whose solutions are typically characterized by both bang-bang and "singular" control regimes. Analytical study and numerical computation of such solutions are very difficult and far from complete when only techniques from control theory are used. This paper solves optimal control problems by reducing them to the combinatorial search for the shortest path in a specially constructed graph. Since the nodes of the graph are weighted in a sequence-dependent manner, we extend the classical shortest-path algorithm to our case. The proposed solution method is currently limited to single-state problems with multiple control functions. A production planning problem and a train operation problem are optimally solved to illustrate the method
[ "continuous-time optimal control problems", "numerical computation", "combinatorial search", "sequence-dependent manner", "single-state problems", "multiple control functions", "production planning problem", "train operation problem", "combinatorial graph-based solution", "bang-bang control regimes", "singular control regimes", "shortest path algorithm", "weighted graph nodes" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "R", "R", "R" ]
1170
Upper bound analysis of oblique cutting with nose radius tools
A generalized upper bound model for calculating the chip flow angle in oblique cutting using flat-faced nose radius tools is described. The projection of the uncut chip area on the rake face is divided into a number of elements parallel to an assumed chip flow direction. The length of each of these elements is used to find the length of the corresponding element on the shear surface using the ratio of the shear velocity to the chip velocity. The area of each element is found as the cross product of the length and its width along the cutting edge. Summing up the area of the elements along the shear surface, the total shear surface area is obtained. The friction area is calculated using the similarity between orthogonal and oblique cutting in the 'equivalent' plane that includes both the cutting velocity and chip velocity. The cutting power is obtained by summing the shear power and the friction power. The actual chip flow angle and chip velocity are obtained by minimizing the cutting power with respect to both these variables. The shape of the curved shear surface, the chip cross section and the cutting force obtained from this model are presented
[ "upper bound analysis", "oblique cutting", "nose radius tools", "chip flow angle", "uncut chip area", "shear surface", "shear velocity", "chip velocity", "friction area" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P" ]
560
Citizen centric identity management: chip tricks?
Accelerating and harmonizing the diffusion and acceptance of electronic services in Europe in a secure and practical way has become a priority of several initiatives in the past few years and a critical factor for citizen and business information society services. As identification and authentication are critical elements in accessing public services, the combination of public key infrastructure (PKI) and smart cards emerges as the solution of choice for eGovernment in Europe. National governments and private initiatives alike voice their support for this powerful combination to deliver an essential layer of reliable electronic services and address identity requirements in a broad range of application areas. A recent study suggests that several eGovernment implementations point to the direction of electronic citizen identity management as an up and coming challenge. The paper discusses the eGovernment needs for user identification applicability and the need for standardization
[ "citizen centric identity management", "electronic services", "authentication", "public key infrastructure", "smart cards", "government", "user identification", "standardization", "business information services", "legal framework", "public information services" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "R", "U", "R" ]
1108
The visible cement data set
With advances in x-ray microtomography, it is now possible to obtain three-dimensional representations of a material's microstructure with a voxel size of less than one micrometer. The Visible Cement Data Set represents a collection of 3-D data sets obtained using the European Synchrotron Radiation Facility in Grenoble, France in September 2000. Most of the images obtained are for hydrating portland cement pastes, with a few data sets representing hydrating Plaster of Paris and a common building brick. All of these data sets are being made available on the Visible Cement Data Set website at http://visiblecement.nist.gov. The website includes the raw 3-D datafiles, a description of the material imaged for each data set, example two-dimensional images and visualizations for each data set, and a collection of C language computer programs that will be of use in processing and analyzing the 3-D microstructural images. This paper provides the details of the experiments performed at the ESRF, the analysis procedures utilized in obtaining the data set files, and a few representative example images for each of the three materials investigated
[ "X-ray microtomography", "microstructure", "voxel size", "European Synchrotron Radiation Facility", "hydrating portland cement pastes", "Plaster of Paris", "building brick", "two-dimensional images", "microstructural images", "ESRF", "3D representations", "cement hydration" ]
[ "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "R" ]