diff --git "a/dev.jsonl" "b/dev.jsonl" --- "a/dev.jsonl" +++ "b/dev.jsonl" @@ -156,7 +156,7 @@ {"id": "1761", "title": "Superconvergence of discontinuous Galerkin method for nonstationary hyperbolic equation", "abstract": "For the first order nonstationary hyperbolic equation taking the piecewise linear discontinuous Galerkin solver, we prove that under the uniform rectangular partition, such a discontinuous solver, after postprocessing, can have two and half approximative order which is half order higher than the optimal estimate by P. Lesaint and P. Raviart (1974) under the rectangular partition", "keyphrases": ["superconvergence of discontinuous Galerkin method", "nonstationary hyperbolic equation", "piecewise linear discontinuous Galerkin solver", "rectangular partition", "approximative order"], "prmu": ["P", "P", "P", "P", "P"]} {"id": "1681", "title": "One and two facility network design revisited", "abstract": "The one facility one commodity network design problem (OFOC) with nonnegative flow costs considers the problem of sending d units of flow from a source to a destination where arc capacity is purchased in batches of C units. The two facility problem (TFOC) is similar, but capacity can be purchased either in batches of C units or one unit. Flow costs are zero. These problems are known to be NP-hard. We describe an exact O(n/sup 3/3/sup n/) algorithm for these problems based on the repeated use of a bipartite matching algorithm. We also present a better lower bound of Omega (n/sup 2k*/) for an earlier Omega (n/sup 2k/) algorithm described in the literature where k = [d/C] and k* = min{k, [(n 2)/2]}. The matching algorithm is faster than this one for k >or= [(n - 2)/2]. Finally, we provide another reformulation of the problem that is quasi integral. 
This property could be useful in designing a modified version of the simplex method to solve the problem using a sequence of pivots with integer extreme solutions, referred to as the integral simplex method in the literature", "keyphrases": ["one facility one commodity network design problem", "two facility network design", "nonnegative flow costs", "flow costs", "NP-hard problems", "exact algorithm", "bipartite matching algorithm", "lower bound", "quasi integral", "pivots", "integral simplex method"], "prmu": ["P", "P", "P", "P", "R", "R", "P", "P", "P", "P", "P"]} {"id": "1538", "title": "A heuristic approach to resource locations in broadband networks", "abstract": "In broadband networks, such as ATM, the importance of dynamic migration of data resources is increasing because of its potential to improve performance especially for transaction processing. In environments with migratory data resources, it is necessary to have mechanisms to manage the locations of each data resource. In this paper, we present an algorithm that makes use of system state information and heuristics to manage locations of data resources in a distributed network. In the proposed algorithm, each site maintains information about state of other sites with respect to each data resource of the system and uses it to find: (1) a subset of sites likely to have the requested data resource; and (2) the site where the data resource is to be migrated from the current site. The proposed algorithm enhances its effectiveness by continuously updating system state information stored at each site. It focuses on reducing the overall average time delay needed by the transaction requests to locate and access the migratory data resources. 
We evaluated the performance of the proposed algorithm and also compared it with one of the existing location management algorithms, by simulation studies under several system parameters such as the frequency of requests generation, frequency of data resource migrations, network topology and scale of network. The experimental results show the effectiveness of the proposed algorithm in all cases", "keyphrases": ["broadband networks", "ATM", "resource locations", "heuristics", "distributed network", "data resource migrations", "network topology"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} -{"id": "1912", "title": "A novel preterm respiratory mechanics active simulator to test the performances of neonatal pulmonary ventilators", "abstract": "A patient active simulator is proposed which is capable of reproducing values of the parameters of pulmonary mechanics of healthy newborns and preterm pathological infants. The implemented prototype is able to: (a) let the operator choose the respiratory pattern, times of apnea, episodes of cough, sobs, etc., (b) continuously regulate and control the parameters characterizing the pulmonary system; and, finally, (c) reproduce the attempt of breathing of a preterm infant. Taking into account both the limitation due to the chosen application field and the preliminary autocalibration phase automatically carried out by the proposed device, accuracy and reliability on the order of 1% is estimated. The previously indicated value has to be considered satisfactory in light of the field of application and the small values of the simulated parameters. 
Finally, the achieved metrological characteristics allow the described neonatal simulator to be adopted as a reference device to test performances of neonatal ventilators and, more specifically, to measure the time elapsed between the occurrence of a potentially dangerous condition to the patient and the activation of the corresponding alarm of the tested ventilator", "keyphrases": ["preterm respiratory mechanics active simulator", "neonatal pulmonary ventilators", "patient active simulator", "healthy newborns", "preterm pathological infants", "apnea times", "autocalibration phase", "accuracy", "reliability", "respiratory diseases", "ventilatory support", "intensive care equipment", "electronic unit", "pneumatic/mechanical unit", "software control", "double compartment model", "artificial trachea", "pressure transducer", "variable clamp resistance", "upper airway resistance", "compliance"], "prmu": ["P", "P", "P", "P", "P", "R", "P", "P", "P", "M", "U", "U", "U", "U", "M", "U", "U", "U", "U", "U", "U"]} +{"id": "1912", "title": "A novel preterm respiratory mechanics active simulator to test the performances of neonatal pulmonary ventilators", "abstract": "A patient active simulator is proposed which is capable of reproducing values of the parameters of pulmonary mechanics of healthy newborns and preterm pathological infants. The implemented prototype is able to: (a) let the operator choose the respiratory pattern, times of apnea, episodes of cough, sobs, etc., (b) continuously regulate and control the parameters characterizing the pulmonary system; and, finally, (c) reproduce the attempt of breathing of a preterm infant. Taking into account both the limitation due to the chosen application field and the preliminary autocalibration phase automatically carried out by the proposed device, accuracy and reliability on the order of 1% is estimated. 
The previously indicated value has to be considered satisfactory in light of the field of application and the small values of the simulated parameters. Finally, the achieved metrological characteristics allow the described neonatal simulator to be adopted as a reference device to test performances of neonatal ventilators and, more specifically, to measure the time elapsed between the occurrence of a potentially dangerous condition to the patient and the activation of the corresponding alarm of the tested ventilator", "keyphrases": ["preterm respiratory mechanics active simulator", "neonatal pulmonary ventilators", "patient active simulator", "healthy newborns", "preterm pathological infants", "apnea times", "autocalibration phase", "accuracy", "reliability", "respiratory diseases", "ventilatory support", "intensive care equipment", "electronic unit", "pneumatic/mechanical unit", "software control", "double compartment model", "artificial trachea", "pressure transducer", "variable clamp resistance", "upper airway resistance", "compliance"], "prmu": ["P", "P", "P", "P", "P", "R", "P", "P", "P", "M", "U", "U", "U", "M", "M", "U", "U", "U", "U", "U", "U"]} {"id": "190", "title": "On the design of gain-scheduled trajectory tracking controllers [AUV application]", "abstract": "A new methodology is proposed for the design of trajectory tracking controllers for autonomous vehicles. The design technique builds on gain scheduling control theory. An application is made to the design of a trajectory tracking controller for a prototype autonomous underwater vehicle (AUV). 
The effectiveness and advantages of the new control laws derived are illustrated in simulation using a full set of non-linear equations of motion of the vehicle", "keyphrases": ["gain-scheduled trajectory tracking controller design", "autonomous vehicles", "gain scheduling control theory", "autonomous underwater vehicle", "control laws", "nonlinear equations of motion"], "prmu": ["R", "P", "P", "P", "P", "M"]} {"id": "1639", "title": "New hub gears up for algorithmic exchange", "abstract": "Warwick University in the UK is on the up and up. Sometimes considered a typical 1960s, middle-of-the-road redbrick institution-not known for their distinction the 2001 UK Research Assessment Exercise (RAE) shows its research to be the fifth most highly-rated in the country, with outstanding standards in the sciences. This impressive performance has rightly given Warwick a certain amount of muscle, which it is flexing rather effectively, aided by a snappy approach to making things happen that leaves some older institutions standing. The result is a brand new Centre for Scientific Computing (CSC), launched within a couple of years of its initial conception", "keyphrases": ["Warwick University Centre for Scientific Computing"], "prmu": ["R"]} {"id": "1641", "title": "Development through gaming", "abstract": "Mainstream observers commonly underestimate the role of fringe activities in propelling science and technology. Well-known examples are how wars have fostered innovation in areas such as communications, cryptography, medicine and aerospace; and how erotica has been a major factor in pioneering visual media, from the first printed books to photography, cinematography, videotape, or the latest online video streaming. 
The article aims to be a sampler of a less controversial, but still often underrated, symbiosis between scientific computing and computing for leisure and entertainment", "keyphrases": ["computer games", "scientific computing", "leisure", "entertainment", "graphics"], "prmu": ["R", "P", "P", "P", "U"]} @@ -200,11 +200,11 @@ {"id": "1619", "title": "Rate allocation for video transmission over lossy correlated networks", "abstract": "A novel rate allocation algorithm for video transmission over lossy networks subject to bursty packet losses is presented. A Gilbert-Elliot model is used at the encoder to drive the selection of coding parameters. Experimental results using the H.26L test model show a significant performance improvement with respect to the assumption of independent packet losses", "keyphrases": ["rate allocation algorithm", "video transmission", "lossy correlated networks", "bursty packet losses", "Gilbert-Elliot model", "coding parameters", "H.26L test model", "video coding"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R"]} {"id": "1704", "title": "Statistical analysis of nonlinearly reconstructed near-infrared tomographic images. I. Theory and simulations", "abstract": "Near-infrared (NIR) diffuse tomography is an emerging method for imaging the interior of tissues to quantify concentrations of hemoglobin and exogenous chromophores noninvasively in vivo. It often exploits an optical diffusion model-based image reconstruction algorithm to estimate spatial property values from measurements of the light flux at the surface of the tissue. In this study, mean-squared error (MSE) over the image is used to evaluate methods for regularizing the ill-posed inverse image reconstruction problem in NIR tomography. Estimates of image bias and image standard deviation were calculated based upon 100 repeated reconstructions of a test image with randomly distributed noise added to the light flux measurements. 
It was observed that the bias error dominates at high regularization parameter values while variance dominates as the algorithm is allowed to approach the optimal solution. This optimum does not necessarily correspond to the minimum projection error solution, but typically requires further iteration with a decreasing regularization parameter to reach the lowest image error. Increasing measurement noise causes a need to constrain the minimum regularization parameter to higher values in order to achieve a minimum in the overall image MSE", "keyphrases": ["medical diagnostic imaging", "hemoglobin", "oxygen saturation", "photon migration", "optical diffusion model-based image reconstruction algorithm", "decreasing regularization parameter", "lowest image error", "minimum regularization parameter constraint", "bias error", "optimal solution", "light flux", "mean-squared error", "ill-posed inverse image reconstruction problem regularization", "spatial property values estimation", "test image", "randomly distributed noise", "O/sub 2/"], "prmu": ["M", "P", "U", "U", "P", "P", "P", "M", "P", "P", "P", "P", "R", "R", "P", "P", "U"]} {"id": "1741", "title": "The top cycle and uncovered solutions for weak tournaments", "abstract": "We study axiomatic properties of the top cycle and uncovered solutions for weak tournaments. Subsequently, we establish its connection with the rational choice theory", "keyphrases": ["top cycle", "uncovered solutions", "weak tournaments", "axiomatic properties", "rational choice theory"], "prmu": ["P", "P", "P", "P", "P"]} -{"id": "1897", "title": "User-appropriate tyre-modelling for vehicle dynamics in standard and limit situations", "abstract": "When modelling vehicles for the vehicle dynamic simulation, special attention must be paid to the modelling of tyre forces and -torques, according to their dominant influence on the results. 
This task is not only about sufficiently exact representation of the effective forces but also about user-friendly and practical relevant applicability, especially when the experimental tyre-input-data is incomplete or missing. This text firstly describes the basics of the vehicle dynamic tyre model, conceived to be a physically based, semi-empirical model for application in connection with multi-body-systems (MBS). On the basis of tyres for a passenger car and a heavy truck the simulated steady state tyre characteristics are shown together and compared with the underlying experimental values. The possibility to link the tyre model TMeasy to any MBS-program is described, as far as it supports the 'Standard Tyre Interface'. As an example, the simulated and experimental data of a heavy truck doing a standardized driving manoeuvre are compared", "keyphrases": ["tyre modelling", "vehicle dynamics", "standard situations", "limit situations", "tyre torques", "semi-empirical model", "multi-body-systems", "passenger car", "heavy truck", "simulated steady state tyre characteristics", "TMeasy", "Standard Tyre Interface", "standardized driving manoeuvre"], "prmu": ["P", "P", "R", "P", "M", "P", "P", "P", "P", "P", "P", "R", "P"]} +{"id": "1897", "title": "User-appropriate tyre-modelling for vehicle dynamics in standard and limit situations", "abstract": "When modelling vehicles for the vehicle dynamic simulation, special attention must be paid to the modelling of tyre forces and -torques, according to their dominant influence on the results. This task is not only about sufficiently exact representation of the effective forces but also about user-friendly and practical relevant applicability, especially when the experimental tyre-input-data is incomplete or missing. This text firstly describes the basics of the vehicle dynamic tyre model, conceived to be a physically based, semi-empirical model for application in connection with multi-body-systems (MBS). 
On the basis of tyres for a passenger car and a heavy truck the simulated steady state tyre characteristics are shown together and compared with the underlying experimental values. The possibility to link the tyre model TMeasy to any MBS-program is described, as far as it supports the 'Standard Tyre Interface'. As an example, the simulated and experimental data of a heavy truck doing a standardized driving manoeuvre are compared", "keyphrases": ["tyre modelling", "vehicle dynamics", "standard situations", "limit situations", "tyre torques", "semi-empirical model", "multi-body-systems", "passenger car", "heavy truck", "simulated steady state tyre characteristics", "TMeasy", "Standard Tyre Interface", "standardized driving manoeuvre"], "prmu": ["P", "P", "R", "P", "M", "P", "P", "P", "P", "P", "P", "P", "P"]} {"id": "1916", "title": "Changes in the entropy and the Tsallis difference information during spontaneous decay and self-organization of nonextensive systems", "abstract": "A theoretical-information description of self-organization processes during stimulated transitions between stationary states of open nonextensive systems is presented. S/sub q/- and I/sub q/-theorems on changes of the entropy and Tsallis difference information measures in the process of evolution in the space of control parameters are proved. 
The entropy and the Tsallis difference information are derived and their new extreme properties are discussed", "keyphrases": ["entropy", "Tsallis difference information", "spontaneous decay", "self-organization", "nonextensive systems", "stimulated transitions", "information measures", "control parameters", "nonextensive statistical mechanics"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "M"]} {"id": "1584", "title": "Content all clear [workflow & content management]", "abstract": "Graeme Muir of SchlumbergerSema cuts through the confusion between content, document and records management", "keyphrases": ["SchlumbergerSema", "content management", "document management", "records management"], "prmu": ["P", "P", "R", "P"]} {"id": "1678", "title": "Parallel interior point schemes for solving multistage convex programming", "abstract": "The predictor-corrector interior-point path-following algorithm is promising in solving multistage convex programming problems. Among many other general good features of this algorithm, especially attractive is that the algorithm allows the possibility to parallelise the major computations. The dynamic structure of the multistage problems specifies a block-tridiagonal system at each Newton step of the algorithm. 
A wrap-around permutation is then used to implement the parallel computation for this step", "keyphrases": ["parallel interior point schemes", "multistage convex programming", "predictor-corrector interior-point path-following algorithm", "dynamic structure", "block-tridiagonal system", "Newton step", "wrap-around permutation", "parallel computation"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} -{"id": "1685", "title": "Use of web technologies in construction project management: what are the critical success/failure factors?", "abstract": "A concept of how the World Wide Web (WWW) and its associated technologies can be used to manage construction projects has been recognized by practitioners in the construction industry for quite sometime. This concept is often referred to as a Web-Based Project Management System (WPMS). It promises, to enhance construction project documentation and control, and to revolutionize the way construction project teams process and transmit project information. WPMS is an electronic project-management system conducted through the Internet. The system provides a centralized, commonly accessible, reliable means of transmitting and storing project information. Project information is stored on the server and a standard Web browser is used as the gateway to exchange this information, eliminating geographic and hardware platforms boundary", "keyphrases": ["Web-Based Project Management System", "construction industry", "project documentation", "project control", "success", "implementation", "Web browser"], "prmu": ["P", "P", "P", "R", "U", "U", "P"]} +{"id": "1685", "title": "Use of web technologies in construction project management: what are the critical success/failure factors?", "abstract": "A concept of how the World Wide Web (WWW) and its associated technologies can be used to manage construction projects has been recognized by practitioners in the construction industry for quite sometime. 
This concept is often referred to as a Web-Based Project Management System (WPMS). It promises, to enhance construction project documentation and control, and to revolutionize the way construction project teams process and transmit project information. WPMS is an electronic project-management system conducted through the Internet. The system provides a centralized, commonly accessible, reliable means of transmitting and storing project information. Project information is stored on the server and a standard Web browser is used as the gateway to exchange this information, eliminating geographic and hardware platforms boundary", "keyphrases": ["Web-Based Project Management System", "construction industry", "project documentation", "project control", "success", "implementation", "Web browser"], "prmu": ["P", "P", "P", "R", "P", "U", "P"]} {"id": "169", "title": "MRP in a job shop environment using a resource constrained project scheduling model", "abstract": "One of the most difficult tasks in a job shop manufacturing environment is to balance schedule and capacity in an ongoing basis. MRP systems are commonly used for scheduling, although their inability to deal with capacity constraints adequately is a severe drawback. In this study, we show that material requirements planning can be done more effectively in a job shop environment using a resource constrained project scheduling model. The proposed model augments MRP models by incorporating capacity constraints and using variable lead time lengths. The efficacy of this approach is tested on MRP systems by comparing the inventory carrying costs and resource allocation of the solutions obtained by the proposed model to those obtained by using a traditional MRP model. 
In general, it is concluded that the proposed model provides improved schedules with considerable reductions in inventory carrying costs", "keyphrases": ["job shop environment", "MRP", "resource constrained project scheduling model", "material requirements planning", "scheduling", "capacity constraints", "variable lead time lengths", "inventory carrying costs", "resource allocation", "project management"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "M"]} {"id": "1798", "title": "Robustness evaluation of a minimal RBF neural network for nonlinear-data-storage-channel equalisation", "abstract": "The authors present a performance-robustness evaluation of the recently developed minimal resource allocation network (MRAN) for equalisation in highly nonlinear magnetic recording channels in disc storage systems. Unlike communication systems, equalisation of signals in these channels is a difficult problem, as they are corrupted by data-dependent noise and highly nonlinear distortions. Nair and Moon (1997) have proposed a maximum signal to distortion ratio (MSDR) equaliser for data storage channels, which uses a specially designed neural network, where all the parameters of the neural network are determined theoretically, based on the exact knowledge of the channel model parameters. In the present paper, the performance of the MSDR equaliser is compared with that of the MRAN equaliser using a magnetic recording channel model, under Conditions that include variations in partial erasure, jitter, width and noise power, as well as model mismatch. 
Results from the study indicate that the less complex MRAN equaliser gives consistently better performance robustness than the MSDR equaliser in terms of signal to distortion ratios (SDRs)", "keyphrases": ["robustness evaluation", "minimal resource allocation network", "highly nonlinear magnetic recording channels", "disc storage systems", "nonlinear-data-storage-channel equalisation", "data-dependent noise", "highly nonlinear distortions", "maximum signal to distortion ratio equaliser", "RBF neural network", "MRAN equaliser", "MSDR equaliser", "digital magnetic recording", "jitter noise"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "P", "M", "R"]} {"id": "1464", "title": "LR parsing for conjunctive grammars", "abstract": "The generalized LR parsing algorithm for context-free grammars, introduced by Tomita in 1986, is a polynomial-time implementation of nondeterministic LR parsing that uses graph-structured stack to represent the contents of the nondeterministic parser's pushdown for all possible branches of computation at a single computation step. It has been specifically developed as a solution for practical parsing tasks arising in computational linguistics, and indeed has proved itself to be very suitable for natural language processing. Conjunctive grammars extend context-free grammars by allowing the use of an explicit intersection operation within grammar rules. This paper develops a new LR-style parsing algorithm for these grammars, which is based on the very same idea of a graph-structured pushdown, where the simultaneous existence of several paths in the graph is used to perform the mentioned intersection operation. 
The underlying finite automata are treated in the most general way: instead of showing the algorithm's correctness for some particular way of constructing automata, the paper defines a wide class of automata usable with a given grammar, which includes not only the traditional LR(k) automata, but also, for instance, a trivial automaton with a single reachable state. A modification of the SLR(k) table construction method that makes use of specific properties of conjunctive grammars is provided as one possible way of making finite automata to use with the algorithm", "keyphrases": ["conjunctive grammars", "generalized LR parsing algorithm", "graph-structured stack", "nondeterministic parser pushdown", "computation", "computational linguistics", "natural language processing", "context-free grammars", "explicit intersection operation", "grammar rules", "finite automata", "trivial automaton", "single reachable state", "Boolean closure", "deterministic context-free languages"], "prmu": ["P", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U", "M"]} @@ -226,7 +226,7 @@ {"id": "156", "title": "Using extended logic programming for alarm-correlation in cellular phone networks", "abstract": "Alarm correlation is a necessity in large mobile phone networks, where the alarm bursts resulting from severe failures would otherwise overload the network operators. We describe how to realize alarm-correlation in cellular phone networks using extended logic programming. 
To this end, we describe an algorithm and system solving the problem, a model of a mobile phone network application, and a detailed solution for a specific scenario", "keyphrases": ["extended logic programming", "alarm-correlation", "cellular phone networks", "large mobile phone networks", "network operators", "fault diagnosis"], "prmu": ["P", "P", "P", "P", "P", "U"]} {"id": "1546", "title": "Necessary conditions of optimality for impulsive systems on Banach spaces", "abstract": "We present necessary conditions of optimality for optimal control problems arising in systems governed by impulsive evolution equations on Banach spaces. Basic notations and terminologies are first presented and necessary conditions of optimality are presented. Special cases are discussed and we present an application to the classical linear quadratic regulator problem", "keyphrases": ["linear quadratic regulator", "optimality", "impulsive systems", "optimal control", "impulsive evolution equations", "Banach spaces", "necessary conditions"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} {"id": "1503", "title": "Neural networks for web content filtering", "abstract": "With the proliferation of harmful Internet content such as pornography, violence, and hate messages, effective content-filtering systems are essential. Many Web-filtering systems are commercially available, and potential users can download trial versions from the Internet. However, the techniques these systems use are insufficiently accurate and do not adapt well to the ever-changing Web. To solve this problem, we propose using artificial neural networks to classify Web pages during content filtering. We focus on blocking pornography because it is among the most prolific and harmful Web content. 
However, our general framework is adaptable for filtering other objectionable Web material", "keyphrases": ["artificial neural networks", "Intelligent Classification Engine", "learning capabilities", "pornographic/nonpornographic Web page differentiation", "Web content filtering", "violence", "Web page classification", "harmful Web content"], "prmu": ["P", "U", "U", "M", "P", "P", "M", "P"]} -{"id": "1687", "title": "Cleared for take-off [Hummingbird Enterprise]", "abstract": "A recent Gartner report identifies Hummingbird in the first wave of vendors as an early example of convergence in the 'smart enterprise suite' market. We spoke to Hummingbird's Marketing Director for Northern Europe", "keyphrases": ["smart enterprise suite", "Hummingbird Enterprise", "information content", "knowledge content", "collaboration"], "prmu": ["M", "P", "U", "U", "U"]} +{"id": "1687", "title": "Cleared for take-off [Hummingbird Enterprise]", "abstract": "A recent Gartner report identifies Hummingbird in the first wave of vendors as an early example of convergence in the 'smart enterprise suite' market. We spoke to Hummingbird's Marketing Director for Northern Europe", "keyphrases": ["smart enterprise suite", "Hummingbird Enterprise", "information content", "knowledge content", "collaboration"], "prmu": ["P", "P", "U", "U", "U"]} {"id": "1914", "title": "Vacuum-compatible vibration isolation stack for an interferometric gravitational wave detector TAMA300", "abstract": "Interferometric gravitational wave detectors require a large degree of vibration isolation. For this purpose, a multilayer stack constructed of rubber and metal blocks is suitable, because it provides isolation in all degrees of freedom at once. In TAMA300, a 300 m interferometer in Japan, long-term dimensional stability and compatibility with an ultrahigh vacuum environment of about 10/sup -6/ Pa are also required. 
To keep the interferometer at its operating point despite ground strain and thermal drift of the isolation system, a thermal actuator was introduced. To prevent the high outgassing rate of the rubber from spoiling the vacuum, the rubber blocks were enclosed by gas-tight bellows. Using these techniques, we have successfully developed a three-layer stack which has a vibration isolation ratio of more than 10/sup 3/ at 300 Hz with control of drift and enough vacuum compatibility", "keyphrases": ["vibration isolation stack", "TAMA300 interferometer", "interferometric gravitational wave detectors", "rubber blocks", "multilayer stack", "metal blocks", "long-term dimensional stability", "ultrahigh vacuum environment", "operating point", "ground strain", "thermal drift", "thermal actuator", "gas-tight bellows", "rubber outgassing", "vacuum compatibility", "300 m", "10/sup -6/ Pa", "300 Hz"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "P", "P"]} {"id": "1767", "title": "Bivariate fractal interpolation functions on rectangular domains", "abstract": "Non-tensor product bivariate fractal interpolation functions defined on gridded rectangular domains are constructed. Linear spaces consisting of these functions are introduced. The relevant Lagrange interpolation problem is discussed. A negative result about the existence of affine fractal interpolation functions defined on such domains is obtained", "keyphrases": ["bivariate fractal interpolation functions", "rectangular domains", "gridded rectangular domains", "linear spaces", "Lagrange interpolation problem", "affine fractal interpolation functions"], "prmu": ["P", "P", "P", "P", "P", "P"]} {"id": "1809", "title": "Approach to adaptive neural net-based H/sub infinity / control design", "abstract": "An approach is investigated for the adaptive neural net-based H/sub infinity / control design of a class of nonlinear uncertain systems. 
In the proposed framework, two multilayer feedforward neural networks are constructed as an alternative to approximate the nonlinear system. The neural networks are piecewisely interpolated to generate a linear differential inclusion model by which a linear state feedback H/sub infinity / control law can be applied. An adaptive weight adjustment mechanism for the multilayer feedforward neural networks is developed to ensure H/sub infinity / regulation performance. It is shown that finding the control gain matrices can be transformed into a standard linear matrix inequality problem and solved via a developed recurrent neural network", "keyphrases": ["adaptive neural net-based H/sub infinity / control design", "nonlinear uncertain systems", "multilayer feedforward neural networks", "piecewise interpolation", "linear differential inclusion model", "linear state feedback", "control gain matrices", "linear matrix inequality problem", "recurrent neural network", "LMI"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "U"]} @@ -260,7 +260,7 @@ {"id": "1821", "title": "Performance, design and control of a series-parallel (CL/sup 2/-type) resonant DC/DC converter", "abstract": "The three-element resonant network has various topological alternatives, one of which, a prospective compound topology, is investigated in detail. The converter uses one capacitor (C) and two inductors (L/sup 2/), to form a compound type CL/sup 2/ network. Various advantages and limitations of the converter are detailed, and a new design procedure for such converters is also introduced. The converter may be controlled by varying the switching frequency or by pulse-width modulation. 
An experimental prototype has been produced and an excellent performance in the lagging power-factor mode has been confirmed", "keyphrases": ["series-parallel resonant DC/DC power converter", "three-element resonant network", "capacitor", "inductors", "design procedure", "switching frequency", "pulse-width modulation", "performance", "lagging power factor mode"], "prmu": ["M", "P", "P", "P", "P", "P", "P", "P", "M"]} {"id": "1864", "title": "Cultural differences in developers' perceptions of information systems success factors: Japan vs. the United States", "abstract": "The study examined the perceptions of information systems (IS) developers from Japan and the United States regarding the strategies that are considered most important for successful implementation of an IS. The results of principal component analysis revealed that the IS strategies could be reduced to five components: (1) characteristics of the team members, (2) characteristics of the project leader, (3) management/user input, (4) proper technology, and (5) communication. The results indicated that there was a significant difference in the perceptions of Japanese and US developers with respect to the importance of the five components. Japanese developers perceived the project leader as the most crucial component for determining the success of an IS project. Team member characteristics was viewed as the least important by Japanese developers. On the other hand, developers from the US viewed communications as the most critical component. Project leader characteristics were perceived to be the least important by US developers. 
The results were discussed in terms of cultural differences", "keyphrases": ["cultural differences", "information systems success factors", "information systems developer perceptions", "Japan", "United States", "principal component analysis", "team member characteristics", "project leader characteristics", "management/user input", "proper technology", "communication", "IS project"], "prmu": ["P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} {"id": "1532", "title": "Dedekind zeta-functions and Dedekind sums", "abstract": "In this paper we use Dedekind zeta functions of two real quadratic number fields at -1 to denote Dedekind sums of high rank. Our formula is different from that of Siegel's (1969). As an application, we get a polynomial representation of zeta /sub K/(-1) = zeta /sub K/(-1) = 1/45(26n/sup 3/ - 41n +or- 9), n identical to +or-2(mod 5), where K = Q( square root (5q)), prime q = 4n/sup 2/ + 1, and the class number of quadratic number field K/sub 2/ = Q( square root q) is 1", "keyphrases": ["Dedekind zeta-functions", "Dedekind sums", "real quadratic number fields", "polynomial representation"], "prmu": ["P", "P", "P", "P"]} -{"id": "167", "title": "Business school research: bridging the gap between producers and consumers", "abstract": "There has been a great deal of continuing discussion concerning the seemingly unbridgeable gap between so much of the research produced by business school professors and the needs of the business people who, ideally, would use it. Here, we examine this gap and suggest a model for bridging it. We sample four groups of people, business school academics (professors), deans of business schools, executive MBA students/recent graduates, and senior business executives. Each group rates 44 different (potential) properties of exemplary research. We analyze within-group differences, and more meaningfully, between-group differences. 
We then offer commentary on the results and use the results to develop the aforementioned suggestions for bridging the gap we find", "keyphrases": ["business school", "producers", "consumers", "professors", "business people", "academics", "deans", "students", "recent graduates", "senior business executives", "exemplary research", "within-group differences", "between-group differences", "ANOVA", "coefficient of concordance", "multiple comparison testing"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "U", "M", "P", "P", "P", "P", "U", "M", "U"]} +{"id": "167", "title": "Business school research: bridging the gap between producers and consumers", "abstract": "There has been a great deal of continuing discussion concerning the seemingly unbridgeable gap between so much of the research produced by business school professors and the needs of the business people who, ideally, would use it. Here, we examine this gap and suggest a model for bridging it. We sample four groups of people, business school academics (professors), deans of business schools, executive MBA students/recent graduates, and senior business executives. Each group rates 44 different (potential) properties of exemplary research. We analyze within-group differences, and more meaningfully, between-group differences. 
We then offer commentary on the results and use the results to develop the aforementioned suggestions for bridging the gap we find", "keyphrases": ["business school", "producers", "consumers", "professors", "business people", "academics", "deans", "students", "recent graduates", "senior business executives", "exemplary research", "within-group differences", "between-group differences", "ANOVA", "coefficient of concordance", "multiple comparison testing"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "U", "M", "U"]} {"id": "1918", "title": "Negotiating the semantics of agent communication languages", "abstract": "This article presents a formal framework and outlines a method that autonomous agents can use to negotiate the semantics of their communication language at run-time. Such an ability is needed in open multi-agent systems so that agents can ensure they understand the implications of the utterances that are being made and so that they can tailor the meaning of the primitives to best fit their prevailing circumstances. To this end, the semantic space framework provides a systematic means of classifying the primitives along multiple relevant dimensions. This classification can then be used by the agents to structure their negotiation (or semantic fixing) process so that they converge to the mutually agreeable semantics that are necessary for coherent social interactions", "keyphrases": ["autonomous agents", "multi-agent systems", "communication language", "semantic fixing", "semantic space", "social interactions"], "prmu": ["P", "P", "P", "P", "P", "P"]} {"id": "1633", "title": "48 Gbit/s InP DHBT MS-DFF with very low time jitter", "abstract": "A master-slave D-type flip-flop (MS DFF) fabricated in a self-aligned InP DHBT technology is presented. The packaged circuit shows full-rate clock operation at 48 Gbit/s. Very low time jitter and good retiming capabilities are observed. 
Layout aspects, packaging and measurement issues are discussed in particular", "keyphrases": ["DHBT MS-DFF", "InP DHBT technology", "low time jitter", "master-slave D-type flip-flop", "self-aligned DHBT technology", "packaged circuit", "retiming capabilities", "layout aspects", "48 Gbit/s", "InP"], "prmu": ["P", "P", "P", "P", "R", "P", "P", "P", "P", "P"]} {"id": "1676", "title": "An optimization based approach to the train operator scheduling problem at Singapore MRT", "abstract": "Singapore Mass Rapid Transit (SMRT) operates two train lines with 83 kilometers of track and 48 stations. A total of 77 trains are in operation during peak hours and 41 during off-peak hours. We report on an optimization based approach to develop a computerized train-operator scheduling system that has been implemented at SMRT. The approach involves a bipartite matching algorithm for the generation of night duties and a tabu search algorithm for the generation of day duties. The system automates the train-operator scheduling process at SMRT and produces favorable schedules in comparison with the manual process. It is also able to handle the multiple objectives inherent in the crew scheduling system. While trying to minimize the system wide crew-related costs, the system is also able to address concern with respect to the number of split duties", "keyphrases": ["optimization based approach", "Singapore Mass Rapid Transit", "computerized train-operator scheduling system", "bipartite matching algorithm", "night duties", "tabu search algorithm", "day duties", "crew scheduling system"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} @@ -307,28 +307,28 @@ {"id": "1628", "title": "Quasi-Newton algorithm for adaptive minor component extraction", "abstract": "An adaptive quasi-Newton algorithm is first developed to extract a single minor component corresponding to the smallest eigenvalue of a stationary sample covariance matrix. 
A deflation technique instead of the commonly used inflation method is then applied to extract the higher-order minor components. The algorithm enjoys the advantage of having a simpler computational complexity and a highly modular and parallel structure for efficient implementation. Simulation results are given to demonstrate the effectiveness of the proposed algorithm for extracting multiple minor components adaptively", "keyphrases": ["quasi-Newton algorithm", "adaptive minor component extraction", "eigenvalue", "stationary sample covariance matrix", "deflation technique", "higher-order minor components", "computational complexity", "modular structure", "parallel structure", "simulation results", "adaptive estimation", "DOA estimation", "ROOT-MUSIC estimator"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "M", "U", "U"]} {"id": "1529", "title": "Quantized-State Systems: A DEVS-approach for continuous system simulation", "abstract": "A new class of dynamical systems, Quantized State Systems or QSS, is introduced in this paper. QSS are continuous time systems where the input trajectories are piecewise constant functions and the state variable trajectories - being themselves piecewise linear functions - are converted into piecewise constant functions via a quantization function equipped with hysteresis. It is shown that QSS can be exactly represented and simulated by a discrete event model, within the framework of the DEVS-approach. Further, it is shown that QSS can be used to approximate continuous systems, thus allowing their discrete-event simulation in opposition to the classical discrete-time simulation. 
It is also shown that in an approximating QSS, some stability properties of the original system are conserved and the solutions of the QSS go to the solutions of the original system when the quantization goes to zero", "keyphrases": ["dynamical systems", "Quantized State Systems", "continuous time systems", "piecewise constant functions", "discrete event model", "discrete-event simulation"], "prmu": ["P", "P", "P", "P", "P", "P"]} {"id": "1827", "title": "Gossip is synteny: Incomplete gossip and the syntenic distance between genomes", "abstract": "The syntenic distance between two genomes is given by the minimum number of fusions, fissions, and translocations required to transform one into the other, ignoring the order of genes within chromosomes. Computing this distance is NP-hard. In the present work, we give a tight connection between syntenic distance and the incomplete gossip problem, a novel generalization of the classical gossip problem. In this problem, there are n gossipers, each with a unique piece of initial information; they communicate by phone calls in which the two participants exchange all their information. The goal is to minimize the total number of phone calls necessary to inform each gossiper of his set of relevant gossip which he desires to learn. As an application of the connection between syntenic distance and incomplete gossip, we derive an O(2/sup O(n log n)/) algorithm to exactly compute the syntenic distance between two genomes with at most n chromosomes each. 
Our algorithm requires O(n/sup 2/+2/sup O(d log d)/) time when this distance is d, improving the O(n/sup 2/+2(O(d//sup 2/))) running time of the best previous exact algorithm", "keyphrases": ["syntenic distance", "genomes", "NP-hard", "incomplete gossip problem", "comparative genomics", "running time", "chromosomes"], "prmu": ["P", "P", "P", "P", "M", "P", "P"]} -{"id": "1862", "title": "Global comparison of stages of growth based on critical success factors", "abstract": "With increasing globalization of business, the management of IT in international organizations is faced with the complex task of dealing with the difference between local and international IT needs. This study evaluates, and compares, the level of IT maturity and the critical success factors (CSFs) in selected geographic regions, namely, Norway, Australia/New Zealand, North America, Europe, Asia/Pacific, and India. The results show that significant differences in the IT management needs in these geographic regions exist, and that the IT management operating in these regions must balance the multiple critical success factors for achieving an optimal local-global mix for business success", "keyphrases": ["business globalization", "IT management", "international IT needs", "local IT needs", "IT maturity", "critical success factors", "Norway", "Australia", "New Zealand", "North America", "Europe", "Asia/Pacific", "India", "optimal local-global mix", "business success"], "prmu": ["R", "P", "P", "R", "P", "P", "P", "U", "M", "P", "P", "P", "P", "P", "P"]} +{"id": "1862", "title": "Global comparison of stages of growth based on critical success factors", "abstract": "With increasing globalization of business, the management of IT in international organizations is faced with the complex task of dealing with the difference between local and international IT needs. 
This study evaluates, and compares, the level of IT maturity and the critical success factors (CSFs) in selected geographic regions, namely, Norway, Australia/New Zealand, North America, Europe, Asia/Pacific, and India. The results show that significant differences in the IT management needs in these geographic regions exist, and that the IT management operating in these regions must balance the multiple critical success factors for achieving an optimal local-global mix for business success", "keyphrases": ["business globalization", "IT management", "international IT needs", "local IT needs", "IT maturity", "critical success factors", "Norway", "Australia", "New Zealand", "North America", "Europe", "Asia/Pacific", "India", "optimal local-global mix", "business success"], "prmu": ["R", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} {"id": "1749", "title": "Advanced aerostatic stability analysis of cable-stayed bridges using finite-element method", "abstract": "Based on the concept of limit point instability, an advanced nonlinear finite-element method that can be used to analyze the aerostatic stability of cable-stayed bridges is proposed. Both geometric nonlinearity and three components of wind loads are considered in this method. The example bridge is the second Santou Bay cable-stayed bridge with a main span length of 518 m built in China. Aerostatic stability of the example bridge is investigated using linear and proposed methods. The effect of pitch moment coefficient on the aerostatic stability of the bridge has been studied. The results show that the aerostatic instability analyses of cable-stayed bridges based on the linear method considerably overestimate the wind-resisting capacity of cable-stayed bridges. The proposed method is highly accurate and efficient. Pitch moment coefficient has a major effect on the aerostatic stability of cable-stayed bridges. 
Finally, the aerostatic failure mechanism of cable-stayed bridges is explained by tracing the aerostatic instability path", "keyphrases": ["limit point instability", "advanced nonlinear finite element method", "advanced aerostatic stability analysis", "cable-stayed bridges", "geometric nonlinearity", "wind loads", "Santou Bay cable-stayed bridge", "China", "pitch moment coefficient", "aerostatic failure mechanism"], "prmu": ["P", "M", "P", "P", "P", "P", "P", "P", "P", "P"]} -{"id": "1611", "title": "Data mining business intelligence for competitive advantage", "abstract": "Organizations have lately realized that just processing transactions and/or information faster and more efficiently no longer provides them with a competitive advantage vis-a-vis their competitors for achieving business excellence. Information technology (IT) tools that are oriented towards knowledge processing can provide the edge that organizations need to survive and thrive in the current era of fierce competition. Enterprises are no longer satisfied with business information system(s); they require business intelligence system(s). The increasing competitive pressures and the desire to leverage information technology techniques have led many organizations to explore the benefits of new emerging technology, data warehousing and data mining. The paper discusses data warehouses and data mining tools and applications", "keyphrases": ["business intelligence", "competitive advantage", "organizations", "information technology", "knowledge processing", "business information system", "data warehouses", "data mining"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1611", "title": "Data mining business intelligence for competitive advantage", "abstract": "Organizations have lately realized that just processing transactions and/or information faster and more efficiently no longer provides them with a competitive advantage vis-a-vis their competitors for achieving business excellence. 
Information technology (IT) tools that are oriented towards knowledge processing can provide the edge that organizations need to survive and thrive in the current era of fierce competition. Enterprises are no longer satisfied with business information system(s); they require business intelligence system(s). The increasing competitive pressures and the desire to leverage information technology techniques have led many organizations to explore the benefits of new emerging technology, data warehousing and data mining. The paper discusses data warehouses and data mining tools and applications", "keyphrases": ["business intelligence", "competitive advantage", "organizations", "information technology", "knowledge processing", "business information system", "data warehouses", "data mining"], "prmu": ["P", "P", "P", "P", "P", "M", "P", "P"]} {"id": "1654", "title": "Numerical validation of solutions of complementarity problems: the nonlinear case", "abstract": "This paper proposes a validation method for solutions of nonlinear complementarity problems. The validation procedure performs a computational test. If the result of the test is positive, then it is guaranteed that a given multi-dimensional interval either includes a solution or excludes all solutions of the nonlinear complementarity problem", "keyphrases": ["numerical validation", "computational test", "nonlinear complementarity problem", "optimization"], "prmu": ["P", "P", "P", "U"]} {"id": "17", "title": "Fault diagnosis and fault tolerant control of linear stochastic systems with unknown inputs", "abstract": "This paper presents an integrated robust fault detection and isolation (FDI) and fault tolerant control (FTC) scheme for a fault in actuators or sensors of linear stochastic systems subjected to unknown inputs (disturbances). As usual in this kind of works, it is assumed that single fault occurs at a time and the fault treated is of random bias type. 
The FDI module is constructed using banks of robust two-stage Kalman filters, which simultaneously estimate the state and the fault bias, and generate residual sets decoupled from unknown disturbances. All elements of residual sets are evaluated by using a hypothesis statistical test, and the fault is declared according to the prepared decision logic. The FTC module is activated based on the fault indicator, and additive compensation signal is computed using the fault bias estimate and combined to the nominal control law for compensating the fault's effect on the system. Simulation results for the simplified longitudinal flight control system with parameter variations, process and measurement noises demonstrate the effectiveness of the approach proposed", "keyphrases": ["fault detection", "fault isolation", "fault tolerant control", "linear systems", "stochastic systems", "two-stage Kalman filters", "state estimation", "longitudinal flight control system", "robust control", "discrete-time system"], "prmu": ["P", "R", "P", "R", "P", "P", "R", "P", "R", "M"]} -{"id": "1510", "title": "Estimation of the gradient of the solution of an adjoint diffusion equation by the Monte Carlo method", "abstract": "For the case of isotropic diffusion we consider the representation of the weighted concentration of trajectories and its space derivatives in the form of integrals (with some weights) of the solution to the corresponding boundary value problem and its directional derivative of a convective velocity. If the convective velocity at the domain boundary is degenerate and some other additional conditions are imposed this representation allows us to construct an efficient 'random walk by spheres and balls' algorithm. 
When these conditions are violated, transition to modelling the diffusion trajectories by the Euler scheme is realized, and the directional derivative of velocity is estimated by the dependent testing method, using the parallel modelling of two closely-spaced diffusion trajectories. We succeeded in justifying this method by statistically equivalent transition to modelling a single trajectory after the first step in the Euler scheme, using a suitable weight. This weight also admits direct differentiation with respect to the initial coordinate along a given direction. The resulting weight algorithm for calculating concentration derivatives is especially efficient if the initial point is in the subdomain in which the coefficients of the diffusion equation are constant", "keyphrases": ["isotropic diffusion", "weighted trajectory concentration", "space derivatives", "integrals", "boundary value problem", "directional derivative", "convective velocity", "domain boundary", "gradient estimation", "adjoint diffusion equation", "Monte Carlo method", "random walk by spheres and balls algorithm", "diffusion trajectories", "Euler scheme", "dependent testing method", "parallel modelling", "closely-spaced diffusion trajectories", "statistically equivalent transition", "weight", "direct differentiation", "initial coordinate", "concentration derivatives"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "R", "P", "P", "M", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} +{"id": "1510", "title": "Estimation of the gradient of the solution of an adjoint diffusion equation by the Monte Carlo method", "abstract": "For the case of isotropic diffusion we consider the representation of the weighted concentration of trajectories and its space derivatives in the form of integrals (with some weights) of the solution to the corresponding boundary value problem and its directional derivative of a convective velocity. 
If the convective velocity at the domain boundary is degenerate and some other additional conditions are imposed this representation allows us to construct an efficient 'random walk by spheres and balls' algorithm. When these conditions are violated, transition to modelling the diffusion trajectories by the Euler scheme is realized, and the directional derivative of velocity is estimated by the dependent testing method, using the parallel modelling of two closely-spaced diffusion trajectories. We succeeded in justifying this method by statistically equivalent transition to modelling a single trajectory after the first step in the Euler scheme, using a suitable weight. This weight also admits direct differentiation with respect to the initial coordinate along a given direction. The resulting weight algorithm for calculating concentration derivatives is especially efficient if the initial point is in the subdomain in which the coefficients of the diffusion equation are constant", "keyphrases": ["isotropic diffusion", "weighted trajectory concentration", "space derivatives", "integrals", "boundary value problem", "directional derivative", "convective velocity", "domain boundary", "gradient estimation", "adjoint diffusion equation", "Monte Carlo method", "random walk by spheres and balls algorithm", "diffusion trajectories", "Euler scheme", "dependent testing method", "parallel modelling", "closely-spaced diffusion trajectories", "statistically equivalent transition", "weight", "direct differentiation", "initial coordinate", "concentration derivatives"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "R", "P", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} {"id": "1555", "title": "A note on multi-index polynomials of Dickson type and their applications in quantum optics", "abstract": "We discuss the properties of a new family of multi-index Lucas type polynomials, which are often encountered in problems of intracavity photon statistics. 
We develop an approach based on the integral representation method and show that this class of polynomials can be derived from recently introduced multi-index Hermite like polynomials", "keyphrases": ["Lucas type polynomials", "multi-index polynomials", "quantum optics", "intracavity photon statistics", "integral representation", "generating functions"], "prmu": ["P", "P", "P", "P", "P", "U"]} -{"id": "1694", "title": "Product development: using a 3D computer model to optimize the stability of the Rocket TM powered wheelchair", "abstract": "A three-dimensional (3D) lumped-parameter model of a powered wheelchair was created to aid the development of the Rocket prototype wheelchair and to help explore the effect of innovative design features on its stability. The model was developed using simulation software, specifically Working Model 3D. The accuracy of the model was determined by comparing both its static stability angles and dynamic behavior as it passed down a 4.8-cm (1.9\") road curb at a heading of 45 degrees with the performance of the actual wheelchair. The model's predictions of the static stability angles in the forward, rearward, and lateral directions were within 9.3, 7.1, and 3.8% of the measured values, respectively. The average absolute error in the predicted position of the wheelchair as it moved down the curb was 2.2 cm/m (0.9\" per 3'3\") traveled. The accuracy was limited by the inability to model soft bodies, the inherent difficulties in modeling a statically indeterminate system, and the computing time. Nevertheless, it was found to be useful in investigating the effect of eight design alterations on the lateral stability of the wheelchair. Stability was quantified by determining the static lateral stability angles and the maximum height of a road curb over which the wheelchair could successfully drive on a diagonal heading. 
The model predicted that the stability was more dependent on the configuration of the suspension system than on the dimensions and weight distribution of the wheelchair. Furthermore, for the situations and design alterations studied, predicted improvements in static stability were not correlated with improvements in dynamic stability", "keyphrases": ["3D computer model", "product development", "innovative design features", "suspension system configuration", "dynamic stability improvements", "average absolute error", "predicted position", "soft bodies modeling", "statically indeterminate system", "computing time", "design alterations effect", "diagonal heading", "weight distribution", "Rocket TM powered wheelchair", "4.8 cm"], "prmu": ["P", "P", "P", "R", "R", "P", "P", "R", "P", "P", "R", "P", "P", "P", "U"]} +{"id": "1694", "title": "Product development: using a 3D computer model to optimize the stability of the Rocket TM powered wheelchair", "abstract": "A three-dimensional (3D) lumped-parameter model of a powered wheelchair was created to aid the development of the Rocket prototype wheelchair and to help explore the effect of innovative design features on its stability. The model was developed using simulation software, specifically Working Model 3D. The accuracy of the model was determined by comparing both its static stability angles and dynamic behavior as it passed down a 4.8-cm (1.9\") road curb at a heading of 45 degrees with the performance of the actual wheelchair. The model's predictions of the static stability angles in the forward, rearward, and lateral directions were within 9.3, 7.1, and 3.8% of the measured values, respectively. The average absolute error in the predicted position of the wheelchair as it moved down the curb was 2.2 cm/m (0.9\" per 3'3\") traveled. The accuracy was limited by the inability to model soft bodies, the inherent difficulties in modeling a statically indeterminate system, and the computing time. 
Nevertheless, it was found to be useful in investigating the effect of eight design alterations on the lateral stability of the wheelchair. Stability was quantified by determining the static lateral stability angles and the maximum height of a road curb over which the wheelchair could successfully drive on a diagonal heading. The model predicted that the stability was more dependent on the configuration of the suspension system than on the dimensions and weight distribution of the wheelchair. Furthermore, for the situations and design alterations studied, predicted improvements in static stability were not correlated with improvements in dynamic stability", "keyphrases": ["3D computer model", "product development", "innovative design features", "suspension system configuration", "dynamic stability improvements", "average absolute error", "predicted position", "soft bodies modeling", "statically indeterminate system", "computing time", "design alterations effect", "diagonal heading", "weight distribution", "Rocket TM powered wheelchair", "4.8 cm"], "prmu": ["P", "P", "P", "R", "R", "P", "P", "R", "P", "P", "R", "P", "P", "P", "M"]} {"id": "1568", "title": "Natural language from artificial life", "abstract": "This article aims to show that linguistics, in particular the study of the lexico-syntactic aspects of language, provides fertile ground for artificial life modeling. A survey of the models that have been developed over the last decade and a half is presented to demonstrate that ALife techniques have a lot to offer an explanatory theory of language. It is argued that this is because much of the structure of language is determined by the interaction of three complex adaptive systems: learning, culture, and biological evolution. 
Computational simulation, informed by theoretical linguistics, is an appropriate response to the challenge of explaining real linguistic data in terms of the processes that underpin human language", "keyphrases": ["natural language", "linguistics", "lexico-syntactic aspects", "ALife", "adaptive systems", "learning", "culture", "biological evolution", "computational simulation", "artificial life"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} {"id": "178", "title": "A parallelized indexing method for large-scale case-based reasoning", "abstract": "Case-based reasoning (CBR) is a problem solving methodology commonly seen in artificial intelligence. It can correctly take advantage of the situations and methods in former cases to find out suitable solutions for new problems. CBR must accurately retrieve similar prior cases for getting a good performance. In the past, many researchers proposed useful technologies to handle this problem. However, the performance of retrieving similar cases may be greatly influenced by the number of cases. In this paper, the performance issue of large-scale CBR is discussed and a parallelized indexing architecture is then proposed for efficiently retrieving similar cases in large-scale CBR. Several algorithms for implementing the proposed architecture are also described. Some experiments are made and the results show the efficiency of the proposed method", "keyphrases": ["parallelized indexing method", "large-scale case-based reasoning", "problem solving methodology", "artificial intelligence", "bitwise indexing", "similar prior case retrieval", "performance", "experiments"], "prmu": ["P", "P", "P", "P", "M", "R", "P", "P"]} {"id": "185", "title": "Property testers for dense Constraint Satisfaction programs on finite domains", "abstract": "Many NP-hard languages can be \"decided\" in subexponential time if the definition of \"decide\" is relaxed only slightly. 
Rubinfeld and Sudan introduced the notion of property testers, probabilistic algorithms that can decide, with high probability, if a function has a certain property or if it is far from any function having this property. Goldreich, Goldwasser, and Ron constructed property testers with constant query complexity for dense instances of a large class of graph problems. Since many graph problems can be viewed as special cases of the Constraint Satisfaction Problem on Boolean domains, it is natural to try to construct property testers for more general cases of the Constraint Satisfaction Problem. In this paper, we give explicit constructions of property testers using a constant number of queries for dense instances of Constraint Satisfaction Problems where the constraints have constant arity and the variables assume values in some domain of finite size", "keyphrases": ["NP-hard languages", "property testers", "probabilistic algorithms", "constant query complexity", "constraint satisfaction", "dense instances", "randomized sampling", "subexponential time", "graph problems", "Constraint Satisfaction Problem"], "prmu": ["P", "P", "P", "P", "P", "P", "U", "P", "P", "P"]} {"id": "1907", "title": "Multiple comparison methods for means", "abstract": "Multiple comparison methods (MCMs) are used to investigate differences between pairs of population means or, more generally, between subsets of population means using sample data. Although several such methods are commonly available in statistical software packages, users may be poorly informed about the appropriate method(s) to use and/or the correct way to interpret the results. This paper classifies the MCMs and presents the important methods for each class. Both simulated and real data are used to compare the methods, and emphasis is placed on a correct application and interpretation. We include suggestions for choosing the best method. Mathematica programs developed by the authors are used to compare MCMs. 
By taking advantage of Mathematica's notebook structure, all interested students can use these programs to explore the subject more deeply", "keyphrases": ["multiple comparison procedures", "population means", "error rate", "single-step procedures", "step-down procedures", "sales management", "pack-age design"], "prmu": ["M", "P", "U", "U", "U", "U", "U"]} {"id": "1595", "title": "Convergence of finite element approximations and multilevel linearization for Ginzburg-Landau model of d-wave superconductors", "abstract": "In this paper, we consider the finite element approximations of a recently proposed Ginzburg-Landau-type model for d-wave superconductors. In contrast to the conventional Ginzburg-Landau model the scalar complex valued order-parameter is replaced by a multicomponent complex order-parameter and the free energy is modified according to the d-wave pairing symmetry. Convergence and optimal error estimates and some super-convergent estimates for the derivatives are derived. Furthermore, we propose a multilevel linearization procedure to solve the nonlinear systems. It is proved that the optimal error estimates and super-convergence for the derivatives are preserved by the multi-level linearization algorithm", "keyphrases": ["Ginzburg-Landau model", "d-wave", "superconductivity", "finite element method", "nonlinear systems", "error estimation", "two-grid method", "free energy", "multilevel linearization"], "prmu": ["P", "P", "U", "M", "P", "P", "U", "P", "P"]} {"id": "1669", "title": "Supply chain optimisation in the paper industry", "abstract": "We describe the formulation and development of a supply-chain optimisation model for Fletcher Challenge Paper Australasia (FCPA). 
This model, known as the paper industry value optimisation tool (PIVOT), is a large mixed integer program that finds an optimal allocation of supplier to mill, product to paper machine, and paper machine to customer, while at the same time modelling many of the supply chain details and nuances which are peculiar to FCPA. PIVOT has assisted FCPA in solving a number of strategic and tactical decision problems, and provided significant economic benefits for the company", "keyphrases": ["supply chain optimisation", "Fletcher Challenge Paper Australasia", "paper industry value optimisation tool", "PIVOT", "large mixed integer program", "optimal allocation", "strategic decision problems", "tactical decision problems", "economic benefits"], "prmu": ["P", "P", "P", "P", "P", "P", "R", "P", "P"]} -{"id": "1488", "title": "Social presence in telemedicine", "abstract": "We studied consultations between a doctor, emergency nurse practitioners (ENPs) and their patients in a minor accident and treatment service (MATS). In the conventional consultations, all three people were located at the main hospital. In the teleconsultations, the doctor was located in a hospital 6 km away from the MATS and used a videoconferencing link connected at 384 kbit/s. There were 30 patients in the conventional group and 30 in the telemedical group. The presenting problems were similar in the two groups. The mean duration of teleconsultations was 951 s and the mean duration of face-to-face consultations was 247 s. In doctor-nurse communication there was a higher rate of turn taking in teleconsultations than in face-to-face consultations; there were also more interruptions, more words and more `backchannels' (e.g. `mhm', `uh-huh') per teleconsultation. In doctor-patient communication there was a higher rate of turn taking, more words, more interruptions and more backchannels per teleconsultation. In patient-nurse communication there was. 
relatively little difference between the two modes of consulting the doctor. Telemedicine appeared to empower the patient to ask more questions of the doctor. It also seemed that the doctor took greater care in a teleconsultation to achieve coordination of beliefs with the patient than in a face-to-face consultation", "keyphrases": ["social presence", "telemedicine", "doctor", "emergency nurse practitioners", "patients", "minor accident and treatment service", "teleconsultations", "videoconferencing link", "face-to-face consultations", "doctor-nurse communication", "interruptions", "backchannels", "words", "turn taking", "patient-nurse communication", "belief coordination", "384 kbit/s", "951 s", "247 s"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "P", "P", "R"]} +{"id": "1488", "title": "Social presence in telemedicine", "abstract": "We studied consultations between a doctor, emergency nurse practitioners (ENPs) and their patients in a minor accident and treatment service (MATS). In the conventional consultations, all three people were located at the main hospital. In the teleconsultations, the doctor was located in a hospital 6 km away from the MATS and used a videoconferencing link connected at 384 kbit/s. There were 30 patients in the conventional group and 30 in the telemedical group. The presenting problems were similar in the two groups. The mean duration of teleconsultations was 951 s and the mean duration of face-to-face consultations was 247 s. In doctor-nurse communication there was a higher rate of turn taking in teleconsultations than in face-to-face consultations; there were also more interruptions, more words and more `backchannels' (e.g. `mhm', `uh-huh') per teleconsultation. In doctor-patient communication there was a higher rate of turn taking, more words, more interruptions and more backchannels per teleconsultation. In patient-nurse communication there was. 
relatively little difference between the two modes of consulting the doctor. Telemedicine appeared to empower the patient to ask more questions of the doctor. It also seemed that the doctor took greater care in a teleconsultation to achieve coordination of beliefs with the patient than in a face-to-face consultation", "keyphrases": ["social presence", "telemedicine", "doctor", "emergency nurse practitioners", "patients", "minor accident and treatment service", "teleconsultations", "videoconferencing link", "face-to-face consultations", "doctor-nurse communication", "interruptions", "backchannels", "words", "turn taking", "patient-nurse communication", "belief coordination", "384 kbit/s", "951 s", "247 s"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "P", "R"]} {"id": "1774", "title": "A work journal [librarianship]", "abstract": "Keeping a work journal can be useful in exploring one's thoughts and feelings about work challenges and work decisions. It can help bring about greater fulfillment in one's work life by facilitating self-renewal, change, the search for new meaning, and job satisfaction. One example of a work journal which I kept in 1998 is considered. 
It touches on several issues of potential interest to midlife career librarians including the challenge of technology, returning to work at midlife after raising a family, further education, professional writing, and job exchange", "keyphrases": ["work decisions", "work challenges", "job satisfaction", "self-renewal", "work journal", "change", "midlife career librarians", "technology", "further education", "professional writing", "job exchange"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} {"id": "1731", "title": "Hit the road, Jack", "abstract": "Going freelance offers the potential of higher earnings, variety and independence - but also removes the benefits of permanent employment and can mean long distance travel and periods out of work. The author looks at the benefits and drawbacks - and how to get started as an IT contractor", "keyphrases": ["IT contractor", "freelance working"], "prmu": ["P", "R"]} {"id": "1789", "title": "Dousing terrorist funding: mission possible? [banks]", "abstract": "The government is tightening its grip on terrorist money flows. But as the banking industry continues to expand its Patriot Act compliance activities, it is with the realization that a great deal of work remains to be done before the American financial system can become truly airtight. Identification instruments, especially drivers licenses, represent a significant weak spot", "keyphrases": ["banking", "Patriot Act", "terrorist funding", "identification"], "prmu": ["P", "P", "P", "P"]} {"id": "1475", "title": "Relation between glare and driving performance", "abstract": "The present study investigated the effects of discomfort glare on driving behavior. Participants (old and young; US and Europeans) were exposed to a simulated low- beam light source mounted on the hood of an instrumented vehicle. Participants drove at night in actual traffic along a track consisting of urban, rural, and highway stretches. 
The results show that the relatively low glare source caused a significant drop in detecting simulated pedestrians along the roadside and made participants drive significantly slower on dark and winding roads. Older participants showed the largest drop in pedestrian detection performance and reduced their driving speed the most. The results indicate that the de Boer rating scale, the most commonly used rating scale for discomfort glare, is practically useless as a predictor of driving performance. Furthermore, the maximum US headlamp intensity (1380 cd per headlamp) appears to be an acceptable upper limit", "keyphrases": ["glare", "driving performance", "discomfort glare", "simulated low-beam light source", "road traffic", "urban road", "rural road", "highway", "deBoer rating scale"], "prmu": ["P", "P", "P", "M", "R", "R", "R", "P", "M"]} {"id": "1608", "title": "A geometric process equivalent model for a multistate degenerative system", "abstract": "In this paper, a monotone process model for a one-component degenerative system with k+1 states (k failure states and one working state) is studied. We show that this model is equivalent to a geometric process (GP) model for a two-state one component system such that both systems have the same long-run average cost per unit time and the same optimal policy. 
Furthermore, an explicit expression for the determination of an optimal policy is derived", "keyphrases": ["multistate degenerative system", "geometric process equivalent model", "monotone process model", "one-component degenerative system", "failure states", "working state", "two-state one component system", "long-run average cost", "optimal policy", "replacement policy", "renewal reward process"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "M", "M"]} {"id": "1923", "title": "Predictive control of a high temperature-short time pasteurisation process", "abstract": "Modifications on the dynamic matrix control (DMC) algorithm are presented to deal with transfer functions with varying parameters in order to control a high temperature-short time pasteurisation process. To control processes with first order with pure time delay models whose parameters present an exogenous variable dependence, a new method of free response calculation, using multiple model information, is developed. Two methods, to cope with those nonlinear models that allow a generalised Hammerstein model description, are proposed. 
The proposed methods have been tested, both in simulation and in real cases, in comparison with PID and DMC classic controllers, showing important improvements on reference tracking and disturbance rejection", "keyphrases": ["high temperature-short time pasteurisation process", "predictive control", "dynamic matrix control algorithm", "transfer functions", "first order processes", "time delay models", "exogenous variable dependence", "free response calculation", "multiple model information", "nonlinear models", "generalised Hammerstein model description", "reference tracking", "disturbance rejection"], "prmu": ["P", "P", "R", "P", "R", "P", "P", "P", "P", "P", "P", "P", "P"]} -{"id": "1509", "title": "Mathematical modelling of the work of the system of wells in a layer with the exponential law of permeability variation and the mobile liquid interface", "abstract": "We construct and study a two-dimensional model of the work of the system of wells in a layer with the mobile boundary between liquids of various viscosity. We use a 'plunger' displacement model of liquids. The boundaries of the filtration region of these liquids are modelled by curves of the Lyapunov class. Unlike familiar work, we solve two-dimensonal problems in an inhomogeneous layer when the mobile boundary and the boundaries of the filtration region are modelled by curves of the Lyapunov class. 
We show the practical convergence of the numerical solution of the problems studied", "keyphrases": ["2D model", "work", "well system", "mathematical modelling", "exponential law", "permeability variation", "mobile liquid interface", "mobile boundary", "viscosity", "plunger displacement model", "filtration region boundaries", "Lyapunov class curves", "inhomogeneous layer", "convergence", "numerical solution"], "prmu": ["M", "P", "R", "P", "P", "P", "P", "P", "P", "M", "R", "R", "P", "P", "P"]} +{"id": "1509", "title": "Mathematical modelling of the work of the system of wells in a layer with the exponential law of permeability variation and the mobile liquid interface", "abstract": "We construct and study a two-dimensional model of the work of the system of wells in a layer with the mobile boundary between liquids of various viscosity. We use a 'plunger' displacement model of liquids. The boundaries of the filtration region of these liquids are modelled by curves of the Lyapunov class. Unlike familiar work, we solve two-dimensional problems in an inhomogeneous layer when the mobile boundary and the boundaries of the filtration region are modelled by curves of the Lyapunov class. We show the practical convergence of the numerical solution of the problems studied", "keyphrases": ["2D model", "work", "well system", "mathematical modelling", "exponential law", "permeability variation", "mobile liquid interface", "mobile boundary", "viscosity", "plunger displacement model", "filtration region boundaries", "Lyapunov class curves", "inhomogeneous layer", "convergence", "numerical solution"], "prmu": ["M", "P", "R", "P", "P", "P", "P", "P", "P", "R", "R", "R", "P", "P", "P"]} {"id": "1886", "title": "Non-asymptotic confidence ellipsoids for the least-squares estimate", "abstract": "We consider the finite sample properties of least-squares system identification, and derive non-asymptotic confidence ellipsoids for the estimate. 
The shape of the confidence ellipsoids is similar to the shape of the ellipsoids derived using asymptotic theory, but unlike asymptotic theory, they are valid for a finite number of data points. The probability that the estimate belongs to a certain ellipsoid has a natural dependence on the volume of the ellipsoid, the data generating mechanism, the model order and the number of data points available", "keyphrases": ["nonasymptotic confidence ellipsoids", "least-squares estimate", "finite sample properties", "least-squares system identification", "probability", "data generating mechanism", "model order", "data points"], "prmu": ["M", "P", "P", "P", "P", "P", "P", "P"]} {"id": "1750", "title": "A dynamic method for weighted linear least squares problems", "abstract": "A new method for solving the weighted linear least squares problems with full rank is proposed. Based on the theory of Liapunov's stability, the method associates a dynamic system with a weighted linear least squares problem, whose solution we are interested in and integrates the former numerically by an A-stable numerical method. The numerical tests suggest that the new method is more than comparative with current conventional techniques based on the normal equations", "keyphrases": ["dynamic method", "weighted linear least squares problems", "Lyapunov stability", "A-stable numerical method"], "prmu": ["P", "P", "M", "P"]} {"id": "1715", "title": "Information-processing and computing systems at thermal power stations in China", "abstract": "The development and commissioning of information-processing and computing systems (IPCSs) at four power units, each of 500 MW capacity at the thermal power stations Tszisyan' and Imin' in China, are considered. The functional structure and the characteristics of the functions of the IPCSs are presented as is information on the technology of development and experience in adjustments. 
Ways of using the experience gained in creating a comprehensive functional firmware system are shown", "keyphrases": ["China", "thermal power stations", "information-processing systems", "computing systems", "commissioning", "development", "functional structure", "functions characteristics", "firmware system", "500 MW"], "prmu": ["P", "P", "R", "P", "P", "P", "P", "R", "P", "P"]} @@ -470,18 +470,18 @@ {"id": "1695", "title": "Medical image computing at the Institute of Mathematics and Computer Science in Medicine, University Hospital Hamburg-Eppendorf", "abstract": "The author reviews the history of medical image computing at his institute, summarizes the achievements, sketches some of the difficulties encountered, and draws conclusions that might be of interest especially to people new to the field. The origin and history section provides a chronology of this work, emphasizing the milestones reached during the past three decades. In accordance with the author's group's focus on imaging, the paper is accompanied by many pictures, some of which, he thinks, are of historical value", "keyphrases": ["Institute of Mathematics and Computer Science in Medicine", "University Hospital Hamburg-Eppendorf", "medical image computing history", "historical value", "difficulties encountered", "medical diagnostic imaging", "work chronology"], "prmu": ["P", "P", "R", "P", "P", "M", "R"]} {"id": "1569", "title": "An interactive self-replicator implemented in hardware", "abstract": "Self-replicating loops presented to date are essentially worlds unto themselves, inaccessible to the observer once the replication process is launched. We present the design of an interactive self-replicating loop of arbitrary size, wherein the user can physically control the loop's replication and induce its destruction. 
After introducing the BioWall, a reconfigurable electronic wall for bio-inspired applications, we describe the design of our novel loop and delineate its hardware implementation in the wall", "keyphrases": ["interactive self-replicator", "interactive self-replicating loop", "BioWall", "reconfigurable electronic wall", "bio-inspired applications", "hardware implementation", "self-replication", "field programmable gate array", "cellular automata", "reconfigurable computing", "artificial life"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "U", "U", "M", "U"]} {"id": "179", "title": "Document-based workflow modeling: a case-based reasoning approach", "abstract": "A workflow model is useful for business process analysis. A well-built workflow can help a company streamline its internal processes by reducing overhead. The results of workflow modeling need to be managed as information assets in a systematic fashion. Reusing these results is likely to enhance the quality of the modeling. Therefore, this paper proposes a document-based workflow modeling mechanism, which employs a case-based reasoning (CBR) technique for the effective reuse of design outputs. A repository is proposed to support this CBR process. A real-life case is illustrated to demonstrate the usefulness of our approach", "keyphrases": ["document-based workflow modeling", "case-based reasoning", "business process analysis", "company", "information assets", "design output reuse"], "prmu": ["P", "P", "P", "P", "P", "R"]} -{"id": "184", "title": "On the expected value of the minimum assignment", "abstract": "The minimum k-assignment of an m*n matrix X is the minimum sum of k entries of X, no two of which belong to the same row or column. Coppersmith and Sorkin conjectured that if X is generated by choosing each entry independently from the exponential distribution with mean 1, then the expected value of its minimum k-assignment is given by an explicit formula, which has been proven only in a few cases. 
In this paper we describe our efforts to prove the Coppersmith-Sorkin conjecture by considering the more general situation where the entries x/sub ij/ of X are chosen independently from different distributions. In particular, we require that x/sub ij/ be chosen from the exponential distribution with mean 1/r/sub i/c/sub j/. We conjecture an explicit formula for the expected value of the minimum k-assignment of such X and give evidence for this formula", "keyphrases": ["minimum k-assignment", "m * n matrix", "exponential distribution", "rational function", "bipartite graph"], "prmu": ["P", "P", "P", "U", "U"]} +{"id": "184", "title": "On the expected value of the minimum assignment", "abstract": "The minimum k-assignment of an m*n matrix X is the minimum sum of k entries of X, no two of which belong to the same row or column. Coppersmith and Sorkin conjectured that if X is generated by choosing each entry independently from the exponential distribution with mean 1, then the expected value of its minimum k-assignment is given by an explicit formula, which has been proven only in a few cases. In this paper we describe our efforts to prove the Coppersmith-Sorkin conjecture by considering the more general situation where the entries x/sub ij/ of X are chosen independently from different distributions. In particular, we require that x/sub ij/ be chosen from the exponential distribution with mean 1/r/sub i/c/sub j/. 
We conjecture an explicit formula for the expected value of the minimum k-assignment of such X and give evidence for this formula", "keyphrases": ["minimum k-assignment", "m * n matrix", "exponential distribution", "rational function", "bipartite graph"], "prmu": ["P", "M", "P", "U", "U"]} {"id": "1906", "title": "Integrated process control using an in situ sensor for etch", "abstract": "The migration to tighter geometries and more complex process sequence integration schemes requires having the ability to compensate for upstream deviations from target specifications. Doing so ensures that-downstream process sequences operate on work-in-progress that is well within control. Because point-of-use visibility of work-in-progress quality has become of paramount concern in the industry's drive to reduce scrap and improve yield, controlling trench depth has assumed greater importance. An integrated, interferometric based, rate monitor for etch-to-depth and spacer etch applications has been developed for controlling this parameter. 
This article demonstrates that the integrated rate monitor, using polarization and digital signal processing, enhances control of etch-to-depth processes and can also be implemented as a predictive endpoint in a wafer manufacturing environment for dual damascene trench etch and spacer etch applications", "keyphrases": ["interferometric in situ etch sensor", "integrated process control", "polarization", "digital signal processing", "wafer manufacturing environment", "process predictive endpoint", "dual damascene trench etch", "spacer etch applications", "IC geometry", "complex process sequence integration schemes", "upstream deviation compensation", "target specifications", "downstream process sequences", "point-of-use visibility", "work-in-progress quality", "scrap reduction", "yield improvement", "trench depth control", "interferometry", "integrated etch rate monitor"], "prmu": ["R", "P", "P", "P", "P", "R", "P", "P", "M", "P", "R", "P", "M", "P", "P", "M", "R", "R", "U", "R"]} {"id": "1594", "title": "Training multilayer perceptrons via minimization of sum of ridge functions", "abstract": "Motivated by the problem of training multilayer perceptrons in neural networks, we consider the problem of minimizing E(x)= Sigma /sub i=1//sup n/ f/sub i/( xi /sub i/.x), where xi /sub i/ in R/sup S/, 1or= 0} are investigated, where || . ||/sub p/ is the usual vector norm in C/sup n/ resp. R/sup n/, for p epsilon [1, o infinity ]. Moreover, formulae for the first three right derivatives D/sub +//sup k/||s(t)||/sub p/, k = 1, 2,3 are determined. These formulae are applied to vibration problems by computing the best upper bounds on ||s(t)||/sub p/ in certain classes of bounds. These results cannot be obtained by the methods used so far. 
The systematic use of the differential calculus for vector norms, as done here for the first time, could lead to major advances also in other branches of mathematics and other sciences", "keyphrases": ["differential calculus", "vector functions", "mapping", "vibration problems", "vector norms"], "prmu": ["P", "P", "P", "P", "P"]} {"id": "1511", "title": "Efficient algorithms for stiff elliptic problems with large parameters", "abstract": "We consider a finite element approximation and iteration algorithms for solving stiff elliptic boundary value problems with large parameters in front of a higher derivative. The convergence rate of the algorithms is independent of the spread in coefficients and a discretization parameter", "keyphrases": ["finite element approximation", "iteration algorithms", "stiff elliptic boundary value problems", "large parameters", "higher derivative", "efficient algorithms", "convergence rate"], "prmu": ["P", "P", "P", "P", "P", "P", "P"]} {"id": "1863", "title": "Information systems project failure: a comparative study of two countries", "abstract": "Many organizations, regardless of size, engage in at least one, and often many information system projects each year. Many of these projects consume massive amounts of resources, and may cost as little as a few thousand dollars to ten, and even hundreds of millions of dollars. Needless to say, the investment of time and resources into these ventures are of significant concern to chief information officers (CIOs), executives staff members, project managers, and others in leadership positions. This paper describes the results of a survey performed between Australia and the United States regarding factors leading to IS project failure. 
The findings suggest that, among other things, end user involvement and executive management leadership are key indicators influencing IS project failure", "keyphrases": ["information systems project failure", "Australia", "United States", "end user involvement", "executive management leadership"], "prmu": ["P", "P", "P", "P", "P"]} {"id": "1826", "title": "Modeling shape and topology of low-resolution density maps of biological macromolecules", "abstract": "We develop an efficient way of representing the geometry and topology of volumetric datasets of biological structures from medium to low resolution, aiming at storing and querying them in a database framework. We make use of a new vector quantization algorithm to select the points within the macromolecule that best approximate the probability density function of the original volume data. Connectivity among points is obtained with the use of the alpha shapes theory. This novel data representation has a number of interesting characteristics, such as (1) it allows us to automatically segment and quantify a number of important structural features from low-resolution maps, such as cavities and channels, opening the possibility of querying large collections of maps on the basis of these quantitative structural features; (2) it provides a compact representation in terms of size; (3) it contains a subset of three-dimensional points that optimally quantify the densities of medium resolution data; and (4) a general model of the geometry and topology of the macromolecule (as opposite to a spatially unrelated bunch of voxels) is easily obtained by the use of the alpha shapes theory", "keyphrases": ["geometry", "topology", "volumetric datasets", "biological structures", "database framework", "vector quantization algorithm", "low-resolution density maps", "biological macromolecules", "modeling", "probability density function", "data representation", "structural features", "cavities", "channels", "connectivity", "compact 
representation", "three-dimensional points", "medium resolution data", "general model", "original volume data", "alpha shapes theory"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]} -{"id": "1748", "title": "On a general constitutive description for the inelastic and failure behavior of fibrous laminates. I. Lamina theory", "abstract": "It is well known that a structural design with isotropic materials can only be accomplished based on a stress failure criterion. This is, however, generally not true with laminated composites. Only when the laminate is subjected to an in-plane load, can the ultimate failure of the laminate correspond to its last-ply failure, and hence a stress failure criterion may be sufficient to detect the maximum load that can be sustained by the laminate. Even in such a case, the load shared by each lamina in the laminate cannot be correctly determined if the lamina instantaneous stiffness matrix is inaccurately provided, since the lamina is always statically indeterminate in the laminate. If, however, the laminate is subjected to a lateral load, its ultimate failure occurs before last-ply failure and use of the stress failure criterion is no longer sufficient; an additional critical deflection or curvature condition must also be employed. This necessitates development of an efficient constitutive relationship for laminated composites in order that the laminate strains/deflections up to ultimate failure can be accurately calculated. A general constitutive description for the thermomechanical response of a fibrous laminate up to ultimate failure with applications to various fibrous laminates is presented in the two papers. The constitutive relationship is obtained by combining classical lamination theory with a recently developed bridging micromechanics model, through a layer-by-layer analysis. 
This paper focuses on lamina analysis", "keyphrases": ["general constitutive description", "inelastic behavior", "failure behavior", "fibrous laminates", "lamina theory", "structural design", "isotropic materials", "stress failure criterion", "in-plane load", "instantaneous stiffness matrix", "lateral load", "last-ply failure", "critical deflection condition", "critical curvature condition", "composites", "laminate strains", "laminate deflections", "thermomechanical response", "layer-by-layer analysis", "micromechanics model", "multidirectional tape laminae", "woven fabric composites", "braided fabric composites", "knitted fabric reinforced composites", "elastoplasticity", "elastic-viscoplasticity"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "P", "M", "R", "P", "P", "P", "M", "M", "M", "M", "U", "U"]} +{"id": "1748", "title": "On a general constitutive description for the inelastic and failure behavior of fibrous laminates. I. Lamina theory", "abstract": "It is well known that a structural design with isotropic materials can only be accomplished based on a stress failure criterion. This is, however, generally not true with laminated composites. Only when the laminate is subjected to an in-plane load, can the ultimate failure of the laminate correspond to its last-ply failure, and hence a stress failure criterion may be sufficient to detect the maximum load that can be sustained by the laminate. Even in such a case, the load shared by each lamina in the laminate cannot be correctly determined if the lamina instantaneous stiffness matrix is inaccurately provided, since the lamina is always statically indeterminate in the laminate. If, however, the laminate is subjected to a lateral load, its ultimate failure occurs before last-ply failure and use of the stress failure criterion is no longer sufficient; an additional critical deflection or curvature condition must also be employed. 
This necessitates development of an efficient constitutive relationship for laminated composites in order that the laminate strains/deflections up to ultimate failure can be accurately calculated. A general constitutive description for the thermomechanical response of a fibrous laminate up to ultimate failure with applications to various fibrous laminates is presented in the two papers. The constitutive relationship is obtained by combining classical lamination theory with a recently developed bridging micromechanics model, through a layer-by-layer analysis. This paper focuses on lamina analysis", "keyphrases": ["general constitutive description", "inelastic behavior", "failure behavior", "fibrous laminates", "lamina theory", "structural design", "isotropic materials", "stress failure criterion", "in-plane load", "instantaneous stiffness matrix", "lateral load", "last-ply failure", "critical deflection condition", "critical curvature condition", "composites", "laminate strains", "laminate deflections", "thermomechanical response", "layer-by-layer analysis", "micromechanics model", "multidirectional tape laminae", "woven fabric composites", "braided fabric composites", "knitted fabric reinforced composites", "elastoplasticity", "elastic-viscoplasticity"], "prmu": ["P", "R", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "R", "R", "P", "P", "R", "P", "P", "P", "M", "M", "M", "M", "U", "U"]} {"id": "1570", "title": "Self-reproduction in three-dimensional reversible cellular space", "abstract": "Due to inevitable power dissipation, it is said that nano-scaled computing devices should perform their computing processes in a reversible manner. This will be a large problem in constructing three-dimensional nano-scaled functional objects. Reversible cellular automata (RCA) are used for modeling physical phenomena such as power dissipation, by studying the dissipation of garbage signals. 
We construct a three-dimensional self-inspective self-reproducing reversible cellular automaton by extending the two-dimensional version SR/sub 8/. It can self-reproduce various patterns in three-dimensional reversible cellular space without dissipating garbage signals", "keyphrases": ["self-reproduction", "nano-scaled computing devices", "power dissipation", "3D self-inspective self-reproducing cellular automata", "reversible cellular automata", "artificial life", "three-dimensional reversible cellular space"], "prmu": ["P", "P", "P", "M", "P", "U", "P"]} {"id": "1535", "title": "Hot controllers", "abstract": "Over the last few years, the semiconductor industry has put much emphasis on ways to improve the accuracy of thermal mass flow controllers (TMFCs). Although issues involving TMFC mounting orientation and pressure effects have received much attention, little has been done to address the effect of changes in ambient temperature or process gas temperature. Scientists and engineers at Qualiflow have succeeded to solve the problem using a temperature correction algorithm for digital TMFCs. 
Using an in situ environmental temperature compensation technique, we calculated correction factors for the temperature effect and obtained satisfactory results with both the traditional sensor and the new, improved thin-film sensors", "keyphrases": ["semiconductor manufacturing", "process gas flow", "thermal mass flow controller", "temperature correction algorithm", "in situ environmental temperature compensation"], "prmu": ["M", "R", "P", "P", "P"]} {"id": "160", "title": "Taming the paper tiger [paperwork organization]", "abstract": "Generally acknowledged as a critical problem for many information professionals, the massive flow of documents, paper trails, and information needs efficient and dependable approaches for processing and storing and finding items and information", "keyphrases": ["paperwork organization", "information professionals", "information processing", "information storage", "information retrieval"], "prmu": ["P", "P", "R", "M", "M"]} @@ -490,7 +490,7 @@ {"id": "1729", "title": "Maintaining e-commerce", "abstract": "E-commerce over the Web has created a relatively new type of information system. So it is hardly surprising that little attention has been given to the maintenance of such systems-and even less to attempting to develop them with future maintenance in mind. But there are various ways e-commerce systems can be developed to reduce future maintenance", "keyphrases": ["e-commerce systems maintenance", "Web systems"], "prmu": ["R", "R"]} {"id": "1847", "title": "Conceptual modeling and specification generation for B2B business processes based on ebXML", "abstract": "In order to support dynamic setup of business processes among independent organizations, a formal standard schema for describing the business processes is basically required. The ebXML framework provides such a specification schema called BPSS (Business Process Specification Schema) which is available in two standalone representations: a UML version, and an XML version. 
The former, however, is not intended for the direct creation of business process specifications, but for defining specification elements and their relationships required for creating an ebXML-compliant business process specification. For this reason, it is very important to support conceptual modeling that is well organized and directly matched with major modeling concepts. This paper deals with how to represent and manage B2B business processes using UML-compliant diagrams. The major challenge is to organize UML diagrams in a natural way that is well suited to the business process meta-model and then to transform the diagrams into an XML version. This paper demonstrates the usefulness of conceptually modeling business processes by prototyping a business process editor tool called ebDesigner", "keyphrases": ["B2B business processes", "ebXML", "conceptual modeling", "specification generation", "formal standard schema", "Business Process Specification Schema", "UML-compliant diagrams", "meta model", "ebDesigner", "business process editor"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "M", "P", "P"]} {"id": "1802", "title": "Novel TCP congestion control scheme and its performance evaluation", "abstract": "A novel self-tuning proportional and derivative (ST-PD) control based TCP congestion control scheme is proposed. The new scheme approaches the congestion control problem from a control-theoretical perspective and overcomes several Important limitations associated with existing TCP congestion control schemes, which are heuristic based. In the proposed scheme, a PD controller is employed to keep the buffer occupancy of the bottleneck node on the connection path at an ideal operating level, and it adjusts the TCP window accordingly. The control gains of the PD controller are tuned online by a fuzzy logic controller based on the perceived bandwidth-delay product of the TCP connection. This scheme gives ST-PD TCP several advantages over current TCP implementations. 
These include rapid response to bandwidth variations, insensitivity to buffer sizes, and significant improvement of TCP throughput over lossy links by decoupling congestion control and error control functions of TCP", "keyphrases": ["TCP congestion control scheme", "performance evaluation", "self-tuning proportional-derivative control", "control-theoretical perspective", "PD controller", "buffer occupancy", "bottleneck node", "connection path", "fuzzy logic controller", "bandwidth-delay product", "lossy links"], "prmu": ["P", "P", "M", "P", "P", "P", "P", "P", "P", "P", "P"]} -{"id": "1490", "title": "Client satisfaction in a feasibility study comparing face-to-face interviews with telepsychiatry", "abstract": "We carried out a pilot study comparing satisfaction levels between psychiatric patients seen face to face (FTF) and those seen via videoconference. Patients who consented were randomly assigned to one of two groups. One group received services in person (FTF from the visiting psychiatrist) while the other was seen using videoconferencing at 128 kbit/s. One psychiatrist provided all the FTF and videoconferencing assessment and follow-up visits. A total of 24 subjects were recruited. Three of the subjects (13%) did not attend their appointments and two subjects in each group were lost to follow-up. Thus there were nine in the FTF group and eight in the videoconferencing group. The two groups were similar in most respects. Patient satisfaction with the services was assessed using the Client Satisfaction Questionnaire (CSQ-8), completed four months after the initial consultation. The mean scores were 25.3 in the FTF group and 21.6 in the videoconferencing group. Although there was a trend in favour of the FTF service, the difference was not significant. Patient satisfaction is only one component of evaluation. 
The efficacy of telepsychiatry must also be measured relative to that of conventional, FTF care before policy makers can decide how extensively telepsychiatry should be implemented", "keyphrases": ["client satisfaction", "face-to-face interviews", "telepsychiatry", "psychiatric patient satisfaction", "human factors", "videoconference", "Client Satisfaction Questionnaire", "telemedicine", "128 kbit/s"], "prmu": ["P", "P", "P", "R", "U", "P", "P", "U", "P"]} +{"id": "1490", "title": "Client satisfaction in a feasibility study comparing face-to-face interviews with telepsychiatry", "abstract": "We carried out a pilot study comparing satisfaction levels between psychiatric patients seen face to face (FTF) and those seen via videoconference. Patients who consented were randomly assigned to one of two groups. One group received services in person (FTF from the visiting psychiatrist) while the other was seen using videoconferencing at 128 kbit/s. One psychiatrist provided all the FTF and videoconferencing assessment and follow-up visits. A total of 24 subjects were recruited. Three of the subjects (13%) did not attend their appointments and two subjects in each group were lost to follow-up. Thus there were nine in the FTF group and eight in the videoconferencing group. The two groups were similar in most respects. Patient satisfaction with the services was assessed using the Client Satisfaction Questionnaire (CSQ-8), completed four months after the initial consultation. The mean scores were 25.3 in the FTF group and 21.6 in the videoconferencing group. Although there was a trend in favour of the FTF service, the difference was not significant. Patient satisfaction is only one component of evaluation. 
The efficacy of telepsychiatry must also be measured relative to that of conventional, FTF care before policy makers can decide how extensively telepsychiatry should be implemented", "keyphrases": ["client satisfaction", "face-to-face interviews", "telepsychiatry", "psychiatric patient satisfaction", "human factors", "videoconference", "Client Satisfaction Questionnaire", "telemedicine", "128 kbit/s"], "prmu": ["P", "P", "P", "R", "U", "P", "P", "U", "M"]} {"id": "1791", "title": "The pedagogy of on-line learning: a report from the University of the Highlands and Islands Millennium Institute", "abstract": "Authoritative sources concerned with computer-aided learning, resource-based learning and on-line learning and teaching are generally agreed that, in addition to subject matter expertise and technical support, the quality of the learning materials and the learning experiences of students are critically dependent on the application of pedagogically sound theories of learning and teaching and principles of course design. The University of the Highlands and Islands Project (UHIMI) is developing \"on-line learning\" on a large scale. These developments have been accompanied by a comprehensive programme of staff development. A major emphasis of the programme is concerned with ensuring that course developers and tutors are pedagogically aware. 
This paper reviews (i) what is meant by \"on-line learning\" in the UHIMI context (ii) the theories of learning and teaching and principles of course design that inform the staff development programme and (iii) a review of progress to date", "keyphrases": ["online learning", "pedagogy", "computer-aided learning", "resource-based learning", "teaching", "technical support", "educational course design", "distance education", "Internet", "University of the Highlands and Islands Project", "staff development"], "prmu": ["M", "P", "P", "P", "P", "P", "M", "U", "U", "P", "P"]} {"id": "1887", "title": "Doubly invariant equilibria of linear discrete-time games", "abstract": "The notion of doubly invariant (DI) equilibrium is introduced. The concept extends controlled and robustly controlled invariance notions to the context of two-person dynamic games. Each player tries to keep the state in a region of state space independently of the actions of the rival player. The paper gives existence conditions, criteria and algorithms for the determination of DI equilibria of linear dynamic games in discrete time. Two examples illustrate the results. The first one is in the area of fault-tolerant controller synthesis. The second is an application to macroeconomics", "keyphrases": ["doubly invariant equilibria", "linear discrete-time games", "robustly controlled invariance", "two-person dynamic games", "state space", "existence conditions", "fault-tolerant controller synthesis", "macroeconomics"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P"]} {"id": "1714", "title": "Hordes: a multicast based protocol for anonymity", "abstract": "With widespread acceptance of the Internet as a public medium for communication and information retrieval, there has been rising concern that the personal privacy of users can be eroded by cooperating network entities. A technical solution to maintaining privacy is to provide anonymity. 
We present a protocol for initiator anonymity called Hordes, which uses forwarding mechanisms similar to those used in previous protocols for sending data, but is the first protocol to make use of multicast routing to anonymously receive data. We show this results in shorter transmission latencies and requires less work of the protocol participants, in terms of the messages processed. We also present a comparison of the security and anonymity of Hordes with previous protocols, using the first quantitative definition of anonymity and unlinkability. Our analysis shows that Hordes provides anonymity in a degree similar to that of Crowds and Onion Routing, but also that Hordes has numerous performance advantages", "keyphrases": ["Hordes", "protocol", "Internet", "personal privacy", "cooperating network entities", "initiator anonymity", "forwarding mechanisms", "multicast routing", "transmission latencies", "unlinkability", "Crowds", "Onion Routing", "performance"], "prmu": ["P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P", "P"]}