Abstract: The joint cumulative distribution function for order statistics arising from several different populations is given in terms of the distribution function of the populations. The computational cost of the formula in the case of two populations is still exponential in the worst case, but it is a dramatic improvement compared to the general formula by Bapat and Beg. In the case when only the joint distribution function of a subset of the order statistics of fixed size is needed, the complexity is polynomial, for the case of two populations. |
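The contrast drawn in this abstract, exponential cost for the full joint distribution versus polynomial cost for small subsets, can be illustrated on the simplest subset: the marginal CDF of a single order statistic from independent but non-identically distributed samples. The sketch below is not the paper's formula; it is the standard Poisson-binomial dynamic program, with all names hypothetical:

```python
def cdf_order_stat(Fs, r, t):
    """P(X_(r) <= t) for independent, non-identically distributed X_i,
    via a Poisson-binomial dynamic program over the probabilities F_i(t)."""
    probs = [F(t) for F in Fs]
    dp = [1.0] + [0.0] * len(probs)  # dp[k] = P(exactly k of the variables seen so far are <= t)
    for p in probs:
        for k in range(len(dp) - 1, 0, -1):
            dp[k] = dp[k] * (1 - p) + dp[k - 1] * p
        dp[0] *= 1 - p
    return sum(dp[r:])  # at least r of the variables fall at or below t
```

For n samples this runs in O(n^2) time, in contrast to naive enumeration over all 2^n subsets of samples.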
Title: Efficient independent component analysis |
Abstract: Independent component analysis (ICA) has been widely used for blind source separation in many fields such as brain imaging analysis, signal processing and telecommunication. Many statistical techniques based on M-estimates have been proposed for estimating the mixing matrix. Recently, several nonparametric methods have been developed, but in-depth analysis of asymptotic efficiency has not been available. We analyze ICA using semiparametric theories and propose a straightforward estimate based on the efficient score function by using B-spline approximations. The estimate is asymptotically efficient under moderate conditions and exhibits better performance than standard ICA methods in a variety of simulations. |
Title: Truecluster matching |
Abstract: Cluster matching by permuting cluster labels is important in many clustering contexts such as cluster validation and cluster ensemble techniques. The classic approach is to minimize the Euclidean distance between two cluster solutions, which induces inappropriate stability in certain settings. Therefore, we present the truematch algorithm, which introduces two improvements best explained in the crisp case. First, instead of maximizing the trace of the cluster crosstable, we propose to maximize a chi-square transformation of this crosstable. Thus, the trace will not be dominated by the cells with the largest counts but by the cells with the most non-random observations, taking the marginals into account. Second, we suggest a probabilistic component in order to break ties and to make the matching algorithm truly random on random data. The truematch algorithm is designed as a building block of the truecluster framework and scales in polynomial time. First simulation results confirm that the truematch algorithm gives more consistent truecluster results for unequal cluster sizes. Free R software is available. |
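The chi-square idea can be sketched as follows. This is not the authors' truematch implementation (it omits the probabilistic tie-breaking component and brute-forces the permutation search); it is a minimal illustration, with hypothetical function names, of matching labels by maximizing signed chi-square cell contributions instead of raw counts:

```python
import numpy as np
from itertools import permutations

def crosstab(a, b, k):
    """k x k contingency table of two cluster label vectors."""
    t = np.zeros((k, k))
    for i, j in zip(a, b):
        t[i, j] += 1
    return t

def chi2_scores(t):
    """Signed cell-wise chi-square contributions (observed vs. expected under independence)."""
    n = t.sum()
    exp = np.outer(t.sum(axis=1), t.sum(axis=0)) / n
    safe = np.where(exp > 0, exp, 1.0)
    return np.sign(t - exp) * (t - exp) ** 2 / safe

def best_match(a, b, k):
    """Label permutation for b maximizing the trace of the transformed crosstable.
    Brute force over k! permutations -- for illustration only."""
    t = chi2_scores(crosstab(a, b, k))
    return max(permutations(range(k)),
               key=lambda p: sum(t[i, p[i]] for i in range(k)))
```

Because expected counts enter the score, a large cell that is merely consistent with the marginals no longer dominates the matching.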
Title: Network tomography based on 1-D projections |
Abstract: Network tomography has been regarded as one of the most promising methodologies for performance evaluation and diagnosis of the massive and decentralized Internet. This paper proposes a new estimation approach for solving a class of inverse problems in network tomography, based on marginal distributions of a sequence of one-dimensional linear projections of the observed data. We give a general identifiability result for the proposed method and study the design of these one-dimensional projections in terms of statistical efficiency. We show that for a simple Gaussian tomography model, there is an optimal set of one-dimensional projections such that the estimator obtained from these projections is asymptotically as efficient as the maximum likelihood estimator based on the joint distribution of the observed data. For practical applications, we carry out simulation studies of the proposed method for two instances of network tomography. The first is traffic demand tomography using a Gaussian Origin-Destination traffic model with a power relation between its mean and variance, and the second is network delay tomography where the link delays are to be estimated from the end-to-end path delays. We compare estimators obtained from our method with those obtained using the joint distribution and other lower-dimensional projections, and show that in both cases the proposed method yields satisfactory results. |
Title: Mixed membership stochastic blockmodels |
Abstract: Observations consisting of measurements on relationships for pairs of objects arise in many settings, such as protein interaction and gene regulatory networks, collections of author-recipient email, and social networks. Analyzing such data with probabilistic models can be delicate because the simple exchangeability assumptions underlying many boilerplate models no longer hold. In this paper, we describe a latent variable model of such data called the mixed membership stochastic blockmodel. This model extends blockmodels for relational data to ones which capture mixed membership latent relational structure, thus providing an object-specific low-dimensional representation. We develop a general variational inference algorithm for fast approximate posterior inference. We explore applications to social and protein interaction networks. |
Title: Loop corrections for message passing algorithms in continuous variable models |
Abstract: In this paper we derive the equations for Loop Corrected Belief Propagation on a continuous variable Gaussian model. Using the exactness of the averages for belief propagation for Gaussian models, a different way of obtaining the covariances is found, based on Belief Propagation on cavity graphs. We discuss the relation of this loop correction algorithm to Expectation Propagation algorithms for the case in which the model is no longer Gaussian, but slightly perturbed by nonlinear terms. |
Title: Modeling Epidemic Spread in Synthetic Populations - Virtual Plagues in Massively Multiplayer Online Games |
Abstract: A virtual plague is a process in which a behavior-affecting property spreads among characters in a Massively Multiplayer Online Game (MMOG). The MMOG individuals constitute a synthetic population, and the game can be seen as a form of interactive executable model for studying disease spread, albeit of a very special kind. To a game developer maintaining an MMOG, recognizing, monitoring, and ultimately controlling a virtual plague is important, regardless of how it was initiated. The prospect of using tools, methods and theory from the field of epidemiology to do this seems natural and appealing. We will address the feasibility of such a prospect, first by considering some basic measures used in epidemiology, then by pointing out the differences between real world epidemics and virtual plagues. We also suggest directions for MMOG developer control through epidemiological modeling. Our aim is understanding the properties of virtual plagues, rather than trying to eliminate them or mitigate their effects, as would be the case for a real infectious disease. |
Title: Variable Selection Incorporating Prior Constraint Information into Lasso |
Abstract: We propose a variable selection procedure incorporating prior constraint information into the lasso. The proposed procedure combines the sample and prior information, and selects significant variables for responses in a narrower region where the true parameters lie. This increases the efficiency of choosing the true model correctly. The proposed procedure can be executed by many constrained quadratic programming methods, and the initial estimator can be found by least squares or Monte Carlo methods. The proposed procedure also enjoys good theoretical properties. Moreover, it is not limited to linear models: it can also be used for generalized linear models (GLMs), Cox models, quantile regression models and many others with the help of Wang and Leng's (2007) LSA, which recasts these models as approximations of linear models. The idea of combining sample and prior constraint information can also be used for other modified lasso procedures. Some examples illustrate the idea of incorporating prior constraint information in variable selection procedures. |
Title: Local Area Damage Detection in Composite Structures Using Piezoelectric Transducers |
Abstract: An integrated and automated smart structures approach for structural health monitoring is presented, utilizing an array of piezoelectric transducers attached to or embedded within the structure for both actuation and sensing. The system actively interrogates the structure via broadband excitation of multiple actuators across a desired frequency range. The structure's vibration signature is then characterized by computing the transfer functions between each actuator/sensor pair, and compared to the baseline signature. Experimental results applying the system to local area damage detection in a MD Explorer rotorcraft composite flexbeam are presented. |
Title: Recursive n-gram hashing is pairwise independent, at best |
Abstract: Many applications use sequences of n consecutive symbols (n-grams). Hashing these n-grams can be a performance bottleneck. For more speed, recursive hash families compute hash values by updating previous values. We prove that recursive hash families cannot be more than pairwise independent. While hashing by irreducible polynomials is pairwise independent, our implementations either run in time O(n) or use an exponential amount of memory. As a more scalable alternative, we make hashing by cyclic polynomials pairwise independent by ignoring n-1 bits. Experimentally, we show that hashing by cyclic polynomials is twice as fast as hashing by irreducible polynomials. We also show that randomized Karp-Rabin hash families are not pairwise independent. |
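A recursive (rolling) hash of the Karp-Rabin family illustrates the kind of update the abstract discusses. The base and modulus below are illustrative choices, not the paper's parameters:

```python
# Illustrative parameters: base B and prime modulus P.
B, P = 256, (1 << 31) - 1

def hash_ngram(seq):
    """Direct O(n) Karp-Rabin hash of a full n-gram."""
    h = 0
    for s in seq:
        h = (h * B + s) % P
    return h

def roll(h, out_sym, in_sym, n):
    """Recursive O(1) update: drop the oldest symbol, append a new one."""
    h = (h - out_sym * pow(B, n - 1, P)) % P
    return (h * B + in_sym) % P
```

Each `roll` call replaces an O(n) recomputation with O(1) work, which is the performance motivation; the paper's point is that no such recursive family can be more than pairwise independent.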
Title: Modeling Computations in a Semantic Network |
Abstract: Semantic network research has seen a resurgence from its early history in the cognitive sciences with the inception of the Semantic Web initiative. The Semantic Web effort has brought forth an array of technologies that support the encoding, storage, and querying of the semantic network data structure at the world stage. Currently, the popular conception of the Semantic Web is that of a data modeling medium where real and conceptual entities are related in semantically meaningful ways. However, new models have emerged that explicitly encode procedural information within the semantic network substrate. With these new technologies, the Semantic Web has evolved from a data modeling medium to a computational medium. This article provides a classification of existing computational modeling efforts and the requirements of supporting technologies that will aid in the further growth of this burgeoning domain. |
Title: The M-estimator in a multi-phase random nonlinear model |
Abstract: This paper considers M-estimation of a nonlinear regression model with multiple change-points occurring at unknown times. The multi-phase random design regression model, discontinuous at each change-point, has an arbitrary error $\epsilon$. In the case when the number of jumps is known, the M-estimators of the break locations and of the regression parameters are studied. These estimators are consistent, and the distribution of the regression parameter estimators is Gaussian. The estimator of each change-point converges, at the rate $n^{-1}$, to the smallest minimizer of an independent compound Poisson process. The results are valid for a large class of error distributions. |
Title: Automatic Detection of Pulmonary Embolism using Computational Intelligence |
Abstract: This article describes the implementation of a system designed to automatically detect the presence of pulmonary embolism in lung scans. These images are first segmented, then aligned, and features are extracted using PCA. The neural network was trained using the Hybrid Monte Carlo method, resulting in a committee of 250 neural networks; good results are obtained. |
Title: Challenges and Opportunities of Evolutionary Robotics |
Abstract: Robotic hardware designs are becoming more complex as the variety and number of on-board sensors increase and as greater computational power is provided in ever-smaller packages on-board robots. These advances in hardware, however, do not automatically translate into better software for controlling complex robots. Evolutionary techniques hold the potential to solve many difficult problems in robotics which defy simple conventional approaches, but present many challenges as well. Numerous disciplines including artificial life, cognitive science and neural networks, rule-based systems, behavior-based control, genetic algorithms and other forms of evolutionary computation have contributed to shaping the current state of evolutionary robotics. This paper provides an overview of developments in the emerging field of evolutionary robotics, and discusses some of the opportunities and challenges which currently face practitioners in the field. |
Title: Virtual Sensor Based Fault Detection and Classification on a Plasma Etch Reactor |
Abstract: The SEMATECH sponsored J-88-E project teaming Texas Instruments with NeuroDyne (et al.) focused on Fault Detection and Classification (FDC) on a Lam 9600 aluminum plasma etch reactor, used in the process of semiconductor fabrication. Fault classification was accomplished by implementing a series of virtual sensor models which used data from real sensors (Lam Station sensors, Optical Emission Spectroscopy, and RF Monitoring) to predict recipe setpoints and wafer state characteristics. Fault detection and classification were performed by comparing predicted recipe and wafer state values with expected values. Models utilized include linear PLS, Polynomial PLS, and Neural Network PLS. Prediction of recipe setpoints based upon sensor data provides a capability for cross-checking that the machine is maintaining the desired setpoints. Wafer state characteristics such as Line Width Reduction and Remaining Oxide were estimated on-line using these same process sensors (Lam, OES, RFM). Wafer-to-wafer measurement of these characteristics in a production setting (where typically this information may be only sparsely available, if at all, after batch processing runs with numerous wafers have been completed) would provide important information to the operator that the process is or is not producing wafers within acceptable bounds of product quality. Production yield is increased, and correspondingly per unit cost is reduced, by providing the operator with the opportunity to adjust the process or machine before etching more wafers. |
Title: Compressed Regression |
Abstract: Recent research has studied the role of sparsity in high dimensional regression and signal reconstruction, establishing theoretical limits for recovering sparse models from sparse data. This line of work shows that $\ell_1$-regularized least squares regression can accurately estimate a sparse linear model from $n$ noisy examples in $p$ dimensions, even if $p$ is much larger than $n$. In this paper we study a variant of this problem where the original $n$ input variables are compressed by a random linear transformation to $m \ll n$ examples in $p$ dimensions, and establish conditions under which a sparse linear model can be successfully recovered from the compressed data. A primary motivation for this compression procedure is to anonymize the data and preserve privacy by revealing little information about the original data. We characterize the number of random projections that are required for $\ell_1$-regularized compressed regression to identify the nonzero coefficients in the true model with probability approaching one, a property called ``sparsistence.'' In addition, we show that $\ell_1$-regularized compressed regression asymptotically predicts as well as an oracle linear model, a property called ``persistence.'' Finally, we characterize the privacy properties of the compression procedure in information-theoretic terms, establishing upper bounds on the mutual information between the compressed and uncompressed data that decay to zero. |
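A minimal sketch of the compression step, assuming a Gaussian projection matrix and solving the $\ell_1$ problem with plain iterative soft-thresholding (ISTA); the dimensions, seed, and regularization weight are hypothetical, not the paper's:

```python
import numpy as np

def ista_lasso(X, y, lam, steps=2000):
    """l1-regularized least squares via iterative soft-thresholding (ISTA)."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the smooth part
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        z = beta - X.T @ (X @ beta - y) / L
        beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return beta

rng = np.random.default_rng(0)
n, p, m = 100, 20, 50                      # compress n=100 examples down to m=50
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -3.0, 1.5]           # sparse ground truth
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.01 * rng.standard_normal(n)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random compression matrix
beta_hat = ista_lasso(Phi @ X, Phi @ y, lam=0.5)
```

The regression only ever sees `Phi @ X` and `Phi @ y`, never the original rows, which is the privacy motivation; the paper's contribution is characterizing when the sparse support still survives this compression.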
Title: A Novel Model of Working Set Selection for SMO Decomposition Methods |
Abstract: In the process of training Support Vector Machines (SVMs) by decomposition methods, working set selection is an important technique, and several promising schemes have been employed in this field. To improve working set selection, we propose a new model for working set selection in sequential minimal optimization (SMO) decomposition methods. In this model, B is selected as the working set without reselection. Some properties are established by simple proofs, and experiments demonstrate that the proposed method is in general faster than existing methods. |
Title: Construction of Bayesian Deformable Models via Stochastic Approximation Algorithm: A Convergence Study |
Abstract: The problem of the definition and estimation of generative models based on deformable templates from raw data is of particular importance for modelling non-aligned data affected by various types of geometrical variability. This is especially true in shape modelling in the computer vision community and in probabilistic atlas building for Computational Anatomy (CA). A first coherent statistical framework modelling the geometrical variability as hidden variables was given by Allassonni\`ere, Amit and Trouv\'e (JRSS 2006). Setting the problem in a Bayesian context, they proved the consistency of the MAP estimator and provided a simple iterative deterministic algorithm with an EM flavour, leading to reasonable approximations of the MAP estimator under low noise conditions. In this paper we present a stochastic algorithm for approximating the MAP estimator in the spirit of the SAEM algorithm. We prove its convergence to a critical point of the observed likelihood and illustrate it on images of handwritten digits. |
Title: Epistemic Analysis of Strategic Games with Arbitrary Strategy Sets |
Abstract: We provide here an epistemic analysis of arbitrary strategic games based on the possibility correspondences. Such an analysis calls for the use of transfinite iterations of the corresponding operators. Our approach is based on Tarski's Fixpoint Theorem and applies both to the notions of rationalizability and the iterated elimination of strictly dominated strategies. |
Title: Design, Implementation, and Cooperative Coevolution of an Autonomous/ Teleoperated Control System for a Serpentine Robotic Manipulator |
Abstract: Design, implementation, and machine learning issues associated with developing a control system for a serpentine robotic manipulator are explored. The controller developed provides autonomous control of the serpentine robotic manipulator during operation of the manipulator within an enclosed environment such as an underground storage tank. The controller algorithms make use of both low-level joint angle control employing force/position feedback constraints, and high-level coordinated control of end-effector positioning. This approach has resulted in both high-level full robotic control and low-level telerobotic control modes, and provides a high level of dexterity for the operator. |
Title: Power-law distributions in empirical data |
Abstract: Power-law distributions occur in many situations of scientific interest and have significant consequences for our understanding of natural and man-made phenomena. Unfortunately, the detection and characterization of power laws is complicated by the large fluctuations that occur in the tail of the distribution -- the part of the distribution representing large but rare events -- and by the difficulty of identifying the range over which power-law behavior holds. Commonly used methods for analyzing power-law data, such as least-squares fitting, can produce substantially inaccurate estimates of parameters for power-law distributions, and even in cases where such methods return accurate answers they are still unsatisfactory because they give no indication of whether the data obey a power law at all. Here we present a principled statistical framework for discerning and quantifying power-law behavior in empirical data. Our approach combines maximum-likelihood fitting methods with goodness-of-fit tests based on the Kolmogorov-Smirnov statistic and likelihood ratios. We evaluate the effectiveness of the approach with tests on synthetic data and give critical comparisons to previous approaches. We also apply the proposed methods to twenty-four real-world data sets from a range of different disciplines, each of which has been conjectured to follow a power-law distribution. In some cases we find these conjectures to be consistent with the data while in others the power law is ruled out. |
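The two ingredients named in the abstract, maximum-likelihood fitting and a Kolmogorov-Smirnov goodness-of-fit statistic, can be sketched for the continuous power law with known xmin. This is a simplified illustration: the published method also estimates xmin and adds likelihood-ratio comparisons against alternative distributions, both omitted here.

```python
import numpy as np

def fit_alpha(x, xmin):
    """Continuous power-law MLE: alpha_hat = 1 + n / sum(log(x_i / xmin))."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + len(x) / np.sum(np.log(x / xmin))

def ks_stat(x, xmin, alpha):
    """Kolmogorov-Smirnov distance between empirical and fitted CDFs on the tail."""
    x = np.sort(np.asarray(x, dtype=float))
    x = x[x >= xmin]
    cdf_fit = 1.0 - (x / xmin) ** (1.0 - alpha)
    cdf_emp = np.arange(1, len(x) + 1) / len(x)
    return np.max(np.abs(cdf_emp - cdf_fit))

rng = np.random.default_rng(1)
alpha_true, xmin = 2.5, 1.0
x = xmin * (1.0 - rng.random(5000)) ** (-1.0 / (alpha_true - 1.0))  # inverse-CDF sampling
alpha_hat = fit_alpha(x, xmin)
```

Unlike a least-squares fit to the log-log histogram, the MLE is unbiased in the large-sample limit, and the KS distance gives a handle for testing whether a power law is plausible at all.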
Title: Automatically Restructuring Practice Guidelines using the GEM DTD |
Abstract: This paper describes a system capable of semi-automatically filling an XML template from free texts in the clinical domain (practice guidelines). The XML template includes semantic information not explicitly encoded in the text (pairs of conditions and actions/recommendations). Therefore, there is a need to compute the exact scope of conditions over text sequences expressing the required actions. We present a system developed for this task. We show that it yields good performance when applied to the analysis of French practice guidelines. |
Title: Bayesian Covariance Matrix Estimation using a Mixture of Decomposable Graphical Models |
Abstract: A Bayesian approach is used to estimate the covariance matrix of Gaussian data. Ideas from Gaussian graphical models and model selection are used to construct a prior for the covariance matrix that is a mixture over all decomposable graphs. For this prior the probability of each graph size is specified by the user and graphs of equal size are assigned equal probability. Most previous approaches assume that all graphs are equally probable. We show empirically that the prior that assigns equal probability over graph sizes outperforms the prior that assigns equal probability over all graphs, both in identifying the correct decomposable graph and in more efficiently estimating the covariance matrix. |
Title: Temporal Reasoning without Transitive Tables |
Abstract: Representing and reasoning about qualitative temporal information is an essential part of many artificial intelligence tasks. Many models have been proposed in the literature for representing such temporal information, all deriving from a point-based or an interval-based framework. One fundamental reasoning task that arises in applications of these frameworks is given by the following scheme: given possibly indefinite and incomplete knowledge of the binary relationships between some temporal objects, find the consistent scenarios between all these objects. All these models require transitive tables -- or, similarly, inference rules -- for solving such tasks. We have defined an alternative model, S-languages, to represent qualitative temporal information, based on the only two relations of and . In this paper, we show how this model makes it possible to avoid transitive tables or inference rules when handling this kind of problem. |
Title: Sensitivity of principal Hessian direction analysis |
Abstract: We provide sensitivity comparisons for two competing versions of the dimension reduction method principal Hessian directions (pHd). These comparisons consider the effects of small perturbations on the estimation of the dimension reduction subspace via the influence function. We show that the two versions of pHd can behave completely differently in the presence of certain observational types. Our results also provide evidence that outliers in the traditional sense may or may not be highly influential in practice. Since influential observations may lurk within otherwise typical data, we consider the influence function in the empirical setting for the efficient detection of influential observations in practice. |
Title: Coherence and phase synchronization: generalization to pairs of multivariate time series, and removal of zero-lag contributions |
Abstract: Coherence and phase synchronization between time series corresponding to different spatial locations are usually interpreted as indicators of the connectivity between locations. In neurophysiology, time series of electric neuronal activity are essential for studying brain interconnectivity. Such signals can either be invasively measured from depth electrodes, or computed from very high time resolution, non-invasive, extracranial recordings of scalp electric potential differences (EEG: electroencephalogram) and magnetic fields (MEG: magnetoencephalogram) by means of a tomography such as sLORETA (standardized low resolution brain electromagnetic tomography). There are two problems in this case. First, in the usual situation of unknown cortical geometry, the estimated signal at each brain location is a vector with three components (i.e. a current density vector), which means that coherence and phase synchronization must be generalized to pairs of multivariate time series. Second, the inherent low spatial resolution of the EEG/MEG tomography introduces artificially high zero-lag coherence and phase synchronization. In this report, solutions to both problems are presented. Two additional generalizations are briefly mentioned: (1) conditional coherence and phase synchronization; and (2) non-stationary time-frequency analysis. Finally, a non-parametric randomization method for connectivity significance testing is outlined. The new connectivity measures proposed here can be applied to pairs of univariate EEG/MEG signals, as is traditional in the published literature. However, these calculations cannot be interpreted as connectivity, since it is in general incorrect to associate an extracranial electrode or sensor to the underlying cortex. |
Title: Towards understanding and modelling office daily life |
Abstract: Measuring and modeling human behavior is a very complex task. In this paper we present our initial thoughts on modeling and automatic recognition of some human activities in an office. We argue that to successfully model human activities, we need to consider both individual behavior and group dynamics. To demonstrate these theoretical approaches, we introduce an experimental system for analyzing everyday activity in our office. |
Title: Getting started in probabilistic graphical models |
Abstract: Probabilistic graphical models (PGMs) have become a popular tool for computational analysis of biological data in a variety of domains. But, what exactly are they and how do they work? How can we use PGMs to discover patterns that are biologically relevant? And to what extent can PGMs help us formulate new hypotheses that are testable at the bench? This note sketches out some answers and illustrates the main ideas behind the statistical approach to biological pattern discovery. |
Title: Some questions of Monte-Carlo modeling on nontrivial bundles |
Abstract: In this work, some questions of Monte-Carlo modeling on nontrivial bundles are considered. As a basic example, the problem of generating straight lines in 3D space is used, related to modeling the interaction of a solid body with a flux of particles and to some other tasks. The space of lines used in the given model is an example of a nontrivial fiber bundle, equivalent to the tangent bundle of a sphere. |
Title: Statistical testing procedure for the interaction effects of several controllable factors in two-valued input-output systems |
Abstract: Suppose several two-valued input-output systems are designed by setting the levels of several controllable factors. For this situation, the Taguchi method proposes assigning the controllable factors to an orthogonal array and using an ANOVA model for the standardized SN ratio, which is a natural measure for evaluating the performance of each input-output system. Though this procedure is simple and useful in applications, the result can be unreliable when the estimated standard errors of the standardized SN ratios are unbalanced. In this paper, we treat the data arising from full factorial or fractional factorial designs of several controllable factors as the frequencies of high-dimensional contingency tables, and propose a general testing procedure for the main effects or the interaction effects of the controllable factors. |
Title: A tutorial on conformal prediction |
Abstract: Conformal prediction uses past experience to determine precise levels of confidence in new predictions. Given an error probability $\epsilon$, together with a method that makes a prediction $\hat{y}$ of a label $y$, it produces a set of labels, typically containing $\hat{y}$, that also contains $y$ with probability $1-\epsilon$. Conformal prediction can be applied to any method for producing $\hat{y}$: a nearest-neighbor method, a support-vector machine, ridge regression, etc. Conformal prediction is designed for an on-line setting in which labels are predicted successively, each one being revealed before the next is predicted. The most novel and valuable feature of conformal prediction is that if the successive examples are sampled independently from the same distribution, then the successive predictions will be right $1-\epsilon$ of the time, even though they are based on an accumulating dataset rather than on independent datasets. In addition to the model under which successive examples are sampled independently, other on-line compression models can also use conformal prediction. The widely used Gaussian linear model is one of these. This tutorial presents a self-contained account of the theory of conformal prediction and works through several numerical examples. A more comprehensive treatment of the topic is provided in "Algorithmic Learning in a Random World", by Vladimir Vovk, Alex Gammerman, and Glenn Shafer (Springer, 2005). |
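The mechanics can be conveyed with a simplified (split, rather than on-line) variant: nonconformity scores on held-out calibration data yield a quantile that widens point predictions into a set with $1-\epsilon$ coverage. The least-squares predictor and all names below are illustrative, not part of the tutorial:

```python
import numpy as np

def conformal_interval(x_tr, y_tr, x_cal, y_cal, x_new, eps=0.1):
    """Split-conformal interval around a least-squares point prediction."""
    A = np.c_[np.ones(len(x_tr)), x_tr]
    w, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
    pred = lambda x: w[0] + w[1] * x
    scores = np.sort(np.abs(y_cal - pred(x_cal)))          # nonconformity scores
    k = int(np.ceil((1.0 - eps) * (len(scores) + 1))) - 1  # conformal quantile index
    q = scores[min(k, len(scores) - 1)]
    m = pred(x_new)
    return m - q, m + q

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 10.0, 400)
y = 3.0 * x + 1.0 + rng.normal(0.0, 1.0, 400)
lo, hi = conformal_interval(x[:200], y[:200], x[200:], y[200:], x_new=5.0)
```

The validity guarantee needs only exchangeability of calibration and test points, not that the least-squares model is correct; a bad predictor simply yields wider intervals.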
Title: Separating populations with wide data: A spectral analysis |
Abstract: In this paper, we consider the problem of partitioning a small data sample drawn from a mixture of $k$ product distributions. We are interested in the case that individual features are of low average quality $\gamma$, and we want to use as few of them as possible to correctly partition the sample. We analyze a spectral technique that is able to approximately optimize the total data size -- the product of the number of data points $n$ and the number of features $K$ -- needed to correctly perform this partitioning as a function of $1/\gamma$ for $K>n$. Our goal is motivated by an application in clustering individuals according to their population of origin using markers, when the divergence between any two of the populations is small. |
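The flavor of such a spectral technique, though not the paper's own algorithm, can be sketched in the $K>n$ regime: project the centered data onto its top singular vector and threshold. The mixture below, two shifted product distributions with hypothetical sizes and separation, is for illustration only:

```python
import numpy as np

rng = np.random.default_rng(3)
n, K = 40, 400                       # few samples, many features (K > n)
shift = 1.0                          # per-feature mean separation (weak individual features)
labels = rng.integers(0, 2, n)       # hidden population of origin
X = shift * labels[:, None] + rng.standard_normal((n, K))
u = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)[0][:, 0]
pred = (u > 0).astype(int)           # threshold the top left singular vector
```

No single feature separates the two populations reliably here, but pooling all K of them through the leading singular vector does, which is the wide-data effect the abstract studies.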
Title: Undercomplete Blind Subspace Deconvolution via Linear Prediction |
Abstract: We present a novel solution technique for the blind subspace deconvolution (BSSD) problem, where a temporal convolution of multidimensional hidden independent components is observed and the task is to uncover the hidden components using the observation only. We carry out this task for the undercomplete case (uBSSD): we reduce the original uBSSD task via linear prediction to independent subspace analysis (ISA), which we can solve. As has been shown recently, applying temporal concatenation can also reduce uBSSD to ISA, but the associated ISA problem can easily become `high dimensional' [1]. The new reduction method circumvents this dimensionality problem. We perform detailed studies on the efficiency of the proposed technique by means of numerical simulations. We have found several advantages: our method can achieve high-quality estimations from a smaller number of samples, and it can cope with deeper temporal convolutions. |
Title: The SSM Toolbox for Matlab |
Abstract: State Space Models (SSM) is a MATLAB 7.0 software toolbox for doing time series analysis by state space methods. The software features fully interactive construction and combination of models, with support for univariate and multivariate models, complex time-varying (dynamic) models, non-Gaussian models, and various standard models such as ARIMA and structural time-series models. The software includes standard functions for Kalman filtering and smoothing, simulation smoothing, likelihood evaluation, parameter estimation, signal extraction and forecasting, with incorporation of exact initialization for filters and smoothers, and support for missing observations and multiple time series input with common analysis structure. The software also includes implementations of TRAMO model selection and Hillmer-Tiao decomposition for ARIMA models. The software will provide a general toolbox for doing time series analysis on the MATLAB platform, allowing users to take advantage of its readily available graph plotting and general matrix computation capabilities. |
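The toolbox itself is MATLAB code; as an illustration of the Kalman filtering recursion at the core of such state-space software, here is a minimal Python sketch for the local-level model (function name and all parameter values are hypothetical):

```python
import numpy as np

def kalman_filter_local_level(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    """Kalman filter for the local-level model:
       y_t = mu_t + eps_t,   mu_t = mu_{t-1} + eta_t."""
    a, p = a0, p0                       # diffuse-ish initialization
    out = []
    for yt in y:
        f = p + sigma_eps2              # prediction-error variance
        k = p / f                       # Kalman gain
        a = a + k * (yt - a)            # filtered state mean
        p = p * (1.0 - k) + sigma_eta2  # filtered variance, then one-step state prediction
        out.append(a)
    return np.array(out)
```

Full-featured software like SSM wraps this same recursion with exact diffuse initialization, missing-data handling, smoothing, and likelihood evaluation for general multivariate models.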