Title: A note on extension of sliced average variance estimation to multivariate regression
|
Abstract: This paper has been withdrawn because the Editor of the Electronic Journal of Statistics declined the paper.
|
Title: Random projection trees for vector quantization
|
Abstract: A simple and computationally efficient scheme for tree-structured vector quantization is presented. Unlike previous methods, its quantization error depends only on the intrinsic dimension of the data distribution, rather than the apparent dimension of the space in which the data happen to lie.
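The paper's construction is not reproduced here; as a hedged sketch of the random-projection split that such trees are typically built from (the median cut rule and all names below are illustrative assumptions, not necessarily the paper's exact procedure):

```python
import random

def rp_split(points, dim, seed=0):
    """Split points by projecting onto a random Gaussian direction and
    cutting at the median projection (one cell split of an RP tree)."""
    rng = random.Random(seed)
    direction = [rng.gauss(0.0, 1.0) for _ in range(dim)]
    proj = [sum(x * d for x, d in zip(p, direction)) for p in points]
    cut = sorted(proj)[len(proj) // 2]  # median projection value
    left = [p for p, v in zip(points, proj) if v < cut]
    right = [p for p, v in zip(points, proj) if v >= cut]
    return left, right
```

Recursing on `left` and `right` yields a tree whose cells adapt to the data's intrinsic dimension rather than the ambient dimension, which is the property the abstract highlights.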
|
Title: Adaptive estimation of a distribution function and its density in sup-norm loss by wavelet and spline projections
|
Abstract: Given an i.i.d. sample from a distribution $F$ on $\mathbb{R}$ with uniformly continuous density $p_0$, purely data-driven estimators are constructed that efficiently estimate $F$ in sup-norm loss and simultaneously estimate $p_0$ at the best possible rate of convergence over H\"older balls, also in sup-norm loss. The estimators are obtained by applying a model selection procedure close to Lepski's method with random thresholds to projections of the empirical measure onto spaces spanned by wavelets or $B$-splines. The random thresholds are based on suprema of Rademacher processes indexed by wavelet or spline projection kernels. This requires Bernstein-type analogs of the inequalities in Koltchinskii [Ann. Statist. 34 (2006) 2593-2656] for the deviation of suprema of empirical processes from their Rademacher symmetrizations.
|
Title: Uniform limit theorems for wavelet density estimators
|
Abstract: Let $p_n(y)=\sum_k\hat\alpha_k\phi(y-k)+\sum_{l=0}^{j_n-1}\sum_k\hat\beta_{lk}2^{l/2}\psi(2^ly-k)$ be the linear wavelet density estimator, where $\phi$, $\psi$ are a father and a mother wavelet (with compact support), $\hat\alpha_k$, $\hat\beta_{lk}$ are the empirical wavelet coefficients based on an i.i.d. sample of random variables distributed according to a density $p_0$ on $\mathbb{R}$, and $j_n\in\mathbb{Z}$, $j_n\nearrow\infty$. Several uniform limit theorems are proved: First, the almost sure rate of convergence of $\sup_{y\in\mathbb{R}}|p_n(y)-Ep_n(y)|$ is obtained, and a law of the logarithm for a suitably scaled version of this quantity is established. This implies that $\sup_{y\in\mathbb{R}}|p_n(y)-p_0(y)|$ attains the optimal almost sure rate of convergence for estimating $p_0$, if $j_n$ is suitably chosen. Second, a uniform central limit theorem as well as strong invariance principles for the distribution function of $p_n$, that is, for the stochastic processes $\sqrt{n}(F_n^W(s)-F(s))=\sqrt{n}\int_{-\infty}^s(p_n-p_0)$, $s\in\mathbb{R}$, are proved; and more generally, uniform central limit theorems for the processes $\sqrt{n}\int(p_n-p_0)f$, $f\in\mathcal{F}$, for other Donsker classes $\mathcal{F}$ of interest are considered. As a statistical application, it is shown that essentially the same limit theorems can be obtained for the hard thresholding wavelet estimator introduced by Donoho et al. [Ann. Statist. 24 (1996) 508--539].
|
Title: A Fast Algorithm and Datalog Inexpressibility for Temporal Reasoning
|
Abstract: We introduce a new tractable temporal constraint language, which strictly contains the Ord-Horn language of B\"urckert and Nebel and the class of AND/OR precedence constraints. The algorithm we present for this language decides whether a given set of constraints is consistent in time that is quadratic in the input size. We also prove that (unlike Ord-Horn) this language cannot be solved by Datalog or by establishing local consistency.
|
Title: On-line Learning of an Unlearnable True Teacher through Mobile Ensemble Teachers
|
Abstract: On-line learning of a hierarchical learning model is studied by a method from statistical mechanics. In our model, a simple-perceptron student learns not from the true teacher directly, but from ensemble teachers who learn from the true teacher with a perceptron learning rule. Since the true teacher is a non-monotonic perceptron and the ensemble teachers are simple perceptrons, the ensemble teachers go around the unlearnable true teacher at a fixed distance in the asymptotic steady state. The generalization performance of the student is shown to exceed that of the ensemble teachers in a transient state, as was shown in similar ensemble-teachers models. Further, it is found that moving the ensemble teachers even in the steady state, in contrast to keeping them fixed, improves the performance of the student.
|
Title: Overall and Pairwise Segregation Tests Based on Nearest Neighbor Contingency Tables
|
Abstract: Multivariate interaction between two or more classes (or species) has important consequences in many fields and causes multivariate clustering patterns such as segregation or association. Spatial segregation occurs when members of a class tend to be found near members of the same class (i.e., near conspecifics), while spatial association occurs when members of a class tend to be found near members of the other class or classes. These patterns can be studied using a nearest neighbor contingency table (NNCT). The null hypothesis is randomness in the nearest neighbor (NN) structure, which may result from -- among other patterns -- random labeling (RL) or complete spatial randomness (CSR) of points from two or more classes (called CSR independence, henceforth). In this article, we introduce new versions of overall and cell-specific tests based on NNCTs (i.e., NNCT-tests) and compare them with Dixon's overall and cell-specific tests. These NNCT-tests provide information on the spatial interaction between the classes at small scales (i.e., around the average NN distances between the points). Overall tests are used to detect any deviation from the null case, while the cell-specific tests are post hoc pairwise spatial interaction tests that are applied when the overall test yields a significant result. We analyze the distributional properties of these tests and assess their finite sample performance by an extensive Monte Carlo simulation study. Furthermore, we show that the new NNCT-tests have better performance in terms of Type I error and power. We also illustrate these NNCT-tests on two real-life data sets.
|
Title: Swarm-Based Spatial Sorting
|
Abstract: Purpose: To present an algorithm for spatially sorting objects into an annular structure. Design/Methodology/Approach: A swarm-based model that requires only stochastic agent behaviour coupled with a pheromone-inspired "attraction-repulsion" mechanism. Findings: The algorithm consistently generates high-quality annular structures, and is particularly powerful in situations where the initial configuration of objects is similar to those observed in nature. Research limitations/implications: Experimental evidence supports previous theoretical arguments about the nature and mechanism of spatial sorting by insects. Practical implications: The algorithm may find applications in distributed robotics. Originality/value: The model offers a powerful minimal algorithmic framework, and also sheds further light on the nature of attraction-repulsion algorithms and underlying natural processes.
|
Title: Panel Cointegration with Global Stochastic Trends
|
Abstract: This paper studies estimation of panel cointegration models with cross-sectional dependence generated by unobserved global stochastic trends. The standard least squares estimator is, in general, inconsistent owing to the spuriousness induced by the unobservable I(1) trends. We propose two iterative procedures that jointly estimate the slope parameters and the stochastic trends. The resulting estimators are referred to as the CupBC (continuously-updated and bias-corrected) and CupFM (continuously-updated and fully-modified) estimators, respectively. We establish their consistency and derive their limiting distributions. Both are asymptotically unbiased and asymptotically mixed normal and permit inference to be conducted using standard test statistics. The estimators are also valid when there are mixed stationary and non-stationary factors, as well as when the factors are all stationary.
|
Title: Distributed Self Management for Distributed Security Systems
|
Abstract: Distributed systems such as artificial immune systems, complex adaptive systems, or multi-agent systems are widely used in computer science, e.g. for network security, optimisation, or simulation. In these systems, small entities move through the network and perform certain tasks. At some point, an entity moves to another place and therefore requires information about where moving is most profitable. Commonly used systems either provide no such information or use a centralised approach in which a center delegates the entities. This article discusses whether limited information about the neighbours enhances the performance of the overall system. To this end, two information protocols are introduced and analysed. In addition, the protocols are implemented and tested using the artificial immune system SANA, which protects a network against intrusions.
|
Title: Next Challenges in Bringing Artificial Immune Systems to Production in Network Security
|
Abstract: The human immune system protects the human body against various pathogens such as biological viruses and bacteria. Artificial immune systems reuse the architecture, organization, and workflows of the human immune system for various problems in computer science. In network security, artificial immune systems are used to secure a network and its nodes against intrusions such as viruses, worms, and trojans. However, these approaches are far from production: they are academic proof-of-concept implementations, or they use only a small part of the approach to protect against a specific intrusion. This article discusses the steps required to bring artificial immune systems into production in the network security domain. It furthermore identifies the challenges and provides a description and results for the prototype of an artificial immune system called SANA.
|
Title: A New Algorithm for Interactive Structural Image Segmentation
|
Abstract: This paper proposes a novel algorithm for the problem of structural image segmentation through an interactive model-based approach. Interaction is expressed in the model creation, which is done according to user traces drawn over a given input image. Both model and input are then represented by means of attributed relational graphs derived on the fly. Appearance features are taken into account as object attributes and structural properties are expressed as relational attributes. To cope with possible topological differences between both graphs, a new structure called the deformation graph is introduced. The segmentation process corresponds to finding a labelling of the input graph that minimizes the deformations introduced in the model when it is updated with input information. This approach has been shown to be faster than other segmentation methods, with competitive output quality. Therefore, the method solves the problem of multiple label segmentation in an efficient way. Encouraging results on both natural and target-specific color images, as well as examples showing the reusability of the model, are presented and discussed.
|
Title: Confidence regions for the multinomial parameter with small sample size
|
Abstract: Consider the observation of n iid realizations of an experiment with d>1 possible outcomes, which corresponds to a single observation of a multinomial distribution M(n,p) where p is an unknown discrete distribution on {1,...,d}. In many applications, the construction of a confidence region for p when n is small is crucial. This concrete challenging problem has a long history. It is well known that the confidence regions built from asymptotic statistics do not have good coverage when n is small. On the other hand, most available methods providing non-asymptotic regions with controlled coverage are limited to the binomial case d=2. In the present work, we propose a new method valid for any d>1. This method provides confidence regions with controlled coverage and small volume, and consists of the inversion of the "covering collection" associated with level-sets of the likelihood. The behavior when d/n tends to infinity remains an interesting open problem beyond the scope of this work.
|
Title: Algorithms and Bounds for Rollout Sampling Approximate Policy Iteration
|
Abstract: Several approximate policy iteration schemes without value functions, which focus on policy representation using classifiers and address policy learning as a supervised learning problem, have been proposed recently. Finding good policies with such methods requires not only an appropriate classifier, but also reliable examples of best actions, covering the state space sufficiently. Up to this time, little work has been done on appropriate covering schemes and on methods for reducing the sample complexity of such methods, especially in continuous state spaces. This paper focuses on the simplest possible covering scheme (a discretized grid over the state space) and performs a sample-complexity comparison between the simplest (and previously commonly used) rollout sampling allocation strategy, which allocates samples equally at each state under consideration, and an almost as simple method, which allocates samples only as needed and requires significantly fewer samples.
|
Title: Rollout Sampling Approximate Policy Iteration
|
Abstract: Several researchers have recently investigated the connection between reinforcement learning and classification. We are motivated by proposals of approximate policy iteration schemes without value functions which focus on policy representation using classifiers and address policy learning as a supervised learning problem. This paper proposes variants of an improved policy iteration scheme which addresses the core sampling problem in evaluating a policy through simulation as a multi-armed bandit machine. The resulting algorithm offers performance comparable to the previous algorithm, achieved, however, with significantly less computational effort. An order of magnitude improvement is demonstrated experimentally in two standard reinforcement learning domains: inverted pendulum and mountain-car.
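The abstract casts rollout allocation as a multi-armed bandit. As background, a minimal UCB1 index rule (a standard bandit allocation strategy; the paper's exact variant may differ) could look like:

```python
import math

def ucb1_pick(counts, means, total):
    """Pick the arm (candidate action) maximizing the UCB1 index,
    so rollouts concentrate on the most promising actions."""
    best, best_idx = -float("inf"), 0
    for i, (n, m) in enumerate(zip(counts, means)):
        if n == 0:
            return i  # sample every arm at least once
        index = m + math.sqrt(2.0 * math.log(total) / n)
        if index > best:
            best, best_idx = index, i
    return best_idx
```

Repeatedly picking an arm, rolling out, and updating `counts`/`means` allocates simulation effort adaptively instead of equally across actions.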
|
Title: On the elicitation of continuous, symmetric, unimodal distributions
|
Abstract: In this brief note, we highlight some difficulties that can arise when fitting a continuous, symmetric, unimodal distribution to a set of an expert's judgements. A simple analysis shows it is possible to fit a Cauchy distribution to an expert's beliefs when those beliefs actually follow a normal distribution. This example stresses the need for careful distribution fitting and for feedback to the expert about what the fitted distribution implies about their beliefs.
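The difficulty is easy to reproduce when elicitation records only a median and quartiles: those judgements determine the location and scale of either family, so a Cauchy and a normal fit the same inputs exactly. A minimal sketch under that quartile-matching assumption:

```python
def cauchy_from_quartiles(median, upper_quartile):
    # The quartiles of a Cauchy sit at location +/- scale,
    # so quartile matching gives the fit directly.
    return median, upper_quartile - median

def normal_from_quartiles(median, upper_quartile):
    # For a normal, the upper quartile is mu + z_{0.75} * sigma.
    Z75 = 0.674489750196  # standard normal 75th percentile
    return median, (upper_quartile - median) / Z75
```

Both fits pass through the same three elicited points, yet they imply radically different tail beliefs (the Cauchy has no mean), which is why the note stresses feedback to the expert.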
|
Title: Statistical inference under order restrictions on both rows and columns of a matrix, with an application in toxicology
|
Abstract: We present a general methodology for performing statistical inference on the components of a real-valued matrix parameter for which rows and columns are subject to order restrictions. The proposed estimation procedure is based on an iterative algorithm developed by Dykstra and Robertson (1982) for simple order restriction on rows and columns of a matrix. For any order restrictions on rows and columns of a matrix, sufficient conditions are derived for the algorithm to converge in a single application of row and column operations. The new algorithm is applicable to a broad collection of order restrictions. In practice, it is easy to design a study such that the sufficient conditions derived in this paper are satisfied. For instance, the sufficient conditions are satisfied in a balanced design. Using the estimation procedure developed in this article, a bootstrap test for order restrictions on rows and columns of a matrix is proposed. Computer simulations for ordinal data were performed to compare the proposed test with some existing test procedures in terms of size and power. The new methodology is illustrated by applying it to a set of ordinal data obtained from a toxicological study.
|
Title: Adaptive approximate Bayesian computation
|
Abstract: Sequential techniques can enhance the efficiency of the approximate Bayesian computation algorithm, as in Sisson et al.'s (2007) partial rejection control version. While this method is based upon the theoretical works of Del Moral et al. (2006), the application to approximate Bayesian computation results in a bias in the approximation to the posterior. An alternative version based on genuine importance sampling arguments bypasses this difficulty, in connection with the population Monte Carlo method of Cappe et al. (2004), and it includes an automatic scaling of the forward kernel. When applied to a population genetics example, it compares favourably with two other versions of the approximate algorithm.
|
Title: Estimation of population-level summaries in general semiparametric repeated measures regression models
|
Abstract: This paper considers a wide family of semiparametric repeated measures regression models, in which the main interest is on estimating population-level quantities such as the mean, variance, probabilities, etc. Examples of our framework include generalized linear models for clustered/longitudinal data, among many others. We construct plug-in kernel-based estimators of the population-level quantities and derive their asymptotic distributions. An example involving estimation of the survival function of hemoglobin measures in the Kenya hemoglobin study data is presented to demonstrate our methodology.
|
Title: A nonparametric control chart based on the Mann-Whitney statistic
|
Abstract: Nonparametric or distribution-free charts can be useful in statistical process control problems when there is limited or no knowledge about the underlying process distribution. In this paper, a phase II Shewhart-type chart is considered for location, based on reference data from a phase I analysis and the well-known Mann-Whitney statistic. Control limits are computed using the Lugannani-Rice saddlepoint, Edgeworth, and other approximations along with Monte Carlo estimation. The derivations take account of estimation and the dependence arising from the use of a reference sample. An illustrative numerical example is presented. The in-control performance of the proposed chart is shown to be much superior to that of the classical Shewhart $\bar{X}$ chart. Further comparisons on the basis of some percentiles of the out-of-control conditional run length distribution and the unconditional out-of-control ARL show that the proposed chart is almost as good as the Shewhart $\bar{X}$ chart for the normal distribution, but is more powerful for a heavy-tailed distribution such as the Laplace, or for a skewed distribution such as the Gamma. Interactive software, enabling a complete implementation of the chart, is made available on a website.
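For reference, the Mann-Whitney statistic underlying the chart can be sketched as a pair count (the control limits themselves come from the saddlepoint, Edgeworth, and Monte Carlo approximations described above and are not reproduced here):

```python
def mann_whitney_u(reference, test_sample):
    """Count pairs (x, y) with x < y, plus 0.5 per tie: the
    Mann-Whitney statistic comparing a phase II test sample
    against phase I reference data."""
    u = 0.0
    for x in reference:
        for y in test_sample:
            if x < y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u
```

A phase II chart would signal when this statistic falls outside limits calibrated against the reference sample.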
|
Title: Graph Algorithms for Improving Type-Logical Proof Search
|
Abstract: Proof nets are a graph theoretical representation of proofs in various fragments of type-logical grammar. In spite of this basis in graph theory, there has been relatively little attention to the use of graph theoretic algorithms for type-logical proof search. In this paper we will look at several ways in which standard graph theoretic algorithms can be used to restrict the search space. In particular, we will provide an O(n^4) algorithm for selecting an optimal axiom link at any stage in the proof search as well as an O(kn^3) algorithm for selecting the k best proof candidates.
|
Title: Toward Fuzzy block theory
|
Abstract: This study surveys the fundamentals of fuzzy block theory and its application to assessing the stability of underground openings. By introducing fuzzy concepts into key block theory, the fundamentals of fuzzy block theory are developed in two ways. In the indirect combination, coupling an adaptive Neuro-Fuzzy Inference System (NFIS) with classic block theory allows possible damaged parts around a tunnel to be extracted. In the direct solution, some principles of block theory are rewritten by means of different fuzzy facet theories.
|
Title: A multilateral filtering method applied to airplane runway image
|
Abstract: Considering the features of airport runway image filtering, an improved bilateral filtering method is proposed that removes noise while preserving edges. First, a steerable filter decomposition is used to calculate the sub-band parameters for 4 orientations, and the texture feature matrix is then obtained from the sub-band local median energy. Texture-similarity, spatial-closeness, and color-similarity functions are used to filter the image. The effect of the weighting-function parameters is also analyzed qualitatively. Simulation results on a real airport runway image show that the multilateral filtering is more effective than the standard bilateral filtering.
|
Title: Multiple testing procedures under confounding
|
Abstract: While multiple testing procedures have been the focus of much statistical research, an important facet of the problem is how to deal with possible confounding. Procedures have been developed by authors in genetics and statistics. In this chapter, we relate these proposals. We propose two new multiple testing approaches within this framework. The first combines sensitivity analysis methods with false discovery rate estimation procedures. The second involves construction of shrinkage estimators that utilize the mixture model for multiple testing. The procedures are illustrated with applications to a gene expression profiling experiment in prostate cancer.
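For context on the false-discovery-rate-estimation ingredient mentioned above, a minimal Benjamini-Hochberg step-up rule (a standard baseline, not one of the chapter's new proposals):

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Return the indices of hypotheses rejected by the
    Benjamini-Hochberg step-up procedure at FDR level q."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        # Find the largest rank with p_(rank) <= q * rank / m.
        if pvalues[i] <= q * rank / m:
            k = rank
    return sorted(order[:k])
```

Confounding can distort the p-value distribution that such a rule takes as input, which is the gap the chapter's sensitivity-analysis and shrinkage approaches address.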
|
Title: An optimization problem on the sphere
|
Abstract: We prove existence and uniqueness of the minimizer for the average geodesic distance to the points of a geodesically convex set on the sphere. This implies a corresponding existence and uniqueness result for an optimal algorithm for halfspace learning, when data and target functions are drawn from the uniform distribution.
|
Title: A Kernel Method for the Two-Sample Problem
|
Abstract: We propose a framework for analyzing and comparing distributions, allowing us to design statistical tests to determine if two samples are drawn from different distributions. Our test statistic is the largest difference in expectations over functions in the unit ball of a reproducing kernel Hilbert space (RKHS). We present two tests based on large deviation bounds for the test statistic, while a third is based on the asymptotic distribution of this statistic. The test statistic can be computed in quadratic time, although efficient linear time approximations are available. Several classical metrics on distributions are recovered when the function space used to compute the difference in expectations is allowed to be more general (e.g., a Banach space). We apply our two-sample tests to a variety of problems, including attribute matching for databases using the Hungarian marriage method, where they perform strongly. Excellent performance is also obtained when comparing distributions over graphs, for which these are the first such tests.
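A hedged sketch of the quadratic-time statistic for scalar samples, using a Gaussian RKHS kernel (the kernel choice and the biased form of the estimate are assumptions for illustration):

```python
import math

def mmd2_biased(xs, ys, gamma=1.0):
    """Biased quadratic-time estimate of the squared maximum mean
    discrepancy with kernel k(a, b) = exp(-gamma * (a - b)**2)."""
    def k(a, b):
        return math.exp(-gamma * (a - b) ** 2)
    m, n = len(xs), len(ys)
    kxx = sum(k(a, b) for a in xs for b in xs) / (m * m)
    kyy = sum(k(a, b) for a in ys for b in ys) / (n * n)
    kxy = sum(k(a, b) for a in xs for b in ys) / (m * n)
    return kxx + kyy - 2.0 * kxy
```

The unbiased variant drops the diagonal terms of the within-sample sums; the linear-time approximation mentioned in the abstract instead averages over disjoint sample pairs.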
|
Title: Analysis of hydrocyclone performance based on information granulation theory
|
Abstract: This paper describes the application of information granulation theory to the analysis of hydrocyclone performance. Combining a Self-Organizing Map (SOM) and a Neuro-Fuzzy Inference System (NFIS), crisp and fuzzy granules are obtained (the combination is briefly called SONFIS). Balancing of crisp granules and sub-fuzzy granules, within non-fuzzy information (initial granulation), is carried out in an open-close iteration. Using two criteria, "simplicity of rules" and "adaptive threshold error level", the stability of the algorithm is guaranteed. The proposed method is validated on the hydrocyclone data set.
|
Title: Projected likelihood contrasts for testing homogeneity in finite mixture models with nuisance parameters
|
Abstract: This paper develops a test for homogeneity in finite mixture models where the mixing proportions are known a priori (taken to be 0.5) and a common nuisance parameter is present. Statistical tests based on the notion of Projected Likelihood Contrasts (PLC) are considered. The PLC is a slight modification of the usual likelihood ratio statistic, or Wilks's $\Lambda$, and is similar in spirit to Rao's score test. Theoretical investigations have been carried out to understand the large sample properties of these tests. Simulation studies have been carried out to understand the behavior of the null distribution of the PLC statistic in the case of Gaussian mixtures with unknown means (common variance as nuisance parameter) and unknown variances (common mean as nuisance parameter). The results are in conformity with the theoretical results obtained. Power functions of these tests have been evaluated based on simulations from Gaussian mixtures.
|
Title: Correcting for selection bias via cross-validation in the classification of microarray data
|
Abstract: There is increasing interest in the use of diagnostic rules based on microarray data. These rules are formed by considering the expression levels of thousands of genes in tissue samples taken on patients of known classification with respect to a number of classes, representing, say, disease status or treatment strategy. As the final versions of these rules are usually based on a small subset of the available genes, there is a selection bias that has to be corrected for in the estimation of the associated error rates. We consider the problem using cross-validation. In particular, we present explicit formulae that are useful in explaining the layers of validation that have to be performed in order to avoid improperly cross-validated estimates.
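One way to see the layered validation the formulae describe: gene selection is part of the classification rule, so an honest external cross-validation repeats the selection inside every training fold. A toy sketch (the mean-difference filter score is illustrative, not the paper's rule):

```python
def select_top_genes(X, y, k):
    """Rank genes by absolute mean difference between the two classes
    (a toy filter standing in for any selection rule); keep the k best."""
    n_genes = len(X[0])
    def score(j):
        a = [row[j] for row, label in zip(X, y) if label == 0]
        b = [row[j] for row, label in zip(X, y) if label == 1]
        return abs(sum(a) / len(a) - sum(b) / len(b))
    return sorted(range(n_genes), key=score, reverse=True)[:k]

def cv_folds(n, n_folds):
    """Yield (train_idx, test_idx) pairs; gene selection must be rerun
    on each train_idx to avoid the selection bias discussed above."""
    for f in range(n_folds):
        test = [i for i in range(n) if i % n_folds == f]
        train = [i for i in range(n) if i % n_folds != f]
        yield train, test
```

Selecting genes once on all samples and then cross-validating only the classifier reuses the test labels during selection, which is exactly the optimistic bias being corrected.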
|
Title: Model selection and sensitivity analysis for sequence pattern models
|
Abstract: In this article we propose a maximum a posteriori (MAP) criterion for model selection in the motif discovery problem and investigate conditions under which the MAP asymptotically gives a correct prediction of model size. We also investigate robustness of the MAP to prior specification and provide guidelines for choosing prior hyperparameters for motif models based on sensitivity considerations.
|
Title: A toolkit for a generative lexicon
|
Abstract: In this paper we describe the design of a software toolkit for the construction, maintenance and collaborative use of a Generative Lexicon. To ease its portability and adoption, the tool was built with free and open-source products. We then tested the toolkit and showed that it filters the adequate form of anaphoric reference to the modifier in endocentric compounds.
|
Title: Increasing Linear Dynamic Range of Commercial Digital Photocamera Used in Imaging Systems with Optical Coding
|
Abstract: Methods of increasing the linear optical dynamic range of a commercial photocamera for optical-digital imaging systems are described. These methods allow commercial photocameras to be used for optical measurements. Experimental results are reported.
|
Title: Symmetry in Data Mining and Analysis: A Unifying View based on Hierarchy
|
Abstract: Data analysis and data mining are concerned with unsupervised pattern finding and structure determination in data sets. The data sets themselves are explicitly linked as a form of representation to an observational or otherwise empirical domain of interest. "Structure" has long been understood as symmetry, which can take many forms with respect to any transformation, including point, translational, rotational, and many others. Beginning with the role of number theory in expressing data, we show how we can naturally proceed to hierarchical structures. We show how this both encapsulates traditional paradigms in data analysis and opens up new perspectives on issues of the day, including data mining of massive, high dimensional, heterogeneous data sets. Linkages with other fields are also discussed, including computational logic and symbolic dynamics. The structures in data surveyed here are based on hierarchy, represented as p-adic numbers or an ultrametric topology.
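The hierarchy/ultrametric link above is concrete: the merge heights of a dendrogram satisfy the strong triangle inequality. A small checker, assuming pairwise distances stored by unordered pair:

```python
def is_ultrametric(d):
    """Check the strong triangle inequality
    d(x, z) <= max(d(x, y), d(y, z)) for a symmetric distance
    dict keyed by frozensets of point pairs."""
    points = sorted({p for pair in d for p in pair})
    def dist(a, b):
        return 0.0 if a == b else d[frozenset((a, b))]
    return all(
        dist(x, z) <= max(dist(x, y), dist(y, z)) + 1e-12
        for x in points for y in points for z in points
    )
```

Distances read off a cluster hierarchy (e.g., cophenetic distances) pass this check; generic Euclidean distances typically fail it, which is what makes the hierarchical representation special.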
|