Title: P-values for classification
Abstract: Let $(X,Y)$ be a random variable consisting of an observed feature vector $X\in\mathcal{X}$ and an unobserved class label $Y\in\{1,2,\ldots,L\}$ with unknown joint distribution. In addition, let $D$ be a training data set consisting of $n$ completely observed independent copies of $(X,Y)$. Usual classification procedures provide point predictors (classifiers) $\hat{Y}(X,D)$ of $Y$ or estimate the conditional distribution of $Y$ given $X$. In order to quantify the certainty of classifying $X$ we propose to construct for each $\theta=1,2,\ldots,L$ a p-value $\pi_\theta(X,D)$ for the null hypothesis that $Y=\theta$, treating $Y$ temporarily as a fixed parameter. In other words, the point predictor $\hat{Y}(X,D)$ is replaced with a prediction region for $Y$ with a certain confidence. We argue that (i) this approach is advantageous over traditional approaches and (ii) any reasonable classifier can be modified to yield nonparametric p-values. We discuss issues such as optimality, single use and multiple use validity, as well as computational and graphical aspects.
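The p-value construction can be illustrated with a small rank-based (conformal-style) sketch; the nonconformity score used here, distance to the class mean, is an illustrative assumption, not the paper's construction:

```python
def class_pvalues(train, x_new):
    """For each candidate label theta, compute a rank-based p-value:
    the (smoothed) proportion of training points of class theta whose
    nonconformity score is at least that of the new point. Score used
    here: absolute distance to the class mean (illustrative choice)."""
    classes = {}
    for x, y in train:
        classes.setdefault(y, []).append(x)
    pvals = {}
    for theta, xs in classes.items():
        mu = sum(xs) / len(xs)
        scores = [abs(x - mu) for x in xs]
        s_new = abs(x_new - mu)
        ge = sum(1 for s in scores if s >= s_new)
        pvals[theta] = (ge + 1) / (len(scores) + 1)  # smoothed rank
    return pvals

# two well-separated 1-D classes; the new point sits inside class 1
train = [(0.1, 1), (0.2, 1), (0.15, 1), (2.0, 2), (2.2, 2), (1.9, 2)]
pv = class_pvalues(train, 0.12)
```

A prediction region with confidence $1-\alpha$ then collects all labels whose p-value exceeds $\alpha$.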
Title: Analysis of Estimation of Distribution Algorithms and Genetic Algorithms on NK Landscapes
Abstract: This study analyzes performance of several genetic and evolutionary algorithms on randomly generated NK fitness landscapes with various values of n and k. A large number of NK problem instances are first generated for each n and k, and the global optimum of each instance is obtained using the branch-and-bound algorithm. Next, the hierarchical Bayesian optimization algorithm (hBOA), the univariate marginal distribution algorithm (UMDA), and the simple genetic algorithm (GA) with uniform and two-point crossover operators are applied to all generated instances. Performance of all algorithms is then analyzed and compared, and the results are discussed.
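An NK landscape is easy to generate and evaluate; the sketch below (neighbourhoods taken as the k cyclically adjacent bits, a common but not the only convention) also finds the global optimum by brute force, standing in for the branch-and-bound step at this small size:

```python
import random

def make_nk(n, k, seed=0):
    """Random NK landscape: bit i contributes a table lookup indexed
    by bit i and its k right neighbours (cyclic); fitness is the mean
    contribution. The cyclic neighbourhood is an assumed convention."""
    rng = random.Random(seed)
    tables = [[rng.random() for _ in range(2 ** (k + 1))] for _ in range(n)]
    def fitness(bits):
        total = 0.0
        for i in range(n):
            idx = 0
            for j in range(k + 1):
                idx = (idx << 1) | bits[(i + j) % n]
            total += tables[i][idx]
        return total / n
    return fitness

f = make_nk(n=8, k=2)
# brute-force global optimum over all 2^8 bit strings
best = max(f([int(b) for b in format(s, '08b')]) for s in range(2 ** 8))
```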
Title: iBOA: The Incremental Bayesian Optimization Algorithm
Abstract: This paper proposes the incremental Bayesian optimization algorithm (iBOA), which modifies standard BOA by removing the population of solutions and using incremental updates of the Bayesian network. iBOA is shown to be able to learn and exploit unrestricted Bayesian networks using incremental techniques for updating both the structure as well as the parameters of the probabilistic model. This represents an important step toward the design of competent incremental estimation of distribution algorithms that can solve difficult nearly decomposable problems scalably and reliably.
Title: From k-SAT to k-CSP: Two Generalized Algorithms
Abstract: Constraint satisfaction problems (CSPs) model many important intractable NP-hard problems such as the propositional satisfiability problem (SAT). Algorithms with non-trivial upper bounds on running time for SAT restricted to bounded clause length k (k-SAT) can be classified into three styles: DPLL-like, PPSZ-like and local search, with local search algorithms having already been generalized to CSP with bounded constraint arity k (k-CSP). We generalize a DPLL-like algorithm in its simplest form and a PPSZ-like algorithm from k-SAT to k-CSP. As far as we know, this is the first attempt to use a PPSZ-like strategy to solve k-CSP, and little prior work has focused on DPLL-like or PPSZ-like strategies for k-CSP.
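For reference, the k-SAT starting point of the generalization, a DPLL-style branching search in its simplest form, can be sketched as follows (a didactic version, not the paper's generalized k-CSP algorithm):

```python
def dpll(clauses, assignment=None):
    """Minimal DPLL for SAT. Clauses are lists of nonzero ints
    (DIMACS style: v means variable v True, -v means v False).
    Returns a satisfying assignment dict, or None if unsatisfiable."""
    if assignment is None:
        assignment = {}
    simplified = []
    for clause in clauses:
        new_clause = []
        satisfied = False
        for lit in clause:
            v, want = abs(lit), lit > 0
            if v in assignment:
                if assignment[v] == want:
                    satisfied = True
                    break
            else:
                new_clause.append(lit)
        if satisfied:
            continue
        if not new_clause:
            return None  # empty clause: conflict under this assignment
        simplified.append(new_clause)
    if not simplified:
        return assignment  # every clause satisfied
    v = abs(simplified[0][0])  # branch on the first unassigned variable
    for val in (True, False):
        result = dpll(simplified, {**assignment, v: val})
        if result is not None:
            return result
    return None

sat = dpll([[1, 2], [-1, 2], [-2, 3]])
unsat = dpll([[1], [-1]])
```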
Title: Online-concordance "Perekhresni stezhky" ("The Cross-Paths"), a novel by Ivan Franko
Abstract: In the article, theoretical principles and practical realization for the compilation of the concordance to "Perekhresni stezhky" ("The Cross-Paths"), a novel by Ivan Franko, are described. Two forms for the context presentation are proposed. The electronic version of this lexicographic work is available online.
Title: On the condensed density of the generalized eigenvalues of pencils of Hankel Gaussian random matrices and applications
Abstract: Pencils of Hankel matrices whose elements have a joint Gaussian distribution with nonzero mean and not identical covariance are considered. An approximation to the distribution of the squared modulus of their determinant is computed, which allows us to obtain a closed-form approximation of the condensed density of the generalized eigenvalues of the pencils. Implications of this result for solving several moment problems are discussed and some numerical examples are provided.
Title: Bayesian models to adjust for response bias in survey data for estimating rape and domestic violence rates from the NCVS
Abstract: It is difficult to accurately estimate the rates of rape and domestic violence due to the sensitive nature of these crimes. There is evidence that bias in estimating the crime rates from survey data may arise because some women respondents are "gagged" in reporting some types of crimes by the use of a telephone rather than a personal interview, and by the presence of a spouse during the interview. On the other hand, as data on these crimes are collected every year, data analysis would be more efficient if we could identify and make use of information from previous data. In this paper we propose a model to adjust the estimates of the rates of rape and domestic violence to account for the response bias due to the "gag" factors. To estimate parameters in the model, we identify the information that is not sensitive to time and incorporate this into prior distributions. The strength of Bayesian estimators is their ability to combine information from long observational records in a sensible way. Within a Bayesian framework, we develop an Expectation-Maximization-Bayesian (EMB) algorithm for computation in analyzing contingency tables, and we apply the jackknife to estimate the accuracy of the estimates. Our approach is illustrated using the yearly crime data from the National Crime Victimization Survey. The illustration shows that, compared with the classical method, our model leads to more efficient estimation but does not require more complicated computation.
Title: On some difficulties with a posterior probability approximation technique
Abstract: In Scott (2002) and Congdon (2006), a new method is advanced to compute posterior probabilities of models under consideration. It is based solely on MCMC outputs restricted to single models, i.e., it bypasses reversible jump and other model exploration techniques. While it is indeed possible to approximate posterior probabilities based solely on MCMC outputs from single models, as demonstrated by Gelfand and Dey (1994) and Bartolucci et al. (2006), we show that the proposals of Scott (2002) and Congdon (2006) are biased and advance several arguments towards this thesis, the primary one being the confusion between model-based posteriors and joint pseudo-posteriors. From a practical point of view, the bias in Scott's (2002) approximation appears to be much more severe than the one in Congdon's (2006), the latter often being of the same magnitude as the posterior probability it approximates, although we also exhibit an example where the divergence from the true posterior probability is extreme.
Title: On the Effects of Idiotypic Interactions for Recommendation Communities in Artificial Immune Systems
Abstract: It has previously been shown that a recommender based on immune system idiotypic principles can outperform one based on correlation alone. This paper reports the results of work in progress, where we undertake some investigations into the nature of this beneficial effect. The initial findings are that the immune system recommender tends to produce different neighbourhoods, and that the superior performance of this recommender is due partly to the different neighbourhoods, and partly to the way that the idiotypic effect is used to weight each neighbour's recommendations.
Title: A Recommender System based on the Immune Network
Abstract: The immune system is a complex biological system with a highly distributed, adaptive and self-organising nature. This paper presents an artificial immune system (AIS) that exploits some of these characteristics and is applied to the task of film recommendation by collaborative filtering (CF). Natural evolution and in particular the immune system have not been designed for classical optimisation. However, for this problem, we are not interested in finding a single optimum. Rather we intend to identify a sub-set of good matches on which recommendations can be based. It is our hypothesis that an AIS built on two central aspects of the biological immune system will be an ideal candidate to achieve this: Antigen - antibody interaction for matching and antibody - antibody interaction for diversity. Computational results are presented in support of this conjecture and compared to those found by other CF techniques.
Title: The Danger Theory and Its Application to Artificial Immune Systems
Abstract: Over the last decade, a new idea challenging the classical self-non-self viewpoint has become popular amongst immunologists. It is called the Danger Theory. In this conceptual paper, we look at this theory from the perspective of Artificial Immune System practitioners. An overview of the Danger Theory is presented with particular emphasis on analogies in the Artificial Immune Systems world. A number of potential application areas are then used to provide a framing for a critical assessment of the concept, and its relevance for Artificial Immune Systems.
Title: Partnering Strategies for Fitness Evaluation in a Pyramidal Evolutionary Algorithm
Abstract: This paper combines the idea of a hierarchical distributed genetic algorithm with different inter-agent partnering strategies. Cascading clusters of sub-populations are built from bottom up, with higher-level sub-populations optimising larger parts of the problem. Hence higher-level sub-populations search a larger search space with a lower resolution whilst lower-level sub-populations search a smaller search space with a higher resolution. The effects of different partner selection schemes for (sub-)fitness evaluation purposes are examined for two multiple-choice optimisation problems. It is shown that random partnering strategies perform best by providing better sampling and more diversity.
Title: A statistical analysis of probabilistic counting algorithms
Abstract: This paper considers the problem of cardinality estimation in data stream applications. We present a statistical analysis of probabilistic counting algorithms, focusing on two techniques that use pseudo-random variates to form low-dimensional data sketches. We apply conventional statistical methods to compare probabilistic algorithms based on storing either selected order statistics, or random projections. We derive estimators of the cardinality in both cases, and show that the maximal-term estimator is recursively computable and has exponentially decreasing error bounds. Furthermore, we show that the estimators have comparable asymptotic efficiency, and explain this result by demonstrating an unexpected connection between the two approaches.
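The order-statistics branch of this family can be sketched with the standard k-minimum-values estimator (an illustrative member of that family, not necessarily the exact estimator derived in the paper):

```python
import hashlib

def hash_to_unit(x):
    """Deterministically hash an item to a pseudo-random point in [0, 1)."""
    digest = hashlib.sha1(repr(x).encode()).hexdigest()
    return int(digest, 16) / 2.0 ** 160

def kmv_estimate(items, k=64):
    """k-minimum-values sketch: with m_k the k-th smallest distinct
    hash value, estimate the number of distinct items as (k - 1) / m_k.
    (For clarity all hashes are collected here; a real streaming sketch
    keeps only the k smallest, e.g. in a heap.)"""
    hashes = sorted({hash_to_unit(it) for it in items})
    if len(hashes) < k:
        return len(hashes)  # sketch not full: the count is exact
    return (k - 1) / hashes[k - 1]

stream = [i % 1000 for i in range(10000)]  # 10,000 items, 1,000 distinct
est = kmv_estimate(stream, k=64)
```

The relative error of this estimator decays at roughly the $1/\sqrt{k}$ rate, consistent with the low-dimensional-sketch framing above.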
Title: Efficient l_alpha Distance Approximation for High Dimensional Data Using alpha-Stable Projection
Abstract: In recent years, large high-dimensional data sets have become commonplace in a wide range of applications in science and commerce. Techniques for dimension reduction are of primary concern in statistical analysis. Projection methods play an important role. We investigate the use of projection algorithms that exploit properties of the alpha-stable distributions. We show that l_alpha distances and quasi-distances can be recovered from random projections with full statistical efficiency by L-estimation. The computational requirements of our algorithm are modest; after a once-and-for-all calculation to determine an array of length k, the algorithm runs in O(k) time for each distance, where k is the reduced dimension of the projection.
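For the special case alpha = 2 the stable law is Gaussian, and recovering l_2 distances from k random projections can be sketched with a simple moment estimator (the paper's L-estimation handles general alpha; this is the easiest instance):

```python
import math
import random

rng = random.Random(1)
d, k = 500, 200  # original and reduced dimension

# projection matrix with i.i.d. standard Gaussian (alpha = 2 stable) entries
proj = [[rng.gauss(0, 1) for _ in range(d)] for _ in range(k)]

def project(vec):
    return [sum(r * v for r, v in zip(row, vec)) for row in proj]

x = [rng.random() for _ in range(d)]
y = [rng.random() for _ in range(d)]
true_dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

# each projected coordinate difference is N(0, true_dist^2), so the
# sample second moment over the k coordinates recovers the distance
px, py = project(x), project(y)
est = math.sqrt(sum((a - b) ** 2 for a, b in zip(px, py)) / k)
```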
Title: A path following algorithm for the graph matching problem
Abstract: We propose a convex-concave programming approach for the labeled weighted graph matching problem. The convex-concave programming formulation is obtained by rewriting the weighted graph matching problem as a least-square problem on the set of permutation matrices and relaxing it to two different optimization problems: a quadratic convex and a quadratic concave optimization problem on the set of doubly stochastic matrices. The concave relaxation has the same global minimum as the initial graph matching problem, but the search for its global minimum is also a hard combinatorial problem. We therefore construct an approximation of the concave problem solution by following a solution path of a convex-concave problem obtained by linear interpolation of the convex and concave formulations, starting from the convex relaxation. This method allows us to easily integrate information on graph label similarities into the optimization problem, and therefore to perform labeled weighted graph matching. The algorithm is compared with some of the best performing graph matching methods on four datasets: simulated graphs, QAPLib, retina vessel images and handwritten Chinese characters. In all cases, the results are competitive with the state-of-the-art.
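For tiny graphs, the least-squares matching objective that the path-following algorithm approximates can be solved exactly by enumerating permutations, which is useful as a correctness baseline:

```python
import itertools

def match_graphs(A, B):
    """Exact least-squares graph matching for tiny graphs: find the
    permutation perm minimizing sum_ij (A[i][j] - B[perm[i]][perm[j]])^2,
    i.e. ||A - P B P^T||_F^2, by enumeration."""
    n = len(A)
    best_perm, best_cost = None, float('inf')
    for perm in itertools.permutations(range(n)):
        cost = sum((A[i][j] - B[perm[i]][perm[j]]) ** 2
                   for i in range(n) for j in range(n))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_perm, best_cost

A = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]  # path graph with centre node 1
B = [[0, 1, 1], [1, 0, 0], [1, 0, 0]]  # same path with centre relabelled 0
perm, cost = match_graphs(A, B)
```

The enumeration is O(n!), which is exactly why the paper relaxes the problem to the doubly stochastic polytope for realistic sizes.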
Title: Higher Accuracy for Bayesian and Frequentist Inference: Large Sample Theory for Small Sample Likelihood
Abstract: Recent likelihood theory produces $p$-values that have remarkable accuracy and wide applicability. The calculations use familiar tools such as maximum likelihood estimates (MLEs), observed information and parameter rescaling. The usual evaluation of such $p$-values is by simulations, and such simulations do verify that the global distribution of the $p$-values is uniform(0, 1), to high accuracy in repeated sampling. The derivation of the $p$-values, however, asserts a stronger statement, that they have a uniform(0, 1) distribution conditionally, given identified precision information provided by the data. We take a simple regression example that involves exact precision information and use large sample techniques to extract highly accurate information as to the statistical position of the data point with respect to the parameter: specifically, we examine various $p$-values and Bayesian posterior survivor $s$-values for validity. With observed data we numerically evaluate the various $p$-values and $s$-values, and we also record the related general formulas. We then assess the numerical values for accuracy using Markov chain Monte Carlo (McMC) methods. We also propose some third-order likelihood-based procedures for obtaining means and variances of Bayesian posterior distributions, again followed by McMC assessment. Finally we propose some adaptive McMC methods to improve the simulation acceptance rates. All these methods are based on asymptotic analysis that derives from the effect of additional data. And the methods use simple calculations based on familiar maximizing values and related informations. The example illustrates the general formulas and the ease of calculations, while the McMC assessments demonstrate the numerical validity of the $p$-values as percentage position of a data point.
The example, however, is very simple and transparent, and thus gives little indication that in a wide generality of models the formulas do accurately separate information for almost any parameter of interest, and then do give accurate $p$-value determinations from that information. As illustration an enigmatic problem in the literature is discussed and simulations are recorded; various examples in the literature are cited.
Title: Robustness Evaluation of Two CCG, a PCFG and a Link Grammar Parsers
Abstract: Robustness in a parser refers to an ability to deal with exceptional phenomena. A parser is robust if it deals with phenomena outside its normal range of inputs. This paper reports on a series of robustness evaluations of state-of-the-art parsers in which we concentrated on one aspect of robustness: the ability to parse sentences containing misspelled words. We propose two measures for robustness evaluation based on a comparison of a parser's output for grammatical input sentences and their noisy counterparts. In this paper, we use these measures to compare the overall robustness of the four evaluated parsers, and we present an analysis of the decline in parser performance with increasing error levels. Our results indicate that performance typically declines by tens of percentage units when parsers are presented with texts containing misspellings. When tested on our purpose-built test set of 443 sentences, the best parser in the experiment (the C&C parser) was able to return exactly the same parse tree for the grammatical and ungrammatical sentences for 60.8%, 34.0% and 14.9% of the sentences with one, two or three misspelled words, respectively.
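The first of the two proposed measures, exact agreement between the parses of a clean sentence and its noisy counterpart, can be sketched generically; the toy parser and noise models below are hypothetical stand-ins, not the evaluated systems:

```python
def exact_match_robustness(parser, sentences, corrupt):
    """Fraction of sentences for which the parser returns an identical
    analysis for the clean input and its noisy counterpart."""
    same = sum(1 for s in sentences if parser(s) == parser(corrupt(s)))
    return same / len(sentences)

# hypothetical stand-ins: "parse" = word-length parity tags;
# noise models: swap the first two characters / prepend a character
toy_parser = lambda s: [len(w) % 2 for w in s.split()]
swap_noise = lambda s: s[1] + s[0] + s[2:]
dup_noise = lambda s: s[0] + s

sentences = ["the cat sat", "dogs ran fast"]
score_swap = exact_match_robustness(toy_parser, sentences, swap_noise)
score_dup = exact_match_robustness(toy_parser, sentences, dup_noise)
```

The toy parser is unaffected by the length-preserving swap but changes its output when a character is added, so the measure discriminates between the two noise models.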
Title: Between conjecture and memento: shaping a collective emotional perception of the future
Abstract: Large scale surveys of public mood are costly and often impractical to perform. However, the web is awash with material indicative of public mood such as blogs, emails, and web queries. Inexpensive content analysis on such extensive corpora can be used to assess public mood fluctuations. The work presented here is concerned with the analysis of the public mood towards the future. Using an extension of the Profile of Mood States questionnaire, we have extracted mood indicators from 10,741 emails submitted in 2006 to futureme.org, a web service that allows its users to send themselves emails to be delivered at a later date. Our results indicate long-term optimism toward the future, but medium-term apprehension and confusion.
Title: On the Scaling Window of Model RB
Abstract: This paper analyzes the scaling window of a random CSP model (i.e. model RB) for which we can identify the threshold points exactly, denoted by $r_{cr}$ or $p_{cr}$. For this model, we establish the scaling window $W(n,\delta)=(r_-(n,\delta),\ r_+(n,\delta))$ such that the probability of a random instance being satisfiable is greater than $1-\delta$ for $r<r_-(n,\delta)$ and is less than $\delta$ for $r>r_+(n,\delta)$. Specifically, we obtain the following result $$W(n,\delta)=\big(r_{cr}-\Theta(n^{1-\epsilon}\ln n),\ r_{cr}+\Theta(n\ln n)\big),$$ where $0\leq\epsilon<1$ is a constant. A similar result with respect to the other parameter $p$ is also obtained. Since the instances generated by model RB have been shown to be hard at the threshold, this is the first attempt, as far as we know, to analyze the scaling window of such a model with hard instances.
Title: Properties of Nested Sampling
Abstract: Nested sampling is a simulation method for approximating marginal likelihoods proposed by Skilling (2006). We establish that nested sampling has an approximation error that vanishes at the standard Monte Carlo rate and that this error is asymptotically Gaussian. We show that the asymptotic variance of the nested sampling approximation typically grows linearly with the dimension of the parameter. We discuss the applicability and efficiency of nested sampling in realistic problems, and we compare it with two current methods for computing marginal likelihood. We propose an extension that avoids resorting to Markov chain Monte Carlo to obtain the simulated points.
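A minimal nested sampling sketch for a uniform(0,1) prior illustrates the estimator; new live points are drawn here by rejection from the prior, which works in this 1-D example but is precisely the step that MCMC (or the extension proposed in the paper) replaces in harder problems:

```python
import math
import random

def nested_sampling(loglike, n_live=100, n_iter=600, seed=0):
    """Estimate the marginal likelihood Z = integral of L over a
    uniform(0,1) prior. At step i the worst live point is credited
    prior mass X_{i-1} - X_i with X_i = exp(-i/n_live), then replaced
    by a prior draw with strictly higher likelihood."""
    rng = random.Random(seed)
    live = [rng.random() for _ in range(n_live)]
    logl = [loglike(t) for t in live]
    z, x_prev = 0.0, 1.0
    for i in range(1, n_iter + 1):
        worst = min(range(n_live), key=lambda j: logl[j])
        x_i = math.exp(-i / n_live)
        z += math.exp(logl[worst]) * (x_prev - x_i)
        x_prev = x_i
        threshold = logl[worst]
        while True:  # rejection sampling from the constrained prior
            t = rng.random()
            if loglike(t) > threshold:
                live[worst], logl[worst] = t, loglike(t)
                break
    # the remaining live points account for the leftover prior mass
    z += sum(math.exp(l) for l in logl) * x_prev / n_live
    return z

sigma = 0.1
loglike = lambda t: -((t - 0.5) ** 2) / (2 * sigma ** 2)
z_hat = nested_sampling(loglike)
z_true = sigma * math.sqrt(2 * math.pi)  # Gaussian integral; (0,1) tails negligible
```

With `n_live` points the error of $\log Z$ shrinks at the standard Monte Carlo rate in `n_live`, matching the asymptotics described above.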
Title: A Tribute to Ingram Olkin
Abstract: It is with pleasure and pride that I introduce this special section in honor of Ingram Olkin. This tribute is especially fitting because, among the many profound and far-reaching contributions that he has made to our profession, Ingram Olkin was the key force behind the genesis of Statistical Science. As put so eloquently by Morrie DeGroot [1], the founding Executive Editor of Statistical Science.
Title: The optimal assignment kernel is not positive definite
Abstract: We prove that the optimal assignment kernel, proposed recently as an attempt to embed labeled graphs and more generally tuples of basic data to a Hilbert space, is in fact not always positive definite.
Title: A Semi-parametric Technique for the Quantitative Analysis of Dynamic Contrast-enhanced MR Images Based on Bayesian P-splines
Abstract: Dynamic Contrast-enhanced Magnetic Resonance Imaging (DCE-MRI) is an important tool for detecting subtle kinetic changes in cancerous tissue. Quantitative analysis of DCE-MRI typically involves the convolution of an arterial input function (AIF) with a nonlinear pharmacokinetic model of the contrast agent concentration. Parameters of the kinetic model are biologically meaningful, but the optimization of the non-linear model has significant computational issues. In practice, convergence of the optimization algorithm is not guaranteed and the accuracy of the model fitting may be compromised. To overcome these problems, this paper proposes a semi-parametric penalized spline smoothing approach, in which the AIF is convolved with a set of B-splines to produce a design matrix using locally adaptive smoothing parameters based on Bayesian penalized spline models (P-splines). It has been shown that kinetic parameter estimates can be obtained from the resulting deconvolved response function, which also includes the onset of contrast enhancement. Detailed validation of the method, both with simulated and in vivo data, is provided.
Title: Computational aspects and applications of a new transform for solving the complex exponentials approximation problem
Abstract: Many real life problems can be reduced to the solution of a complex exponentials approximation problem which is usually ill posed. Recently a new transform for solving this problem, formulated as a specific moments problem in the plane, has been proposed in a theoretical framework. In this work some computational issues are addressed to make this new tool useful in practice. An algorithm is developed and used to solve a Nuclear Magnetic Resonance spectrometry problem, two time series interpolation and extrapolation problems and a shape from moments problem.
Title: Multivariate Meta-Analysis: Contributions of Ingram Olkin
Abstract: The research on meta-analysis and particularly multivariate meta-analysis has been greatly influenced by the work of Ingram Olkin. This paper documents Olkin's contributions by way of citation counts and outlines several areas of contribution by Olkin and his academic descendants. An academic family tree is provided.
Title: Majorization: Here, There and Everywhere
Abstract: The appearance of Marshall and Olkin's 1979 book on inequalities with special emphasis on majorization generated a surge of interest in potential applications of majorization and Schur convexity in a broad spectrum of fields. After 25 years this continues to be the case. The present article presents a sampling of the diverse areas in which majorization has been found to be useful in the past 25 years.
Title: Generalization of Jeffreys' divergence based priors for Bayesian hypothesis testing
Abstract: In this paper we introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence based (DB) priors. DB priors have simple forms and desirable properties, like information (finite sample) consistency; often, they are similar to other existing proposals like the intrinsic priors; moreover, in normal linear models scenarios, they exactly reproduce Jeffreys-Zellner-Siow priors. Most importantly, in challenging scenarios such as irregular models and mixture models, the DB priors are well defined and very reasonable, while alternative proposals are not. We derive approximations to the DB priors as well as MCMC and asymptotic expressions for the associated Bayes factors.
Title: A.-M. Guerry's Moral Statistics of France: Challenges for Multivariable Spatial Analysis
Abstract: Andr\'e-Michel Guerry's (1833) Essai sur la Statistique Morale de la France was one of the foundation studies of modern social science. Guerry assembled data on crimes, suicides, literacy and other ``moral statistics,'' and used tables and maps to analyze a variety of social issues in perhaps the first comprehensive study relating such variables. Indeed, the Essai may be considered the book that launched modern empirical social science, for the questions raised and the methods Guerry developed to try to answer them. Guerry's data consist of a large number of variables recorded for each of the d\'epartments of France in the 1820--1830s and therefore involve both multivariate and geographical aspects. In addition to historical interest, these data provide the opportunity to ask how modern methods of statistics, graphics, thematic cartography and geovisualization can shed further light on the questions he raised. We present a variety of methods attempting to address Guerry's challenge for multivariate spatial statistics.
Title: Movie Recommendation Systems Using An Artificial Immune System
Abstract: We apply Artificial Immune System (AIS) technology to Collaborative Filtering (CF) to build a movie recommendation system. Two different affinity measure algorithms for the AIS, Kendall tau and Weighted Kappa, are used to calculate the correlation coefficients for this movie recommendation system. Our tests suggest that Weighted Kappa is more suitable than Kendall tau for movie recommendation problems.
Title: On Affinity Measures for Artificial Immune System Movie Recommenders
Abstract: We combine Artificial Immune System (AIS) technology with Collaborative Filtering (CF) and use it to build a movie recommendation system. We already know that Artificial Immune Systems work well as movie recommenders from previous work by Cayzer and Aickelin [3, 4, 5]. Here our aim is to investigate the effect of different affinity measure algorithms for the AIS. Two different affinity measures, Kendall's Tau and Weighted Kappa, are used to calculate the correlation coefficients for the movie recommender. We compare the results with those published previously and show that Weighted Kappa is more suitable than others for movie problems. We also show that AIS are generally robust movie recommenders and that, as long as a suitable affinity measure is chosen, results are good.
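Of the two affinity measures, Kendall's Tau is straightforward to implement directly from its definition (concordant minus discordant pairs over all pairs):

```python
def kendall_tau(a, b):
    """Kendall rank correlation of two equal-length rating lists:
    (concordant pairs - discordant pairs) / total pairs.
    Tied pairs count as neither concordant nor discordant."""
    n = len(a)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (a[i] - a[j]) * (b[i] - b[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# two users' ratings of the same five films
tau_agree = kendall_tau([1, 2, 3, 4, 5], [2, 3, 4, 5, 5])
tau_oppose = kendall_tau([1, 2, 3, 4, 5], [5, 4, 3, 2, 1])
```

Values near +1 indicate users who rank films similarly (a candidate neighbour for CF), values near −1 users with opposed tastes.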
Title: Artificial Immune Systems (AIS) - A New Paradigm for Heuristic Decision Making
Abstract: Over the last few years, more and more heuristic decision making techniques have been inspired by nature, e.g. evolutionary algorithms, ant colony optimisation and simulated annealing. More recently, a novel computational intelligence technique inspired by immunology has emerged, called Artificial Immune Systems (AIS). This immune system inspired technique has already been useful in solving some computational problems. In this keynote, we will very briefly describe the immune system metaphors that are relevant to AIS. We will then give some illustrative real-world problems suitable for AIS use and show a step-by-step algorithm walkthrough. A comparison of AIS to other well-known algorithms and areas for future work will round this keynote off. It should be noted that as AIS is still a young and evolving field, there is not yet a fixed algorithm template and hence actual implementations might differ somewhat from the examples given here.
Title: TER: A Robot for Remote Ultrasonic Examination: Experimental Evaluations
Abstract: This chapter motivates the clinical use of robotic tele-echography, introduces the TER system, and describes the technical and clinical evaluations performed with TER.
Title: Fully Bayes factors with a generalized g-prior
Abstract: For the normal linear model variable selection problem, we propose selection criteria based on a fully Bayes formulation with a generalization of Zellner's $g$-prior which allows for $p>n$. A special case of the prior formulation is seen to yield tractable closed forms for marginal densities and Bayes factors which reveal new model evaluation characteristics of potential interest.