Title: Quiescence of Self-stabilizing Gossiping among Mobile Agents in Graphs
Abstract: This paper considers gossiping among mobile agents in graphs: agents move on the graph and have to disseminate their initial information to every other agent. We focus on self-stabilizing solutions for the gossip problem, where agents may start from arbitrary locations in arbitrary states. Self-stabilization requires (some of the) participating agents to keep moving forever, which motivates maximizing the number of agents that can be allowed to stop moving eventually. This paper formalizes the self-stabilizing agent gossip problem, introduces the quiescence number (i.e., the maximum number of eventually stopping agents) of self-stabilizing solutions, and investigates the quiescence number with respect to several assumptions related to agent anonymity, synchrony, link duplex capacity, and whiteboard capacity.
Title: Acquisition Accuracy Evaluation in Visual Inspection Systems - a Practical Approach
Abstract: This paper proposes a set of parameters and methods for the accuracy evaluation of visual inspection systems. The case of a monochrome board is treated, but practically all conclusions and methods may be extended to colour acquisition. Basically, the proposed parameters are grouped into five sets, as follows: internal noise; video ADC quantisation parameters; analogue processing section parameters; dominant frequencies; synchronisation (lock-in) accuracy. Based on this set of parameters, a software environment was developed, in conjunction with a test signal generator that produces the "test" images. The paper also presents conclusions from the evaluation of two types of video acquisition boards.
Title: Multiscale Inference for High-Frequency Data
Abstract: This paper proposes a novel multiscale estimator for the integrated volatility of an Ito process, in the presence of market microstructure noise (observation error). The multiscale structure of the observed process is represented frequency-by-frequency and the concept of the multiscale ratio is introduced to quantify the bias in the realized integrated volatility due to the observation error. The multiscale ratio is estimated from a single sample path, and a frequency-by-frequency bias correction procedure is proposed, which simultaneously reduces variance. We extend the method to include correlated observation errors and provide the implied time domain form of the estimation procedure. The new method is implemented to estimate the integrated volatility for the Heston and other models, and the improved performance of our method over existing methods is illustrated by simulation studies.
Title: An EM algorithm for estimation in the Mixture Transition Distribution model
Abstract: The Mixture Transition Distribution (MTD) model was introduced by Raftery to address the need for parsimony in the modeling of high-order Markov chains in discrete time. The particularity of this model comes from the fact that the effect of each lag upon the present is considered separately and additively, so that the number of parameters required is drastically reduced. However, the MTD parameter estimation methods proposed to date remain problematic on account of the large number of constraints on the parameters. In this paper, an iterative Expectation-Maximization (EM) procedure is developed, based on the principle of Maximum Likelihood Estimation (MLE), to estimate the MTD parameters. Several MTD modeling applications show that the proposed EM algorithm is easier to use than the algorithm developed by Berchtold. Moreover, EM estimates of high-order MTD models fitted on DNA sequences outperform the corresponding fully parametrized Markov chain in terms of the Bayesian Information Criterion. A software implementation of our algorithm is available in the library seq++ at http://stat.genopole.cnrs.fr/seqpp
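As a concrete illustration of the kind of EM iteration described above, the sketch below fits a simple MTD model in which a single transition matrix is shared across all lags (one common MTD variant); the function name, the shared-matrix assumption and the update details are illustrative and are not taken from the paper.

    import numpy as np

    def mtd_em(x, order, n_states, n_iter=200, seed=0):
        # MTD with one transition matrix Q shared across lags:
        # P(x_t = j | past) = sum_g lam[g] * Q[x_{t-g}, j].
        # x is an integer-coded sequence with values in 0..n_states-1.
        rng = np.random.default_rng(seed)
        lam = np.full(order, 1.0 / order)
        Q = rng.random((n_states, n_states))
        Q /= Q.sum(axis=1, keepdims=True)
        x = np.asarray(x)
        T = len(x)
        cur = x[order:]                            # x_t for t = order, ..., T-1
        for _ in range(n_iter):
            # E-step: posterior probability that lag g "explains" position t.
            resp = np.zeros((T - order, order))
            for g in range(order):
                prev = x[order - 1 - g : T - 1 - g]    # x_{t-(g+1)}
                resp[:, g] = lam[g] * Q[prev, cur]
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: re-estimate the lag weights and the shared matrix Q.
            lam = resp.mean(axis=0)
            Q_new = np.zeros_like(Q)
            for g in range(order):
                prev = x[order - 1 - g : T - 1 - g]
                np.add.at(Q_new, (prev, cur), resp[:, g])
            Q = Q_new / (Q_new.sum(axis=1, keepdims=True) + 1e-12)
        return lam, Q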
Title: The Rank of the Covariance Matrix of an Evanescent Field
Abstract: Evanescent random fields arise as a component of the 2-D Wold decomposition of homogeneous random fields. Besides their theoretical importance, evanescent random fields have a number of practical applications, such as in modeling the observed signal in the space-time adaptive processing (STAP) of airborne radar data. In this paper we derive an expression for the rank of the low-rank covariance matrix of a finite-dimension sample from an evanescent random field. It is shown that the rank of this covariance matrix is completely determined by the evanescent field's spectral support parameters alone. Thus, the problem of estimating the rank lends itself to a solution that avoids the need to estimate the rank from the sample covariance matrix. We show that this result can be immediately applied to considerably simplify the estimation of the rank of the interference covariance matrix in the STAP problem.
Title: What Can We Learn Privately?
Abstract: Learning problems form an important category of computational tasks that generalizes many of the computations researchers apply to large real-life data sets. We ask: what concept classes can be learned privately, namely, by an algorithm whose output does not depend too heavily on any one input or specific training example? More precisely, we investigate learning algorithms that satisfy differential privacy, a notion that provides strong confidentiality guarantees in contexts where aggregate information is released about a database containing sensitive information about individuals. We demonstrate that, ignoring computational constraints, it is possible to privately agnostically learn any concept class using a sample size approximately logarithmic in the cardinality of the concept class. Therefore, almost anything learnable is learnable privately: specifically, if a concept class is learnable by a (non-private) algorithm with polynomial sample complexity and output size, then it can be learned privately using a polynomial number of samples. We also present a computationally efficient private PAC learner for the class of parity functions. Local (or randomized response) algorithms are a practical class of private algorithms that have received extensive investigation. We provide a precise characterization of local private learning algorithms. We show that a concept class is learnable by a local algorithm if and only if it is learnable in the statistical query (SQ) model. Finally, we present a separation between the power of interactive and noninteractive local learning algorithms.
Title: New probabilistic interest measures for association rules
Abstract: Mining association rules is an important technique for discovering meaningful patterns in transaction databases. Many different measures of interestingness have been proposed for association rules. However, these measures fail to take the probabilistic properties of the mined data into account. In this paper, we start by presenting a simple probabilistic framework for transaction data which can be used to simulate transaction data when no associations are present. We use such data and a real-world database from a grocery outlet to explore the behavior of confidence and lift, two popular interest measures used for rule mining. The results show that confidence is systematically influenced by the frequency of the items in the left-hand side of rules and that lift performs poorly at filtering random noise in transaction data. Based on the probabilistic framework, we develop two new interest measures, hyper-lift and hyper-confidence, which can be used to filter or order mined association rules. The new measures show significantly better performance than lift for applications where spurious rules are problematic.
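For reference, the two baseline measures discussed above can be computed directly from transaction data as in the sketch below; it covers only confidence and lift, not the hyper-lift and hyper-confidence measures introduced in the paper, and the example data are made up.

    def confidence_and_lift(transactions, lhs, rhs):
        # Supports are estimated as relative frequencies over the transactions.
        n = len(transactions)
        supp_lhs = sum(lhs <= t for t in transactions) / n
        supp_rhs = sum(rhs <= t for t in transactions) / n
        supp_both = sum((lhs | rhs) <= t for t in transactions) / n
        confidence = supp_both / supp_lhs
        lift = supp_both / (supp_lhs * supp_rhs)
        return confidence, lift

    # Toy example: each transaction is a set of items.
    tx = [{"milk", "bread"}, {"milk"}, {"bread", "butter"}, {"milk", "bread", "butter"}]
    print(confidence_and_lift(tx, {"milk"}, {"bread"}))   # (0.666..., 0.888...)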
Title: The Future of Scientific Simulations: from Artificial Life to Artificial Cosmogenesis
Abstract: This philosophical paper explores the relation between modern scientific simulations and the future of the universe. We argue that a simulation of an entire universe will result from future scientific activity. This requires us to tackle the challenge of simulating open-ended evolution at all levels in a single simulation. The simulation should encompass not only biological evolution, but also physical evolution (a level below) and cultural evolution (a level above). The simulation would allow us to probe what would happen if we "replayed the tape of the universe" with the same or different laws and initial conditions. We also distinguish between real-world and artificial-world modelling. Assuming that intelligent life could indeed simulate an entire universe, this leads to two tentative hypotheses. Some authors have argued that we may already be in a simulation run by an intelligent entity; alternatively, if such a simulation could be made real, this would lead to the production of a new universe. This last direction is argued with a careful, speculative philosophical approach, emphasizing the imperative to find a solution to the heat death problem in cosmology. The reader is invited to consult Annex 1 for an overview of the logical structure of this paper. -- Keywords: far future, future of science, ALife, simulation, realization, cosmology, heat death, fine-tuning, physical eschatology, cosmological natural selection, cosmological artificial selection, artificial cosmogenesis, selfish biocosm hypothesis, meduso-anthropic principle, developmental singularity hypothesis, role of intelligent life.
Title: Serious Flaws in Korf et al.'s Analysis on Time Complexity of A*
Abstract: This paper has been withdrawn.
Title: Non-Singular Assembly-mode Changing Motions for 3-RPR Parallel Manipulators
Abstract: When moving from one arbitrary location to another, a parallel manipulator may change its assembly-mode without crossing a singularity. Because a non-singular change of assembly-mode cannot be detected simply, the actual assembly-mode during motion is difficult to track. This paper proposes a global explanatory approach to help better understand non-singular assembly-mode changing motions for 3-RPR planar parallel manipulators. The approach consists in fixing one of the actuated joints and analyzing the configuration-space as a surface in a 3-dimensional space. Such a global description makes it possible to display all possible non-singular assembly-mode changing trajectories.
Title: Hybrid Reasoning and the Future of Iconic Representations
Abstract: We give a brief overview of the main characteristics of diagrammatic reasoning, analyze a case of human reasoning in a mastermind game, and explain why hybrid representation systems (HRS) are particularly attractive and promising for Artificial General Intelligence and Computer Science in general.
Title: Privacy Preserving ID3 over Horizontally, Vertically and Grid Partitioned Data
Abstract: We consider privacy preserving decision tree induction via ID3 in the case where the training data is horizontally or vertically distributed. Furthermore, we consider the same problem in the case where the data is both horizontally and vertically distributed, a situation we refer to as grid partitioned data. We give an algorithm for privacy preserving ID3 over horizontally partitioned data involving more than two parties. For grid partitioned data, we discuss two different evaluation methods for privacy preserving ID3, namely, first merging horizontally and then developing vertically, or first merging vertically and then developing horizontally. Besides introducing privacy preserving data mining over grid-partitioned data, the main contribution of this paper is that we show, by means of a complexity analysis, that the former evaluation method is the more efficient.
Title: Dempster-Shafer for Anomaly Detection
Abstract: In this paper, we implement an anomaly detection system using the Dempster-Shafer method. Using two standard benchmark problems, we show that by combining multiple signals it is possible to achieve better results than by using a single signal. We further show, by applying this approach to a real-world email dataset, that the algorithm works for email worm detection. Dempster-Shafer can be a promising method for anomaly detection problems with multiple features (data sources) and two or more classes.
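The signal-combination step rests on Dempster's rule of combination; a minimal sketch for a two-class frame of discernment follows (the mass values and the set encoding are illustrative, not taken from the paper).

    from itertools import product

    def dempster_combine(m1, m2):
        # Combine two mass functions whose focal sets are frozensets.
        combined, conflict = {}, 0.0
        for (a, pa), (b, pb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + pa * pb
            else:
                conflict += pa * pb            # mass falling on the empty set
        return {s: v / (1.0 - conflict) for s, v in combined.items()}

    # Two signals, frame of discernment {normal, anomalous}.
    m1 = {frozenset({"anomalous"}): 0.6, frozenset({"normal", "anomalous"}): 0.4}
    m2 = {frozenset({"anomalous"}): 0.3, frozenset({"normal"}): 0.5,
          frozenset({"normal", "anomalous"}): 0.2}
    print(dempster_combine(m1, m2))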
Title: A class of statistical models to weaken independence in two-way contingency tables
Abstract: In this paper we study a new class of statistical models for contingency tables. We define this class of models through a subset of the binomial equations of the classical independence model. We use some notions from Algebraic Statistics to compute their sufficient statistic, and to prove that they are log-linear. Moreover, we show how to compute maximum likelihood estimates and to perform exact inference through the Diaconis-Sturmfels algorithm. Examples show that these models can be useful in a wide range of applications.
Title: Spatio-activity based object detection
Abstract: We present the SAMMI lightweight object detection method which has a high level of accuracy and robustness, and which is able to operate in an environment with a large number of cameras. Background modeling is based on DCT coefficients provided by cameras. Foreground detection uses similarity in temporal characteristics of adjacent blocks of pixels, which is a computationally inexpensive way to make use of object coherence. Scene model updating uses the approximated median method for improved performance. Evaluation at pixel level and application level shows that SAMMI object detection performs better and faster than the conventional Mixture of Gaussians method.
Title: Genetic-Algorithm Seeding Of Idiotypic Networks For Mobile-Robot Navigation
Abstract: Robot-control designers have begun to exploit the properties of the human immune system in order to produce dynamic systems that can adapt to complex, varying, real-world tasks. Jerne's idiotypic-network theory has proved the most popular artificial-immune-system (AIS) method for incorporation into behaviour-based robotics, since idiotypic selection produces highly adaptive responses. However, previous efforts have mostly focused on evolving the network connections and have often worked with a single, pre-engineered set of behaviours, limiting variability. This paper describes a method for encoding behaviours as a variable set of attributes, and shows that when the encoding is used with a genetic algorithm (GA), multiple sets of diverse behaviours can develop naturally and rapidly, providing much greater scope for flexible behaviour-selection. The algorithm is tested extensively with a simulated e-puck robot that navigates around a maze by tracking colour. Results show that highly successful behaviour sets can be generated within about 25 minutes, and that much greater diversity can be obtained when multiple autonomous populations are used, rather than a single one.
Title: Component models for large networks
Abstract: Being among the easiest ways to find meaningful structure from discrete data, Latent Dirichlet Allocation (LDA) and related component models have been applied widely. They are simple, computationally fast and scalable, interpretable, and admit nonparametric priors. In the currently popular field of network modeling, relatively little work has taken uncertainty of data seriously in the Bayesian sense, and component models have been introduced to the field only recently, by treating each node as a bag of out-going links. We introduce an alternative, interaction component model for communities (ICMc), where the whole network is a bag of links, stemming from different components. The former finds both disassortative and assortative structure, while the alternative assumes assortativity and finds community-like structures like the earlier methods motivated by physics. With Dirichlet Process priors and an efficient implementation the models are highly scalable, as demonstrated with a social network from the Last.fm web site, with 670,000 nodes and 1.89 million links.
Title: Improved evolutionary generation of XSLT stylesheets
Abstract: This paper introduces a procedure based on genetic programming to evolve XSLT programs (usually called stylesheets or logicsheets). XSLT is a general-purpose, document-oriented functional language, generally used to transform XML documents (or, in general, to solve any problem that can be coded as an XML document). The proposed solution uses a tree representation for the stylesheets as well as diverse specific operators in order to obtain, in the studied cases and in a reasonable time, an XSLT stylesheet that performs the transformation. Several types of representation have been compared, resulting in different performance and degrees of success.
Title: Danger Theory: The Link between AIS and IDS?
Abstract: We present ideas about creating a next-generation Intrusion Detection System based on the latest immunological theories. The central challenge with computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems. The Human Immune System can detect and defend against harmful and previously unseen invaders, so can we not build a similar Intrusion Detection System for our computers?
Title: An Ant-Based Model for Multiple Sequence Alignment
Abstract: Multiple sequence alignment is a key process in today's biology, and finding a relevant alignment of several sequences is much more challenging than just optimizing some improbable evaluation functions. Our approach to multiple sequence alignment focuses on the building of structures in a new graph model: the factor graph model. This model relies on a block-based formulation of the original problem, a formulation that seems to be one of the most suitable ways of capturing the evolutionary aspects of alignment. The structures are implicitly built by a colony of ants laying down pheromones in the factor graphs, according to relations between blocks belonging to the different sequences.
Title: Locally D-optimal designs based on a class of composed models resulted from blending Emax and one-compartment models
Abstract: A class of nonlinear models combining a pharmacokinetic compartmental model and a pharmacodynamic Emax model is introduced. The locally D-optimal (LD) design for a four-parameter composed model is found to be a saturated four-point uniform LD design with the two boundary points of the design space in the LD design support. For a five-parameter composed model, a sufficient condition for the LD design to require the minimum number of sampling time points is derived. Robust LD designs are also investigated for both models. It is found that an LD design with $k$ parameters is equivalent to an LD design with $k-1$ parameters if the linear parameter in the two composed models is a nuisance parameter. Assorted examples of LD designs are presented.
Title: Adaptive Ridge Selector (ARiS)
Abstract: We introduce a new shrinkage variable selection operator for linear models, which we term the Adaptive Ridge Selector (ARiS). This approach is inspired by the relevance vector machine (RVM), which uses a Bayesian hierarchical linear setup to do variable selection and model estimation. Extending the RVM algorithm, we include a proper prior distribution for the precisions of the regression coefficients, $v_j^{-1} \sim f(v_j^{-1}\mid\eta)$, where $\eta$ is a scalar hyperparameter. A novel fitting approach which utilizes the full set of posterior conditional distributions is applied to maximize the joint posterior distribution $p(\boldsymbol\beta,\sigma^2,\mathbf{v}^{-1}\mid\mathbf{y},\eta)$ given the value of the hyperparameter $\eta$. An empirical Bayes method is proposed for choosing $\eta$. This approach is contrasted with other regularized least squares estimators, including the lasso, its variants, the nonnegative garrote and ordinary ridge regression. Performance differences are explored for various simulated data examples. Results indicate superior prediction and model selection accuracy under sparse setups, and a drastic improvement in the accuracy of model choice with increasing sample size.
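One plausible reading of the hierarchy sketched above, written out in full (the Gaussian likelihood and the exact parameterization of the coefficient variances are assumptions made here, not stated in the abstract), is
$$\mathbf{y} \mid \boldsymbol\beta, \sigma^2 \sim \mathcal{N}(X\boldsymbol\beta, \sigma^2 I), \qquad \beta_j \mid v_j \sim \mathcal{N}(0, v_j), \qquad v_j^{-1} \sim f(v_j^{-1}\mid\eta), \quad j = 1, \dots, p,$$
with the fit maximizing the joint posterior $p(\boldsymbol\beta,\sigma^2,\mathbf{v}^{-1}\mid\mathbf{y},\eta)$ over the coefficients, the error variance and the precisions for a given $\eta$.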
Title: Conditioning Probabilistic Databases
Abstract: Past research on probabilistic databases has studied the problem of answering queries on a static database. Application scenarios of probabilistic databases however often involve the conditioning of a database using additional information in the form of new evidence. The conditioning problem is thus to transform a probabilistic database of priors into a posterior probabilistic database which is materialized for subsequent query processing or further refinement. It turns out that the conditioning problem is closely related to the problem of computing exact tuple confidence values. It is known that exact confidence computation is an NP-hard problem. This has led researchers to consider approximation techniques for confidence computation. However, neither conditioning nor exact confidence computation can be solved using such techniques. In this paper we present efficient techniques for both problems. We study several problem decomposition methods and heuristics that are based on the most successful search techniques from constraint satisfaction, such as the Davis-Putnam algorithm. We complement this with a thorough experimental evaluation of the algorithms proposed. Our experiments show that our exact algorithms scale well to realistic database sizes and can in some scenarios compete with the most efficient previous approximation algorithms.
Title: Tableau-based decision procedures for logics of strategic ability in multi-agent systems
Abstract: We develop incremental tableau-based decision procedures for the Alternating-time temporal logic ATL and some of its variants. While running within the theoretically established complexity upper bound, we claim that our tableau is practically more efficient in the average case than other decision procedures for ATL known so far. Besides, the ease of its adaptation to variants of ATL demonstrates the flexibility of the proposed procedure.
Title: lambda-Connectedness Determination for Image Segmentation
Abstract: Image segmentation is the separation of an image into distinct homogeneous regions belonging to different objects. It is an essential step in image analysis and computer vision. This paper compares some segmentation technologies and attempts to find an automated way to better determine the parameters for image segmentation, especially the connectivity value $\lambda$ in $\lambda$-connected segmentation. Based on the theories of the maximum entropy method and Otsu's minimum variance method, we propose: (1) maximum entropy connectedness determination, a method that uses maximum entropy to determine the best $\lambda$ value in $\lambda$-connected segmentation, and (2) minimum variance connectedness determination, a method that uses the principle of minimum variance to determine the $\lambda$ value. Applying these optimization techniques to real images, the experimental results have shown great promise for the development of the new methods. In the end, we extend the above methods to a more general case in order to compare them with the well-known Mumford-Shah method, which uses the variational principle and geometric measure.
Title: The adjusted Viterbi training for hidden Markov models
Abstract: The EM procedure is a principal tool for parameter estimation in hidden Markov models. However, applications replace EM by Viterbi extraction, or training (VT). VT is computationally less intensive, more stable, and has more of an intuitive appeal, but VT estimation is biased and does not satisfy the following fixed point property: hypothetically, given an infinitely large sample and initialized to the true parameters, VT will generally move away from the initial values. We propose adjusted Viterbi training (VA), a new method to restore the fixed point property and thus alleviate the overall imprecision of the VT estimators, while preserving the computational advantages of the baseline VT algorithm. Simulations elsewhere have shown that VA appreciably improves the precision of estimation in both the special case of mixture models and more general HMMs. However, being entirely analytic, the VA correction relies on infinite Viterbi alignments and associated limiting probability distributions. While explicit in the mixture case, the existence of these limiting measures is not obvious for more general HMMs. This paper proves that, under certain mild conditions, the required limiting distributions for general HMMs do exist.
Title: Quantile Estimation of A general Single-Index Model
Abstract: The single-index model is one of the most popular semiparametric models in Econometrics. In this paper, we define a quantile regression single-index model, which includes the single-index structure for conditional mean and for conditional variance.
Title: Heteroscedastic controlled calibration model applied to analytical chemistry
Abstract: In chemical analyses made by laboratories, one faces the problem of determining the concentration of a chemical element in a sample. To tackle this problem, the EURACHEM/CITAC guide recommends the application of the linear calibration model, thereby implicitly assuming that there is no measurement error in the independent variable $X$. In this work, a new calibration model is proposed which assumes that the independent variable is controlled. This assumption is appropriate in chemical analysis, where the process attempting to attain the fixed known value $X$ generates an error, and the resulting value $x$ is not observable; however, observations on its surrogate $X$ are available. A simulation study is carried out in order to verify some properties of the estimators derived for the new model, and the usual calibration model is also considered for comparison with the new approach. Three applications are considered to verify the performance of the new approach.
Title: KohonAnts: A Self-Organizing Ant Algorithm for Clustering and Pattern Classification
Abstract: In this paper we introduce a new ant-based method that takes advantage of the cooperative self-organization of Ant Colony Systems to create a naturally inspired clustering and pattern recognition method. The approach treats each data item as an ant, which moves inside a grid, changing the cells it goes through, in a fashion similar to Kohonen's Self-Organizing Maps. The resulting algorithm is conceptually simpler, has fewer free parameters than other ant-based clustering algorithms, and, after some parameter tuning, yields very good results on some benchmark problems.
Title: Using Spatially Varying Pixels Exposures and Bayer-covered Photosensors for High Dynamic Range Imaging
Abstract: A method for linear high dynamic range imaging using solid-state photosensors with a Bayer colour filter array is presented in this paper. Using information from neighbouring pixels, it is possible to reconstruct linear images with a wide dynamic range from oversaturated images. The Bayer colour filter array is considered as an array of neutral filters in quasimonochromatic light. If the camera's response function to the desired light source is known, then one can calculate correction coefficients to reconstruct oversaturated images. The reconstructed images are linearized in order to provide linear high dynamic range images for optical-digital imaging systems. The calibration procedure for obtaining the camera's response function to the desired light source is described. Experimental results of the reconstruction of images from oversaturated images are presented for red, green, and blue quasimonochromatic light sources. A quantitative analysis of the accuracy of the reconstructed images is provided.
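A hypothetical illustration of the correction-coefficient idea follows; the function, the clipping test and the single-ratio correction are simplifications invented here, whereas the paper's actual procedure calibrates against a known light source and operates on the Bayer mosaic itself.

    import numpy as np

    def reconstruct_clipped(channel_sat, channel_ref, ratio, full_scale=255):
        # Where `channel_sat` is clipped at full scale, estimate its linear value
        # from the unsaturated reference channel scaled by a calibrated ratio
        # (ratio = response of the saturated channel / response of the reference
        # channel under the calibration light).
        out = channel_sat.astype(float)
        clipped = channel_sat >= full_scale
        out[clipped] = ratio * channel_ref[clipped].astype(float)
        return out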
Title: Figuring out Actors in Text Streams: Using Collocations to establish Incremental Mind-maps
Abstract: The recognition, involvement, and description of main actors influence the story line of the whole text. This is all the more important because the text per se represents a flow of words and expressions that, once read, is lost. In this respect, understanding a text, and in particular how an actor behaves, is a major concern: just as human beings store a given input in short-term memory while associating diverse aspects and actors with incidents, the following approach presents a virtual architecture in which collocations are taken as the associative completion of the actors' actions. Once collocations are discovered, they are managed in separate memory blocks, broken down by actor. As with human beings, these memory blocks correspond to associative mind-maps. We then present several priority functions that represent the current temporal situation inside a mind-map, enabling the user to reconstruct recent events from the discovered temporal results.
Title: Extensions of smoothing via taut strings
Abstract: Suppose that we observe independent random pairs $(X_1,Y_1)$, $(X_2,Y_2)$, ..., $(X_n,Y_n)$. Our goal is to estimate regression functions such as the conditional mean or $\beta$-quantile of $Y$ given $X$, where $0<\beta<1$. In order to achieve this we minimize criteria such as, for instance, $$\sum_{i=1}^n \rho(f(X_i) - Y_i) + \lambda \cdot \operatorname{TV}(f)$$ among all candidate functions $f$. Here $\rho$ is some convex function depending on the particular regression function we have in mind, $\operatorname{TV}(f)$ stands for the total variation of $f$, and $\lambda >0$ is some tuning parameter. This framework is extended further to include binary or Poisson regression, and to include localized total variation penalties. The latter are needed to construct estimators adapting to inhomogeneous smoothness of $f$. For the general framework we develop noniterative algorithms for the solution of the minimization problems which are closely related to the taut string algorithm (cf. Davies and Kovac, 2001). Further we establish a connection between the present setting and monotone regression, extending previous work by Mammen and van de Geer (1997). The algorithmic considerations and numerical examples are complemented by two consistency results.
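For concreteness, standard choices of $\rho$ corresponding to the two regression functions mentioned above (the usual textbook losses, not quoted from the paper) are, with residual $t = f(X_i) - Y_i$,
$$\rho_{\mathrm{mean}}(t) = t^2, \qquad \rho_\beta(t) = t\,\bigl(\mathbf{1}\{t > 0\} - \beta\bigr),$$
so that under- and overestimation of the $\beta$-quantile are penalized with weights $\beta$ and $1-\beta$, respectively.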
Title: An Indirect Genetic Algorithm for Set Covering Problems
Abstract: This paper presents a new type of genetic algorithm for the set covering problem. It differs from previous evolutionary approaches first because it is an indirect algorithm, i.e. the actual solutions are found by an external decoder function. The genetic algorithm itself provides this decoder with permutations of the solution variables and other parameters. Second, it will be shown that results can be further improved by adding another indirect optimisation layer. The decoder will not directly seek out low-cost solutions but instead aims for good exploitable solutions. These are then post-optimised by another hill-climbing algorithm. Although seemingly more complicated, we will show that this three-stage approach has advantages in terms of solution quality, speed and adaptability to new types of problems over more direct approaches. Extensive computational results are presented and compared to the latest evolutionary and other heuristic approaches to the same data instances.
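A minimal sketch of one plausible permutation decoder of the kind described above (the greedy rule and the redundancy-removal pass are illustrative choices, not the paper's exact decoder):

    def decode(permutation, cover_sets, costs):
        # cover_sets[j] is the set of rows covered by column j; the GA supplies
        # `permutation`, an ordering of the columns, and costs[j] is column j's cost.
        uncovered = set().union(*cover_sets)
        chosen = []
        for j in permutation:
            if uncovered & cover_sets[j]:
                chosen.append(j)
                uncovered -= cover_sets[j]
            if not uncovered:
                break
        # Drop columns made redundant by later picks, most expensive first.
        for j in sorted(chosen, key=lambda c: -costs[c]):
            covered_by_others = set().union(*(cover_sets[k] for k in chosen if k != j))
            if cover_sets[j] <= covered_by_others:
                chosen.remove(j)
        return chosen, sum(costs[j] for j in chosen)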
Title: On the Application of Hierarchical Coevolutionary Genetic Algorithms: Recombination and Evaluation Partners