Abstract: November 27, 2004, marked the 250th anniversary of the death of Abraham De Moivre, best known in statistical circles for his famous large-sample approximation to the binomial distribution, whose generalization is now referred to as the Central Limit Theorem. De Moivre was one of the great pioneers of classical probability theory. He also made seminal contributions in analytic geometry, complex analysis and the theory of annuities. The first biography of De Moivre, on which almost all subsequent ones have since relied, was written in French by Matthew Maty. It was published in 1755 in the Journal britannique. The authors provide here, for the first time, a complete translation into English of Maty's biography of De Moivre. New material, much of it taken from modern sources, is given in footnotes, along with numerous annotations designed to provide additional clarity to Maty's biography for contemporary readers.
Title: A Conversation with Robert V. Hogg
Abstract: Robert Vincent Hogg was born on November 8, 1924 in Hannibal, Missouri. He earned a Ph.D. in statistics at the University of Iowa in 1950, where his advisor was Allen Craig. Following graduation, he joined the mathematics faculty at the University of Iowa. He was the founding Chair when the Department of Statistics was created at Iowa in 1965 and he served in that capacity for 19 years. At Iowa he also served as Chair of the Quality Management and Productivity Program and the Hanson Chair of Manufacturing Productivity. He became Professor Emeritus in 2001 after 51 years on the Iowa faculty. He is a Fellow of the Institute of Mathematical Statistics and the American Statistical Association, and an Elected Member of the International Statistical Institute. He was President of the American Statistical Association (1988) and chaired two of its winter conferences (1992, 1994). He received the ASA's Founder's Award (1991) and the Gottfried Noether Award (2001) for contributions to nonparametric statistics. His publications through 1996 are described in Communications in Statistics--Theory and Methods (1996), 2467--2481. This interview was conducted on April 14, 2004 at the Department of Statistics, University of Florida, Gainesville, Florida, and revised in the summer of 2006.
Title: Full Bayesian analysis for a class of jump-diffusion models
Abstract: A new Bayesian significance test is adjusted for jump detection in a diffusion process. This procedure is advantageous for temporal data with extreme-valued outliers, such as financial data and records of pluvial or tectonic forces.
Title: Estimation in hidden Markov models via efficient importance sampling
Abstract: Given a sequence of observations from a discrete-time, finite-state hidden Markov model, we would like to estimate the sampling distribution of a statistic. The bootstrap method is employed to approximate the confidence regions of a multi-dimensional parameter. We propose an importance sampling formula for efficient simulation in this context. Our approach consists of constructing a locally asymptotically normal (LAN) family of probability distributions around the default resampling rule and then minimizing the asymptotic variance within the LAN family. The solution of this minimization problem characterizes the asymptotically optimal resampling scheme, which is given by a tilting formula. The implementation of the tilting formula is facilitated by solving a Poisson equation. A few numerical examples are given to demonstrate the efficiency of the proposed importance sampling scheme.
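The tilting idea behind this abstract can be illustrated in a much simpler setting than the HMM bootstrap: estimating a Gaussian tail probability by exponentially tilting the sampling distribution and reweighting by the likelihood ratio. The sketch below is this generic mechanism only, not the paper's HMM resampling scheme; the choice theta = a is the textbook variance-reducing tilt for this toy problem.

```python
import numpy as np

def tilted_tail_prob(a, theta, n, seed=0):
    """Estimate P(X > a) for X ~ N(0,1) by exponential tilting.

    Samples are drawn from the tilted law N(theta, 1) and reweighted by
    the likelihood ratio dN(0,1)/dN(theta,1)(x) = exp(-theta*x + theta^2/2),
    which keeps the estimator unbiased while concentrating samples near
    the rare event of interest.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(theta, 1.0, n)
    w = np.exp(-theta * x + theta ** 2 / 2.0)
    return float(np.mean(w * (x > a)))
```

With theta = a = 3, a few hundred thousand tilted samples recover the true tail probability 1 - Phi(3) ≈ 0.00135 to several digits, whereas naive Monte Carlo would see the event only about once per 740 draws.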
Title: Raising a Hardness Result
Abstract: This article presents a technique for proving problems hard for classes of the polynomial hierarchy or for PSPACE. The rationale of this technique is that some problem restrictions are able to simulate existential or universal quantifiers. If this is the case, reductions from Quantified Boolean Formulae (QBF) to these restrictions can be transformed into reductions from QBFs having one more quantifier in the front. This means that a proof of hardness of a problem at level n in the polynomial hierarchy can be split into n separate proofs, which may be simpler than a proof directly showing a reduction from a class of QBFs to the considered problem.
Title: 2006: Celebrating 75 years of AI - History and Outlook: the Next 25 Years
Abstract: When Kurt Goedel laid the foundations of theoretical computer science in 1931, he also introduced essential concepts of the theory of Artificial Intelligence (AI). Although much of subsequent AI research has focused on heuristics, which still play a major role in many practical AI applications, in the new millennium AI theory has finally become a full-fledged formal science, with important optimality results for embodied agents living in unknown environments, obtained through a combination of theory a la Goedel and probability theory. Here we look back at important milestones of AI history, mention essential recent results, and speculate about what we may expect from the next 25 years, emphasizing the significance of the ongoing dramatic hardware speedups, and discussing Goedel-inspired, self-referential, self-improving universal problem solvers.
Title: Sensitivity Analysis of the Orthoglide, a 3-DOF Translational Parallel Kinematic Machine
Abstract: This paper presents a sensitivity analysis of the Orthoglide, a 3-DOF translational Parallel Kinematic Machine. Two complementary methods are developed to analyze its sensitivity to its dimensional and angular variations. First, a linkage kinematic analysis method is used to obtain a rough idea of the influence of the dimensional variations on the location of the end-effector. Moreover, this method shows that variations in the design parameters of the same type from one leg to the other have the same influence on the end-effector. However, this method does not take into account the variations in the parallelograms. Thus, a differential vector method is used to study the influence of the dimensional and angular variations in the parts of the manipulator on the position and orientation of the end-effector, and particularly the influence of the variations in the parallelograms. It turns out that the kinematic isotropic configuration of the manipulator is the least sensitive one to its dimensional and angular variations. Conversely, the configurations closest to its kinematic singular configurations are the most sensitive ones to geometrical variations.
Title: Fast estimation of multivariate stochastic volatility
Abstract: In this paper we develop a Bayesian procedure for estimating multivariate stochastic volatility (MSV) using state space models. A multiplicative model based on inverted Wishart and multivariate singular beta distributions is proposed for the evolution of the volatility, and a flexible sequential volatility updating is employed. Being computationally fast, the resulting estimation procedure is particularly suitable for on-line forecasting. Three performance measures are discussed in the context of model selection: the log-likelihood criterion, the mean of standardized one-step forecast errors, and sequential Bayes factors. Finally, the proposed methods are applied to a data set comprising eight exchange rates vis-a-vis the US dollar.
Title: A new method for the estimation of variance matrix with prescribed zeros in nonlinear mixed effects models
Abstract: We propose a new method for the Maximum Likelihood Estimator (MLE) of nonlinear mixed effects models when the variance matrix of Gaussian random effects has a prescribed pattern of zeros (PPZ). The method consists in coupling the recently developed Iterative Conditional Fitting (ICF) algorithm with the Expectation Maximization (EM) algorithm. It provides positive definite estimates for any sample size, and does not rely on any structural assumption on the PPZ. It can be easily adapted to many versions of EM.
Title: On Ultrametric Algorithmic Information
Abstract: How best to quantify the information of an object, whether natural or artifact, is a problem of wide interest. A related problem is the computability of an object. We present practical examples of a new way to address this problem. By giving an appropriate representation to our objects, based on a hierarchical coding of information, we exemplify how it is remarkably easy to compute complex objects. Our algorithmic complexity is related to the length of the class of objects, rather than to the length of the object.
Title: Non-Regular Likelihood Inference for Seasonally Persistent Processes
Abstract: The estimation of parameters in the frequency spectrum of a seasonally persistent stationary stochastic process is addressed. For seasonal persistence associated with a pole in the spectrum located away from frequency zero, a new Whittle-type likelihood is developed that explicitly acknowledges the location of the pole. This Whittle likelihood is a large sample approximation to the distribution of the periodogram over a chosen grid of frequencies, and constitutes an approximation to the time-domain likelihood of the data, via the linear transformation of an inverse discrete Fourier transform combined with a demodulation. The new likelihood is straightforward to compute, and as will be demonstrated has good, yet non-standard, properties. The asymptotic behaviour of the proposed likelihood estimators is studied; in particular, $N$-consistency of the estimator of the spectral pole location is established. Large finite sample and asymptotic distributions of the score and observed Fisher information are given, and the corresponding distributions of the maximum likelihood estimators are deduced. A study of the small sample properties of the likelihood approximation is provided, and its superior performance to previously suggested methods is shown, as well as agreement with the developed distributional approximations.
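For readers unfamiliar with Whittle-type likelihoods, the classical (pole-free) version is easy to state: it scores a candidate spectral density against the periodogram at the Fourier frequencies. The sketch below is the standard Whittle approximation, not the pole-aware variant developed in the paper.

```python
import numpy as np

def whittle_neg_loglik(x, spec_fn):
    """Classical Whittle negative log-likelihood.

    x       : observed time series (the mean is removed internally)
    spec_fn : candidate spectral density f(omega) evaluated on [0, pi]
    Returns the sum over nonzero Fourier frequencies of
        log f(omega_j) + I(omega_j) / f(omega_j),
    where I is the periodogram.
    """
    x = np.asarray(x, float)
    n = len(x)
    # periodogram at the Fourier frequencies, standard 1/(2*pi*n) scaling
    I = np.abs(np.fft.rfft(x - x.mean())) ** 2 / (2.0 * np.pi * n)
    omega = 2.0 * np.pi * np.fft.rfftfreq(n)
    f = spec_fn(omega[1:])          # drop frequency zero (the mean)
    return float(np.sum(np.log(f) + I[1:] / f))
```

For Gaussian white noise with variance sigma^2, the spectral density is the constant sigma^2/(2*pi), and minimizing this criterion over sigma^2 recovers approximately the sample variance.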
Title: Effective Generation of Subjectively Random Binary Sequences
Abstract: We present an algorithm for effectively generating binary sequences which would be rated by people as highly likely to have been generated by a random process, such as flipping a fair coin.
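The abstract does not reproduce the paper's algorithm; the sketch below only illustrates the underlying idea, namely that people tend to rate binary sequences as "random" when they alternate slightly more often than a fair coin would and contain no long runs. Both tuning constants here are illustrative guesses, not the paper's calibrated values.

```python
import random

def subjectively_random(n, p_alternate=0.6, max_run=4, seed=None):
    """Generate a binary sequence people tend to rate as 'random'.

    Alternates with probability p_alternate (slightly above the fair-coin
    0.5) and never allows a run longer than max_run symbols.
    """
    rng = random.Random(seed)
    bits = [rng.randint(0, 1)]
    run = 1
    for _ in range(n - 1):
        if run >= max_run or rng.random() < p_alternate:
            bits.append(1 - bits[-1])   # flip: start a new run
            run = 1
        else:
            bits.append(bits[-1])       # repeat: extend the current run
            run += 1
    return bits
```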
Title: Networks of Polynomial Pieces with Application to the Analysis of Point Clouds and Images
Abstract: We consider Holder smoothness classes of surfaces for which we construct piecewise polynomial approximation networks, which are graphs with polynomial pieces as nodes and edges between polynomial pieces that are in `good continuation' of each other. Though little known in the community, a similar construction was used by Kolmogorov and Tikhomirov in the proof of their celebrated entropy results for Holder classes. We show how to use such networks in the context of detecting geometric objects buried in noise to approximate the scan statistic, yielding an optimization problem akin to the Traveling Salesman Problem. In the same context, we describe an alternative approach based on computing the longest path in the network after appropriate thresholding. For the special case of curves, we also formalize the notion of `good continuation' between beamlets in any dimension, obtaining more economical piecewise linear approximation networks for curves. We include some numerical experiments illustrating the use of the beamlet network in characterizing the filamentarity content of 3D datasets, and show that even a rudimentary notion of good continuity may bring substantial improvement.
Title: Maximum likelihood estimation of a log-concave density and its distribution function: Basic properties and uniform consistency
Abstract: We study nonparametric maximum likelihood estimation of a log-concave probability density and its distribution and hazard function. Some general properties of these estimators are derived from two characterizations. It is shown that the rate of convergence with respect to supremum norm on a compact interval for the density and hazard rate estimator is at least $(\log(n)/n)^{1/3}$ and typically $(\log(n)/n)^{2/5}$, whereas the difference between the empirical and estimated distribution function vanishes with rate $o_p(n^{-1/2})$ under certain regularity assumptions.
Title: A DH-parameter based condition for 3R orthogonal manipulators to have 4 distinct inverse kinematic solutions
Abstract: Positioning 3R manipulators may have two or four inverse kinematic solutions (IKS). This paper derives a necessary and sufficient condition for 3R positioning manipulators with orthogonal joint axes to have four distinct IKS. We show that the transition between manipulators with 2 and 4 IKS is defined by the set of manipulators with a quadruple root of their inverse kinematics. The resulting condition is explicit and states that the last link length of the manipulator must be greater than a quantity that depends on three of its remaining DH-parameters. This result is of interest for the design of new manipulators.
Title: Filtering Additive Measurement Noise with Maximum Entropy in the Mean
Abstract: The purpose of this note is to show how the method of maximum entropy in the mean (MEM) may be used to improve parametric estimation when the measurements are corrupted by a large level of noise. The method is developed in the context of a concrete example: estimation of the parameter of an exponential distribution. We compare the performance of our method with the Bayesian and maximum likelihood approaches.
Title: Qualitative Belief Conditioning Rules (QBCR)
Abstract: In this paper we extend the new family of (quantitative) Belief Conditioning Rules (BCR) recently developed in the Dezert-Smarandache Theory (DSmT) to their qualitative counterpart for belief revision. Since the revision of quantitative as well as qualitative belief assignments given the occurrence of a new event (the conditioning constraint) can be done in many possible ways, we present here only what we consider the most appealing Qualitative Belief Conditioning Rules (QBCR), which allow one to revise beliefs directly with words and linguistic labels and thus avoid the introduction of ad hoc translations of qualitative beliefs into quantitative ones for solving the problem.
Title: Simple Algorithmic Principles of Discovery, Subjective Beauty, Selective Attention, Curiosity & Creativity
Abstract: I postulate that human or other intelligent agents function or should function as follows. They store all sensory observations as they come - the data is holy. At any time, given some agent's current coding capabilities, part of the data is compressible by a short and hopefully fast program / description / explanation / world model. In the agent's subjective eyes, such data is more regular and more "beautiful" than other data. It is well-known that knowledge of regularity and repeatability may improve the agent's ability to plan actions leading to external rewards. In absence of such rewards, however, known beauty is boring. Then "interestingness" becomes the first derivative of subjective beauty: as the learning agent improves its compression algorithm, formerly apparently random data parts become subjectively more regular and beautiful. Such progress in compressibility is measured and maximized by the curiosity drive: create action sequences that extend the observation history and yield previously unknown / unpredictable but quickly learnable algorithmic regularity. We discuss how all of the above can be naturally implemented on computers, through an extension of passive unsupervised learning to the case of active data selection: we reward a general reinforcement learner (with access to the adaptive compressor) for actions that improve the subjective compressibility of the growing data. An unusually large breakthrough in compressibility deserves the name "discovery". The "creativity" of artists, dancers, musicians, pure mathematicians can be viewed as a by-product of this principle. Several qualitative examples support this hypothesis.
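The "progress in compressibility" reward can be given a minimal executable sketch, with zlib standing in for the agent's adaptive compressor (the principle is the author's; the zlib proxy is our assumption):

```python
import zlib

def csize(data: bytes) -> int:
    """Compressed size in bytes; zlib stands in for the agent's
    adaptive compressor."""
    return len(zlib.compress(data, 9))

def curiosity_reward(history: bytes, obs: bytes) -> int:
    """Bytes of `obs` that the observation history 'explains away'.

    An observation repeating patterns already present in the history
    compresses far better in context than alone, so it scores high;
    incompressible noise gains little from context and scores low.
    """
    alone = csize(obs)
    in_context = csize(history + obs) - csize(history)
    return alone - in_context
```

A curious agent in this sketch would prefer actions whose resulting observations maximize this reward: data that is neither already fully predicted (reward near zero, "boring") nor pure noise.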
Title: Designing a Virtual Manikin Animation Framework Aimed at Virtual Prototyping
Abstract: In industry, numerous commercial packages provide tools to introduce and analyse human behaviour in a product's environment (for maintenance, ergonomics, ...) by means of Virtual Humans. We focus on control. Thanks to algorithms introduced in recent research papers, we believe we can provide an implementation that both widens and simplifies the animation capabilities of virtual manikins. To do so, we first express the industrial expectations for Virtual Humans without considering feasibility (so as not to bias the issue). The second part shows that no commercial application provides tools that fully meet these needs. We therefore propose a new animation framework that better answers the problem. Our contribution is the integration, driven by need, of newly available scientific techniques for animating Virtual Humans into a new control scheme that better meets industrial expectations.
Title: The Algebraic Complexity of Maximum Likelihood Estimation for Bivariate Missing Data
Abstract: We study the problem of maximum likelihood estimation for general patterns of bivariate missing data for normal and multinomial random variables, under the assumption that the data is missing at random (MAR). For normal data, the score equations have nine complex solutions, at least one of which is real and statistically significant. Our computations suggest that the number of real solutions is related to whether or not the MAR assumption is satisfied. In the multinomial case, all solutions to the score equations are real and the number of real solutions grows exponentially in the number of states of the underlying random variables, though there is always precisely one statistically significant local maximum.
Title: Counting and Locating the Solutions of Polynomial Systems of Maximum Likelihood Equations, II: The Behrens-Fisher Problem
Abstract: Let $\mu$ be a $p$-dimensional vector, and let $\Sigma_1$ and $\Sigma_2$ be $p \times p$ positive definite covariance matrices. On being given random samples of sizes $N_1$ and $N_2$ from independent multivariate normal populations $N_p(\mu,\Sigma_1)$ and $N_p(\mu,\Sigma_2)$, respectively, the Behrens-Fisher problem is to solve the likelihood equations for estimating the unknown parameters $\mu$, $\Sigma_1$, and $\Sigma_2$. We shall prove that for $N_1, N_2 > p$ there are, almost surely, exactly $2p+1$ complex solutions of the likelihood equations. For the case in which $p = 2$, we utilize Monte Carlo simulation to estimate the relative frequency with which a typical Behrens-Fisher problem has multiple real solutions; we find that multiple real solutions occur infrequently.
Title: Multi-Sensor Fusion Method using Dynamic Bayesian Network for Precise Vehicle Localization and Road Matching
Abstract: This paper presents a multi-sensor fusion strategy for a novel road-matching method designed to support real-time navigational features within advanced driving-assistance systems. Managing multiple hypotheses is a useful strategy for the road-matching problem. The multi-sensor fusion and multi-modal estimation are realized using a Dynamic Bayesian Network. Experimental results, using data from Antilock Braking System (ABS) sensors, a differential Global Positioning System (GPS) receiver and an accurate digital roadmap, illustrate the performance of this approach, especially in ambiguous situations.
Title: Using RDF to Model the Structure and Process of Systems
Abstract: Many systems can be described in terms of networks of discrete elements and their various relationships to one another. A semantic network, or multi-relational network, is a directed labeled graph consisting of a heterogeneous set of entities connected by a heterogeneous set of relationships. Semantic networks serve as a promising general-purpose modeling substrate for complex systems. Various standardized formats and tools are now available to support practical, large-scale semantic network models. First, the Resource Description Framework (RDF) offers a standardized semantic network data model that can be further formalized by ontology modeling languages such as RDF Schema (RDFS) and the Web Ontology Language (OWL). Second, the recent introduction of highly performant triple-stores (i.e. semantic network databases) allows semantic network models on the order of $10^9$ edges to be efficiently stored and manipulated. RDF and its related technologies are currently used extensively in the domains of computer science, digital library science, and the biological sciences. This article will provide an introduction to RDF/RDFS/OWL and an examination of its suitability to model discrete element complex systems.
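The RDF data model itself needs no special machinery to demonstrate: a semantic network is just a set of (subject, predicate, object) triples, and a basic query is a triple pattern with wildcards. Below is a minimal stdlib-only sketch with CURIE-style strings standing in for real URIs; an actual deployment would use an RDF library and a triple-store as the abstract describes.

```python
# A semantic network as a bare set of (subject, predicate, object)
# triples -- the RDF data model in miniature.
triples = {
    ("ex:alice", "foaf:knows", "ex:bob"),
    ("ex:alice", "rdf:type",   "foaf:Person"),
    ("ex:bob",   "rdf:type",   "foaf:Person"),
}

def match(graph, s=None, p=None, o=None):
    """Triple-pattern query: None is a wildcard, analogous to a single
    basic graph pattern in SPARQL."""
    return {(ts, tp, to) for (ts, tp, to) in graph
            if s in (None, ts) and p in (None, tp) and o in (None, to)}
```

For example, `match(triples, p="rdf:type")` returns both typing statements, while fixing subject and predicate retrieves a single edge of the heterogeneous graph.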
Title: Belief-Propagation for Weighted b-Matchings on Arbitrary Graphs and its Relation to Linear Programs with Integer Solutions
Abstract: We consider the general problem of finding the minimum weight $b$-matching on arbitrary graphs. We prove that, whenever the linear programming (LP) relaxation of the problem has no fractional solutions, then the belief propagation (BP) algorithm converges to the correct solution. We also show that when the LP relaxation has a fractional solution then the BP algorithm can be used to solve the LP relaxation. Our proof is based on the notion of graph covers and extends the analysis of Bayati, Shah and Sharma (2005) and Huang and Jebara (2007). These results are notable in the following regards: (1) It is one of a very small number of proofs showing correctness of BP without any constraint on the graph structure. (2) Variants of the proof work for both synchronous and asynchronous BP; it is the first proof of convergence and correctness of an asynchronous BP algorithm for a combinatorial optimization problem.
Title: Bayes and empirical Bayes changepoint problems
Abstract: We generalize the approach of Liu and Lawrence (1999) for multiple changepoint problems where the number of changepoints is unknown. The approach is based on a dynamic programming recursion for efficient calculation of the marginal probability of the data with the hidden parameters integrated out. For the estimation of the hyperparameters, we propose to use Monte Carlo EM when training data are available. We argue that there are some advantages to using samples from the posterior, which take into account the uncertainty of the changepoints, compared to the traditional MAP estimator, which is also more expensive to compute in this context. The samples from the posterior obtained by our algorithm are independent, avoiding the convergence issues associated with the MCMC approach. We illustrate our approach on limited simulations and a real data set.
Title: On Universal Prediction and Bayesian Confirmation
Abstract: The Bayesian framework is a well-studied and successful framework for inductive reasoning, which includes hypothesis testing and confirmation, parameter estimation, sequence prediction, classification, and regression. But standard statistical guidelines for choosing the model class and prior are not always available or fail, in particular in complex situations. Solomonoff completed the Bayesian framework by providing a rigorous, unique, formal, and universal choice for the model class and the prior. We discuss in breadth how and in which sense universal (non-i.i.d.) sequence prediction solves various (philosophical) problems of traditional Bayesian sequence prediction. We show that Solomonoff's model possesses many desirable properties: Strong total and weak instantaneous bounds, and in contrast to most classical continuous prior densities has no zero p(oste)rior problem, i.e. can confirm universal hypotheses, is reparametrization and regrouping invariant, and avoids the old-evidence and updating problem. It even performs well (actually better) in non-computable environments.
Title: Bandwidth Selection for Weighted Kernel Density Estimation
Abstract: In this paper, the authors propose to estimate the density of a targeted population with a weighted kernel density estimator (wKDE) based on a weighted sample. Bandwidth selection for wKDE is discussed. Three mean integrated squared error based bandwidth estimators are introduced and their performance is illustrated via Monte Carlo simulation. The least-squares cross-validation method and the adaptive-weight kernel density estimator are also studied. The authors also consider the boundary problem for interval-bounded data and apply the new method to a real data set subject to informative censoring.
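As a baseline for the selectors discussed above, a weighted Gaussian KDE with a weighted Silverman rule of thumb (effective sample size in place of n) can be sketched as follows. This rule of thumb is a common heuristic of ours, not one of the three MISE-based selectors the paper introduces.

```python
import numpy as np

def weighted_kde(x_grid, data, weights, bandwidth=None):
    """Weighted Gaussian kernel density estimate on x_grid.

    If no bandwidth is given, uses a weighted Silverman rule with
    effective sample size n_eff = (sum w)^2 / sum w^2.
    """
    data = np.asarray(data, float)
    w = np.asarray(weights, float)
    w = w / w.sum()                                 # normalize weights
    if bandwidth is None:
        n_eff = 1.0 / np.sum(w ** 2)
        mu = np.sum(w * data)
        sd = np.sqrt(np.sum(w * (data - mu) ** 2))  # weighted std dev
        bandwidth = 1.06 * sd * n_eff ** (-1.0 / 5.0)
    u = (np.asarray(x_grid, float)[:, None] - data[None, :]) / bandwidth
    k = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)
    return (k * w[None, :]).sum(axis=1) / bandwidth
```

With uniform weights this reduces to the ordinary KDE with Silverman's rule; non-uniform weights shift mass toward the targeted population.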
Title: Uniform Bahadur Representation for Local Polynomial Estimates of M-Regression and Its Application to The Additive Model
Abstract: We use local polynomial fitting to estimate the nonparametric M-regression function for strongly mixing stationary processes $\{(Y_i, X_i)\}$. We establish a strong uniform consistency rate for the Bahadur representation of estimators of the regression function and its derivatives. These results are fundamental for statistical inference and for applications that involve plugging such estimators into other functionals where some control over higher-order terms is required. We apply our results to the estimation of an additive M-regression model.
Title: Solving Constraint Satisfaction Problems through Belief Propagation-guided decimation
Abstract: Message passing algorithms have proved surprisingly successful in solving hard constraint satisfaction problems on sparse random graphs. In such applications, variables are fixed sequentially to satisfy the constraints. Message passing is run after each step. Its outcome provides a heuristic for making choices at the next step. This approach has been referred to as `decimation,' with reference to analogous procedures in statistical physics. The behavior of decimation procedures is poorly understood. Here we consider a simple randomized decimation algorithm based on belief propagation (BP), and analyze its behavior on random k-satisfiability formulae. In particular, we propose a tree model for its analysis and we conjecture that it provides asymptotically exact predictions in the limit of large instances. This conjecture is confirmed by numerical simulations.
Title: Efficient Tabling Mechanisms for Transaction Logic Programs
Abstract: In this paper we present efficient evaluation algorithms for the Horn Transaction Logic (a generalization of regular Horn logic programs with state updates). We present two complementary methods for optimizing the implementation of Transaction Logic. The first method is based on tabling: we modified the proof theory to table calls and answers on states (practically equivalent to dynamic programming). The call-answer table is indexed on the call and a signature of the state in which the call was made. The answer columns contain the answer unification and a signature of the state after the call was executed. State signatures are computed efficiently using a technique based on tries and counting. The second method is based on incremental evaluation and applies when the data oracle contains derived relations. The deletions and insertions (executed in the transaction oracle) change the state of the database. Using the heuristic of inertia (only a part of the state changes in response to elementary updates), most of the time it is cheaper to compute only the changes in the state than to recompute the entire state from scratch. The two methods are complementary in that the first method optimizes the evaluation when a call is repeated in the same state, and the second method optimizes the evaluation of a new state when a call-state pair is not found by the tabling mechanism (i.e., the first method). The proof theory of Transaction Logic with the application of tabling and incremental evaluation is sound and complete with respect to its model theory.
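The core of the first method, indexing a call-answer table on the call together with a signature of the state, can be sketched in a few lines. Here a frozen set of facts stands in for the paper's trie-and-counting signatures, and `evaluate` stands in for the actual proof procedure; both are our simplifications.

```python
def make_tabled(evaluate, state_signature=frozenset):
    """Memoize evaluate(call, state) on the key (call, signature(state)).

    A repeated call in an equal state hits the table instead of being
    re-evaluated, mirroring the call-answer table described above.
    """
    table = {}
    def tabled(call, state):
        key = (call, state_signature(state))
        if key not in table:
            table[key] = evaluate(call, state)
        return table[key]
    tabled.table = table                  # exposed for inspection
    return tabled
```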
Title: Enrichment of Qualitative Beliefs for Reasoning under Uncertainty
Abstract: This paper deals with enriched qualitative belief functions for reasoning under uncertainty and for combining information expressed in natural language through linguistic labels. In this work, two possible enrichments (quantitative and/or qualitative) of linguistic labels are considered and operators (addition, multiplication, division, etc.) for dealing with them are proposed and explained. We denote them $qe$-operators, $qe$ standing for "qualitative-enriched" operators. These operators can be seen as a direct extension of the classical qualitative operators ($q$-operators) proposed recently in the Dezert-Smarandache Theory of plausible and paradoxist reasoning (DSmT). $q$-operators are also justified in detail in this paper. The quantitative enrichment of a linguistic label is a numerical supporting degree in $[0,\infty)$, while the qualitative enrichment takes its values in a finite ordered set of linguistic values. Quantitative enrichment is less precise than qualitative enrichment, but it is expected to be closer to what human experts can easily provide when expressing linguistic labels with supporting degrees. Two simple examples are given to show how the fusion of qualitative-enriched belief assignments can be done.
Title: Parallel marginalization Monte Carlo with applications to conditional path sampling
Abstract: Monte Carlo sampling methods often suffer from long correlation times. Consequently, these methods must be run for many steps to generate an independent sample. In this paper a method is proposed to overcome this difficulty. The method utilizes information from rapidly equilibrating coarse Markov chains that sample marginal distributions of the full system. This is accomplished through exchanges between the full chain and the auxiliary coarse chains. Results of numerical tests on the bridge sampling and filtering/smoothing problems for a stochastic differential equation are presented.
Title: Experiments with small helicopter automated landings at unusual attitudes
Abstract: This paper describes a set of experiments involving automated landings of small helicopters at unusual attitudes. By leveraging the increased agility of small air vehicles, we show that it is possible to automatically land a small helicopter on surfaces pitched at angles up to 60 degrees. Such maneuvers require considerable agility from the vehicle and its avionics system, and they pose significant technical and safety challenges. Our work builds upon previous activities in human-inspired, high-agility flight for small rotorcraft. However, it was not possible to leverage manual flight test data to extract landing maneuvers due to stringent attitude and position control requirements. Availability of low-cost, local navigation systems requiring no on-board instrumentation has proven particularly important for these experiments to be successful.
Title: Variational local structure estimation for image super-resolution
Abstract: Super-resolution is an important but difficult problem in image/video processing. If a video sequence or some training set other than the given low-resolution image is available, this kind of extra information can greatly aid in the reconstruction of the high-resolution image. The problem is substantially more difficult with only a single low-resolution image on hand. Image reconstruction methods designed primarily for denoising are insufficient for the super-resolution problem in the sense that they tend to oversmooth images with essentially no noise. We propose a new adaptive linear interpolation method based on a variational approach and inspired by locally linear embedding (LLE). The experimental results show that our method avoids the problem of oversmoothing and preserves image structures well.