Abstract: Constrained Optimum Path (COP) problems appear in many real-life applications, especially on communication networks. Some of these problems have been considered and solved by specific techniques, which are usually difficult to extend. In this paper, we introduce a novel modeling for solving some COPs by local search. The modeling features compositionality, modularity, and reuse, and strengthens the benefits of Constraint-Based Local Search. We also apply the modeling to the edge-disjoint paths problem (EDP). We show that side constraints can easily be added to the model. Computational results show the significance of the approach.
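As a concrete point of reference (not the paper's CBLS model; the graph, the greedy rule, and names like `greedy_edp` are illustrative assumptions), a minimal greedy baseline for the EDP routes each commodity on a shortest path and then deletes the edges it used, so accepted paths are mutually edge-disjoint:

```python
from collections import deque

def bfs_path(adj, s, t):
    """Shortest s-t path by BFS over an adjacency dict, or None if unreachable."""
    prev = {s: None}
    q = deque([s])
    while q:
        u = q.popleft()
        if u == t:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    return None

def greedy_edp(edges, pairs):
    """Route each (s, t) pair on a shortest path, then delete its edges,
    so every accepted path is edge-disjoint from the earlier ones."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    routed = []
    for s, t in pairs:
        path = bfs_path(adj, s, t) if s in adj and t in adj else None
        if path is None:
            continue  # reject this pair; a local search would try to reroute
        for a, b in zip(path, path[1:]):
            adj[a].discard(b)
            adj[b].discard(a)
        routed.append((s, t, path))
    return routed
```

On larger instances, a local search such as the one modeled in the paper would then try to reroute the rejected pairs instead of dropping them.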
|
Title: Dynamic Demand-Capacity Balancing for Air Traffic Management Using Constraint-Based Local Search: First Results
|
Abstract: Using constraint-based local search, we effectively model and efficiently solve the problem of balancing the traffic demands on portions of the European airspace while ensuring that their capacity constraints are satisfied. The traffic demand of a portion of airspace is the hourly number of flights planned to enter it, and its capacity is the upper bound on this number under which air-traffic controllers can work. Currently, the only form of demand-capacity balancing we allow is ground holding, that is the changing of the take-off times of not yet airborne flights. Experiments with projected European flight plans of the year 2030 show that already this first form of demand-capacity balancing is feasible without incurring too much total delay and that it can lead to a significantly better demand-capacity balance.
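A toy sketch of the ground-holding mechanism (purely illustrative; it does not reproduce the paper's constraint-based local search): flights whose entry hour is already at capacity are delayed to the next hour with room, and the total delay is tallied.

```python
def ground_holding(entry_times, capacity):
    """Delay flights (in whole hours, by postponing take-off) so that no
    hour receives more than `capacity` entries; returns (delays, total)."""
    load = {}
    delays = []
    for t in sorted(entry_times):
        h = t
        while load.get(h, 0) >= capacity:
            h += 1  # ground-hold: push the flight to the next hour with room
        load[h] = load.get(h, 0) + 1
        delays.append(h - t)
    return delays, sum(delays)
```

Here each entry time is an integer hour; the real problem additionally couples many airspace portions per flight, which is what makes a constraint-based approach attractive.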
|
Title: On Improving Local Search for Unsatisfiability
|
Abstract: Stochastic local search (SLS) has been an active field of research in the last few years, with new techniques and procedures being developed at an astonishing rate. SLS has been traditionally associated with satisfiability solving, that is, finding a solution for a given problem instance, as its intrinsic nature does not address unsatisfiable problems. Unsatisfiable instances were therefore commonly solved using backtrack search solvers. For this reason, in the late 90s Selman, Kautz and McAllester proposed a challenge to use local search instead to prove unsatisfiability. More recently, two SLS solvers, Ranger and Gunsat, have been developed that are able to prove unsatisfiability despite being SLS solvers. In this paper, we first compare Ranger with Gunsat and then propose to improve Ranger's performance using some of Gunsat's techniques, namely unit-propagation look-ahead and extended resolution.
|
Title: Integrating Conflict Driven Clause Learning to Local Search
|
Abstract: This article introduces SatHyS (SAT HYbrid Solver), a novel hybrid approach for propositional satisfiability. It combines local search with the conflict-driven clause learning (CDCL) scheme. Each time the local search part reaches a local minimum, the CDCL component is launched. For SAT problems it behaves like a tabu list, whereas for UNSAT ones the CDCL part tries to focus on a minimum unsatisfiable sub-formula (MUS). Experimental results show good performance on many classes of SAT instances from the last SAT competitions.
|
Title: A Constraint-directed Local Search Approach to Nurse Rostering Problems
|
Abstract: In this paper, we investigate the hybridization of constraint programming and local search techniques within a large neighbourhood search scheme for solving highly constrained nurse rostering problems. As identified by the research, a crucial part of the large neighbourhood search is the selection of the fragment (neighbourhood, i.e. the set of variables) to be relaxed and re-optimized iteratively. The success of the large neighbourhood search depends on the adequacy of this identified neighbourhood with regard to the problematic part of the solution assignment and on the choice of the neighbourhood size. We investigate three strategies to choose fragments of different sizes within the large neighbourhood search scheme. The first two strategies are tailored to the problem properties. The third strategy is more general, using the cost information from the soft-constraint violations and their propagation as the indicator for choosing the variables added to the fragment. The three strategies are analyzed and compared on a benchmark nurse rostering problem. Promising results demonstrate the possibility of future work on the hybrid approach.
|
Title: Sonet Network Design Problems
|
Abstract: This paper presents a new method and a constraint-based objective function to solve two problems related to the design of optical telecommunication networks, namely the Synchronous Optical Network Ring Assignment Problem (SRAP) and the Intra-ring Synchronous Optical Network Design Problem (IDP). These network topology problems can be represented as a graph partitioning with capacity constraints as shown in previous works. We present here a new objective function and a new local search algorithm to solve these problems. Experiments conducted in Comet allow us to compare our method to previous ones and show that we obtain better results.
|
Title: Parallel local search for solving Constraint Problems on the Cell Broadband Engine (Preliminary Results)
|
Abstract: We explore the use of the Cell Broadband Engine (Cell/BE for short) for combinatorial optimization applications: we present a parallel version of a constraint-based local search algorithm that has been implemented on a multiprocessor BladeCenter machine with twin Cell/BE processors (total of 16 SPUs per blade). This algorithm was chosen because it fits the Cell/BE architecture very well and requires neither shared memory nor communication between processors, while retaining a compact memory footprint. We study the performance on several large optimization benchmarks and show that the approach achieves mostly linear speedups, sometimes even super-linear ones. This is possible because the parallel implementation may simultaneously explore different parts of the search space and therefore converge faster towards the best sub-space and thus towards a solution. Besides yielding speedups, the resulting times exhibit a much smaller variance, which benefits applications where a timely reply is critical.
|
Title: Toward an Automaton Constraint for Local Search
|
Abstract: We explore the idea of using finite automata to implement new constraints for local search (this is already a successful technique in constraint-based global search). We show how it is possible to maintain incrementally the violations of a constraint and its decision variables from an automaton that describes a ground checker for that constraint. We establish the practicality of our approach on real-life personnel rostering problems, and show that it is competitive with the approach of [Pralong, 2007].
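One natural violation measure derivable from a ground checker is the minimum number of variables whose values must change for the automaton to accept, computable by dynamic programming over (position, state) pairs. This sketch is an assumption about the construction, not the authors' exact incremental algorithm:

```python
def automaton_violation(delta, start, accepting, word, alphabet):
    """Minimum number of positions of `word` that must be changed so the
    DFA (transition dict `delta`) accepts -- a natural violation measure
    for an automaton constraint in local search."""
    INF = float("inf")
    cost = {start: 0}  # cheapest way to reach each state so far
    for symbol in word:
        nxt = {}
        for state, c in cost.items():
            for a in alphabet:
                s2 = delta.get((state, a))
                if s2 is None:
                    continue
                c2 = c + (0 if a == symbol else 1)  # pay 1 to change this position
                if c2 < nxt.get(s2, INF):
                    nxt[s2] = c2
        cost = nxt
    return min((c for s, c in cost.items() if s in accepting), default=INF)
```

The paper's contribution is maintaining such violations incrementally under single-variable moves, rather than recomputing this table from scratch.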
|
Title: Adaboost with "Keypoint Presence Features" for Real-Time Vehicle Visual Detection
|
Abstract: We present promising results for real-time vehicle visual detection, obtained with AdaBoost using new original "keypoint presence features". These weak classifiers produce a boolean response based on the presence or absence in the tested image of a "keypoint" (a SURF interest point) with a descriptor sufficiently similar (i.e. within a given distance) to a reference descriptor characterizing the feature. A first experiment was conducted on a public image dataset containing lateral-viewed cars, yielding 95% recall with 95% precision on the test set. Moreover, analysis of the positions of the AdaBoost-selected keypoints shows that they correspond to a specific part of the object category (such as "wheel" or "side skirt") and thus have a "semantic" meaning.
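A minimal sketch of such a boolean weak classifier (the name `presence_feature`, the toy 2-D descriptors, and the threshold are illustrative; real SURF descriptors are 64- or 128-dimensional):

```python
import math

def presence_feature(image_descriptors, reference, tau):
    """Boolean weak classifier: fires iff some detected keypoint descriptor
    lies within Euclidean distance tau of the reference descriptor."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return any(dist(d, reference) <= tau for d in image_descriptors)
```

AdaBoost then selects the (reference, tau) pairs whose responses best separate positives from negatives.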
|
Title: Introducing New AdaBoost Features for Real-Time Vehicle Detection
|
Abstract: This paper shows how to improve real-time object detection in complex robotics applications by exploring new visual features as AdaBoost weak classifiers. These new features are symmetric Haar filters (enforcing global horizontal and vertical symmetry) and N-connexity control points. Experimental evaluation on a car database shows that the latter provide the best results for the vehicle-detection problem.
|
Title: Visual object categorization with new keypoint-based adaBoost features
|
Abstract: We present promising results for visual object categorization, obtained with AdaBoost using new original "keypoint-based features". These weak classifiers produce a boolean response based on the presence or absence in the tested image of a "keypoint" (a kind of SURF interest point) with a descriptor sufficiently similar (i.e. within a given distance) to a reference descriptor characterizing the feature. A first experiment was conducted on a public image dataset containing lateral-viewed cars, yielding 95% recall with 95% precision on the test set. Preliminary tests on a small subset of a pedestrian database also give a promising 97% recall with 92% precision, which shows the generality of our new family of features. Moreover, analysis of the positions of the AdaBoost-selected keypoints shows that they correspond to a specific part of the object category (such as "wheel" or "side skirt" in the case of lateral cars) and thus have a "semantic" meaning. We also made a first test on video for detecting vehicles from AdaBoost-selected keypoints filtered in real time from all detected keypoints.
|
Title: Modular Traffic Sign Recognition applied to on-vehicle real-time visual detection of American and European speed limit signs
|
Abstract: We present a new modular traffic signs recognition system, successfully applied to both American and European speed limit signs. Our sign detection step is based only on shape-detection (rectangles or circles). This enables it to work on grayscale images, contrary to most European competitors, which eases robustness to illumination conditions (notably night operation). Speed sign candidates are classified (or rejected) by segmenting potential digits inside them (which is rather original and has several advantages), and then applying a neural digit recognition. The global detection rate is 90% for both (standard) U.S. and E.U. speed signs, with a misclassification rate <1%, and no validated false alarm in >150 minutes of video. The system processes in real-time 20 frames/s on a standard high-end laptop.
|
Title: Scatter and regularity imply Benford's law... and more
|
Abstract: A random variable (r.v.) X is said to follow Benford's law if log(X) is uniform mod 1. Many experimental data sets prove to follow an approximate version of it, and so do many mathematical series and continuous random variables. This phenomenon has received some interest, and several explanations have been put forward. Most of them focus on specific data, depend on strong assumptions, and are often linked with the log function. Some authors hinted, implicitly, that the two most important characteristics of a random variable when it comes to Benford are regularity and scatter. In a first part, we prove two theorems, making up a formal version of this intuition: scattered and regular r.v.'s do approximately follow Benford's law. The proofs only need simple mathematical tools, making the analysis easy. Previous explanations thus become corollaries of a more general and simpler one. These results suggest that Benford's law does not depend on properties linked with the log function. We thus propose and test a general version of Benford's law. The success of these tests may be viewed as an a posteriori validation of the analysis formulated in the first part.
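The claim that scattered, regular random variables approximately follow Benford's law is easy to check numerically; here a lognormal with large sigma (a hypothetical stand-in for a "scattered and regular" r.v., so that log10(X) is spread over many integers) is compared to the first-digit law P(d) = log10(1 + 1/d):

```python
import math, random

random.seed(0)

def first_digit(x):
    """Leading significant digit of a positive real."""
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

# A scattered, regular r.v.: lognormal with large sigma.
sample = [random.lognormvariate(0.0, 5.0) for _ in range(100000)]
freq = [0] * 10
for x in sample:
    freq[first_digit(x)] += 1
empirical = [freq[d] / len(sample) for d in range(1, 10)]
benford = [math.log10(1 + 1 / d) for d in range(1, 10)]
# Total-variation distance between the empirical and Benford distributions.
tv = 0.5 * sum(abs(e - b) for e, b in zip(empirical, benford))
```

With sigma this large the distance is dominated by sampling noise, exactly as the theorems in the paper predict.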
|
Title: Proceedings 6th International Workshop on Local Search Techniques in Constraint Satisfaction
|
Abstract: LSCS is a satellite workshop of the international conference on principles and practice of Constraint Programming (CP), since 2004. It is devoted to local search techniques in constraint satisfaction, and focuses on all aspects of local search techniques, including: design and implementation of new algorithms, hybrid stochastic-systematic search, reactive search optimization, adaptive search, modeling for local-search, global constraints, flexibility and robustness, learning methods, and specific applications.
|
Title: Tracking object's type changes with fuzzy based fusion rule
|
Abstract: In this paper the behavior of three combination rules for temporal/sequential attribute data fusion for target type estimation is analyzed. The comparative analysis is based on: Dempster's fusion rule, proposed in Dempster-Shafer Theory; Proportional Conflict Redistribution rule no. 5 (PCR5), proposed in Dezert-Smarandache Theory; and one alternative class fusion rule, connecting the combination rules for information fusion with particular fuzzy operators, focusing on the t-norm based Conjunctive rule as an analog of the ordinary conjunctive rule and the t-conorm based Disjunctive rule as an analog of the ordinary disjunctive rule. The way different t-norm and t-conorm functions within the TCN fusion rule influence target type estimation performance is studied and evaluated.
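A minimal sketch of t-norm based conjunctive fusion (an illustrative reduction; the TCN rule in the paper operates on belief assignments with conflict redistribution, not on bare normalized vectors):

```python
def tcn_fuse(m1, m2, tnorm):
    """Conjunctive-style fusion of two mass/membership vectors over the same
    frame: combine element-wise with a t-norm, then renormalize."""
    raw = [tnorm(a, b) for a, b in zip(m1, m2)]
    s = sum(raw)
    return [r / s for r in raw] if s > 0 else raw

product = lambda a, b: a * b   # probabilistic t-norm
minimum = min                  # Zadeh (min) t-norm
```

Swapping `tnorm` between `product` and `minimum` already changes how sharply conflicting evidence is suppressed, which is the kind of sensitivity the paper studies.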
|
Title: On resolving the Savage-Dickey paradox
|
Abstract: The Savage-Dickey ratio is known as a specialised representation of the Bayes factor (O'Hagan and Forster, 2004) that allows for a functional plugging approximation of this quantity. We demonstrate here that the Savage-Dickey representation is in fact a generic representation of the Bayes factor that relies on specific measure-theoretic versions of the densities involved in the ratio, instead of a special identity imposing the above constraints on the prior distributions. We completely clarify the measure-theoretic foundations of the representation as well as the generalisation of Verdinelli and Wasserman (1995) and propose a comparison of this new approximation with their version, as well as with bridge sampling and Chib's approaches.
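In the conjugate normal case the Savage-Dickey identity can be verified in closed form: the Bayes factor computed from marginal likelihoods equals the posterior-to-prior density ratio at the tested value. A sketch under the textbook setup (known variance, zero-mean normal prior; this is far from the measure-theoretic generality of the paper):

```python
import math

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def bf01_marginal(xbar, n, sigma2, tau2, theta0):
    """Bayes factor of M0 (theta = theta0) vs M1 (theta ~ N(0, tau2)),
    via the marginal likelihoods of the sufficient statistic xbar."""
    return (normal_pdf(xbar, theta0, sigma2 / n)
            / normal_pdf(xbar, 0.0, sigma2 / n + tau2))

def bf01_savage_dickey(xbar, n, sigma2, tau2, theta0):
    """Same Bayes factor via the Savage-Dickey ratio: posterior density
    over prior density of theta at theta0, both under M1."""
    post_var = 1.0 / (n / sigma2 + 1.0 / tau2)
    post_mean = post_var * (n * xbar / sigma2)
    return normal_pdf(theta0, post_mean, post_var) / normal_pdf(theta0, 0.0, tau2)
```

The two routes agree to machine precision here because the conjugate identity prior x likelihood = marginal x posterior holds exactly.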
|
Title: Moment analysis of the Delaunay tessellation field estimator
|
Abstract: The Campbell--Mecke theorem is used to derive explicit expressions for the mean and variance of Schaap and Van de Weygaert's Delaunay tessellation field estimator. Special attention is paid to Poisson processes.
|
Title: Ludics and its Applications to natural Language Semantics
|
Abstract: Proofs, in Ludics, have an interpretation provided by their counter-proofs, that is, the objects they interact with. We follow the same idea by proposing that sentence meanings are given by the counter-meanings they are opposed to in a dialectical interaction. The conception is at the intersection of proof-theoretic and game-theoretic accounts of semantics, but it enlarges them by allowing us to deal with possibly infinite processes.
|
Title: Local and global approaches of affinity propagation clustering for large scale data
|
Abstract: Recently a new clustering algorithm called 'affinity propagation' (AP) has been proposed, which efficiently clusters sparsely related data by passing messages between data points. However, we want to cluster large scale data where the similarities are not sparse in many cases. This paper presents two variants of AP for grouping large scale data with a dense similarity matrix. The local approach is partition affinity propagation (PAP) and the global method is landmark affinity propagation (LAP). PAP passes messages within subsets of the data first and then merges the results, which effectively reduces the number of clustering iterations. LAP passes messages between the landmark data points first and then clusters the non-landmark data points; it is a global approximation method to speed up clustering. Experiments are conducted on many datasets, such as random data points, manifold subspaces, images of faces and Chinese calligraphy, and the results demonstrate that the two approaches are feasible and practicable.
|
Title: Decomposition of forging die for high speed machining
|
Abstract: Today's forging die manufacturing process must be adapted to several evolutions in machining process generation: CAD/CAM models, CAM software solutions and High Speed Machining (HSM). In this context, the adequacy between die shape and the HSM process is at the core of machining preparation and process planning approaches. This paper deals with an original approach to machining preparation integrating this adequacy into the main tasks carried out. In this approach, the design of the machining process is based on two levels of decomposition of the geometrical model of a given die with respect to HSM cutting conditions (cutting speed and feed rate) and technological constraints (tool selection, feature accessibility). This decomposition assists the machining planner in generating an HSM process. The result of this decomposition is the identification of machining features.
|
Title: Circular tests for HSM machine tools: Bore machining application
|
Abstract: Today's High-Speed Machining (HSM) machine tool combines productivity and part quality. The difficulty inherent in HSM operations lies in understanding the impact of machine tool behaviour on machining time and part quality. Analysis of some of the relevant ISO standards (230-1998, 10791-1998) and a complementary protocol for better understanding HSM technology are presented in the first part of this paper. These ISO standards are devoted to the procedures implemented in order to study the behaviour of machine tools. As these procedures do not integrate HSM technology, the need for HSM machine tool tests becomes critical to improving the trade-off between machining time and part quality. A new protocol for analysing the impact of HSM technology during circular interpolation is presented in the second part of the paper. This protocol, which allows evaluating kinematic machine tool behaviour during circular interpolation, was designed from tests without machining. These tests are discussed and their results analysed in the paper. During circular interpolation, axis capacities (such as acceleration or jerk) related to certain setting parameters of the numerical control unit have a significant impact on the value of the feed rate. Consequently, a kinematic model for a circular-interpolated trajectory was developed on the basis of these parameters. Moreover, the link between part accuracy and kinematic machine tool behaviour was established. The kinematic model was ultimately validated on a bore machining simulation.
|
Title: Machining strategy choice: performance VIEWER
|
Abstract: Nowadays, high speed machining (HSM) machine tools combine productivity and part quality, so mould and die makers have invested in HSM. Die and mould features have increasingly complex shapes. Thus, it is difficult to choose the best machining strategy according to part shape. Geometrical analysis of machining features is not sufficient to make an optimal choice. Some research shows that security, technical, functional and economic constraints must be taken into account to elaborate a machining strategy. During complex shape machining, production system limits induce feed rate decreases, and thus loss of productivity, in some part areas. In this paper we propose to analyse these areas by estimating tool path quality. First we perform experiments on an HSM machine tool to determine the trajectory's impact on machine tool behaviour. Then we extract critical criteria and establish models of performance loss. Our work is focused on machine tool kinematical performance and numerical control unit calculation capacity. We implement these models in the Esprit CAM software. During machining trajectory creation, critical part areas can be visualised and analysed. Parameters such as segment or arc lengths and the nature of the discontinuities encountered are used to analyse critical part areas. According to this visualisation, the process development engineer can validate or modify the trajectory.
|
Title: Decomposition of forging dies for machining planning
|
Abstract: This paper provides a method to decompose forging dies for machining planning in the case of high speed machining finishing operations. The method relies on a machining feature approach presented in the paper. The two main decomposition phases, called Basic Machining Feature Extraction and Process Planning Generation, are presented. These two decomposition phases integrate machining resource models and expert machining knowledge to provide an outstanding process plan.
|
Title: Definition of a Test Part for the Characterization of an HSM Machine Tool
|
Abstract: In several fields like aeronautics, dies and automotive, parts are increasingly machined on high speed machine tools. Today, the market offer for these machine tools is very wide. This situation poses the problem of a judicious and objective choice meeting industrial needs, which must be well expressed. The choice remains difficult insofar as the technical data provided to customers by machine tool manufacturers are insufficient, both quantitatively and qualitatively. In this paper we present a protocol for the characterization of machine tools in order to guide this choice. The protocol is based on the one hand on no-load tests complementary to those recommended by the ISO 230 and ISO 10791 standards, and on the other hand on loaded tests on a test part. In the first part, we present the industrial needs as well as an analysis of the technical data of machine tools. The second part is devoted to the study of the standards, the description of the protocol and the presentation of the results.
|
Title: Scaling Analysis of Affinity Propagation
|
Abstract: We analyze and exploit some scaling properties of the Affinity Propagation (AP) clustering algorithm proposed by Frey and Dueck (2007). First we observe that a divide and conquer strategy, used on a large data set, hierarchically reduces the complexity $\mathcal{O}(N^2)$ to $\mathcal{O}(N^{(h+2)/(h+1)})$, for a data-set of size $N$ and a depth $h$ of the hierarchical strategy. For a data-set embedded in a $d$-dimensional space, we show that this is obtained without notably damaging the precision except in dimension $d=2$. In fact, for $d$ larger than 2 the relative loss in precision scales like $N^{(2-d)/((h+1)d)}$. Finally, under some conditions we observe that there is a value $s^*$ of the penalty coefficient, a free parameter used to fix the number of clusters, which separates a fragmentation phase (for $s<s^*$) from a coalescent one (for $s>s^*$) of the underlying hidden cluster structure. At this precise point holds a self-similarity property which can be exploited by the hierarchical strategy to actually locate its position. From this observation, a strategy based on AP can be defined to find out how many clusters are present in a given dataset.
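The hierarchical cost exponent is simple arithmetic: with hierarchy depth h the claimed complexity is O(N^((h+2)/(h+1))), which interpolates from the quadratic baseline at h = 0 toward near-linear cost as h grows:

```python
def hierarchical_exponent(h):
    """Exponent e of the O(N^e) cost of running AP with a divide-and-conquer
    hierarchy of depth h (h = 0 recovers the quadratic baseline)."""
    return (h + 2) / (h + 1)
```

For example, a single level of hierarchy (h = 1) already cuts O(N^2) down to O(N^1.5).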
|
Title: Importance Sampling for rare events and conditioned random walks
|
Abstract: This paper introduces a new Importance Sampling scheme, called Adaptive Twisted Importance Sampling, which is adequate for the improved estimation of rare event probabilities in the range of moderate deviations pertaining to the empirical mean of real i.i.d. summands. It is based on a sharp approximation of the density of long runs extracted from a random walk conditioned on its end value.
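The exponential-tilting idea behind twisted importance sampling can be illustrated on the simplest case, a Gaussian tail probability (a hypothetical toy, not the paper's conditioned-random-walk scheme): sample from the mean-shifted density and reweight by the likelihood ratio.

```python
import math, random

def twisted_is_tail(a, n, seed=0):
    """Estimate P(X > a) for X ~ N(0,1) by exponential tilting: sample from
    N(a, 1) and reweight by the likelihood ratio exp(-a*y + a*a/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(a, 1.0)
        if y > a:
            total += math.exp(-a * y + a * a / 2.0)
    return total / n
```

Because most tilted samples land near the rare region, the estimator's variance is orders of magnitude smaller than naive Monte Carlo at the same sample size.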
|
Title: 3D/2D Registration of Mapping Catheter Images for Arrhythmia Interventional Assistance
|
Abstract: Radiofrequency (RF) catheter ablation has transformed treatment for tachyarrhythmias and has become first-line therapy for some tachycardias. The precise localization of the arrhythmogenic site and the positioning of the RF catheter over that site are problematic: they can impair the efficiency of the procedure and are time consuming (several hours). Electroanatomic mapping technologies are available that enable the display of the cardiac chambers and the relative position of ablation lesions. However, these are expensive and use custom-made catheters. The proposed methodology makes use of standard catheters and inexpensive technology in order to create a 3D volume of the heart chamber affected by the arrhythmia. Further, we propose a novel method that uses a priori 3D information of the mapping catheter in order to estimate the 3D locations of multiple electrodes across single view C-arm images. The monoplane algorithm is tested for feasibility on computer simulations and initial canine data.
|
Title: Color Image Clustering using Block Truncation Algorithm
|
Abstract: With the advancement of image-capture devices, image data is being generated at high volume. If images are analyzed properly, they can reveal useful information to human users. Content-based image retrieval addresses the problem of retrieving images relevant to the user's needs from image databases on the basis of low-level visual features that can be derived from the images. Grouping images into meaningful categories to reveal useful information is a challenging and important problem. Clustering is a data mining technique for grouping a set of unsupervised data based on the conceptual clustering principle: maximizing the intraclass similarity and minimizing the interclass similarity. The proposed framework focuses on color as the feature. Color Moment and Block Truncation Coding (BTC) are used to extract features from the image dataset. An experimental study using the K-Means clustering algorithm is conducted to group the image dataset into various clusters.
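A minimal sketch of the BTC step on a single block (grayscale and illustrative; a color pipeline would apply it per channel): threshold at the block mean and keep the two level means plus the bitmap as features.

```python
def btc_features(block):
    """Block Truncation Coding of one image block: threshold at the block
    mean, then keep the means of the pixels above and below it."""
    pixels = [p for row in block for p in row]
    mean = sum(pixels) / len(pixels)
    hi = [p for p in pixels if p >= mean]
    lo = [p for p in pixels if p < mean]
    upper = sum(hi) / len(hi) if hi else mean
    lower = sum(lo) / len(lo) if lo else mean
    bitmap = [[1 if p >= mean else 0 for p in row] for row in block]
    return upper, lower, bitmap
```

The (upper, lower) pairs over all blocks then form a compact feature vector that K-Means can cluster.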
|
Title: Distributed Object Medical Imaging Model
|
Abstract: Digital medical informatics and images are commonly used in hospitals today. Because of the interrelatedness of the radiology department and other departments, especially the intensive care unit and emergency department, the transmission and sharing of medical images has become a critical issue. Our research group has developed a Java-based Distributed Object Medical Imaging Model (DOMIM) to facilitate the rapid development and deployment of medical imaging applications in a distributed environment that can be shared and used by related departments and mobile physicians. DOMIM is a unique suite of multimedia telemedicine applications developed for use by medical organizations. The applications support real-time exchange of patient data, image files, and audio and video diagnosis annotations. DOMIM enables joint collaboration between radiologists and physicians while they are at distant geographical locations. The DOMIM environment consists of heterogeneous, autonomous, and legacy resources. The Common Object Request Broker Architecture (CORBA), Java Database Connectivity (JDBC), and the Java language provide the capability to combine the DOMIM resources into an integrated, interoperable, and scalable system. The underlying technology, including IDL, ORB, the Event Service, IIOP, JDBC/ODBC, legacy system wrapping and the Java implementation, is explored. This paper explores a distributed collaborative CORBA/JDBC-based framework that will enhance medical information management requirements and development. It encompasses a new paradigm for the delivery of health services that requires process reengineering, cultural changes, as well as organizational changes.
|
Title: Evaluation of Hindi to Punjabi Machine Translation System
|
Abstract: Machine translation in India is relatively young; the earliest efforts date from the late 80s and early 90s. The success of every system is judged from its experimental evaluation results. A number of machine translation systems have been started for development, but to the best of the authors' knowledge, no high-quality system has been completed which can be used in real applications. Recently, Punjabi University, Patiala, India has developed a Punjabi to Hindi machine translation system with a high accuracy of about 92%. Both systems, i.e. the system under question and the developed system, are between the same closely related languages. Thus, this paper presents the evaluation results of a Hindi to Punjabi machine translation system. It makes sense to use the same evaluation criteria as those of the Punjabi to Hindi machine translation system. After evaluation, the accuracy of the system is found to be about 95%.
|
Title: Microstructure reconstruction using entropic descriptors
|
Abstract: A multi-scale approach to the inverse reconstruction of a pattern's microstructure is reported. Instead of a correlation function, a pair of entropic descriptors (EDs) is proposed for a stochastic optimization method. The first of them measures spatial inhomogeneity, for a binary pattern, or compositional inhomogeneity, for a greyscale image. The second one quantifies spatial or compositional statistical complexity. The EDs reveal structural information that is dissimilar, at least in part, to that given by correlation functions at almost all discrete length scales. The method is tested on a few digitized binary and greyscale images. In each case, a persuasive reconstruction of the microstructure is found.
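A crude stand-in for an inhomogeneity-type descriptor (an assumption for illustration only, not the paper's exact EDs): the Shannon entropy of the distribution of black-pixel counts over non-overlapping k x k cells, which vanishes for a perfectly homogeneous pattern and grows as the pattern becomes unevenly filled at that length scale.

```python
import math

def window_entropy(pattern, k):
    """Crude inhomogeneity descriptor at length scale k: Shannon entropy of
    the distribution of black-pixel counts over non-overlapping k x k cells."""
    counts = {}
    n = 0
    for i in range(0, len(pattern) - k + 1, k):
        for j in range(0, len(pattern[0]) - k + 1, k):
            c = sum(pattern[i + a][j + b] for a in range(k) for b in range(k))
            counts[c] = counts.get(c, 0) + 1
            n += 1
    return -sum((m / n) * math.log2(m / n) for m in counts.values())
```

Evaluating such a descriptor over a range of k gives the multi-scale profile that a stochastic reconstruction can then try to match.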
|
Title: Strategies for online inference of model-based clustering in large and growing networks
|
Abstract: In this paper we adapt online estimation strategies to perform model-based clustering on large networks. Our work focuses on two algorithms, the first based on the SAEM algorithm, and the second on variational methods. These two strategies are compared with existing approaches on simulated and real data. We use the method to decipher the connection structure of the political websphere during the 2008 US political campaign. We show that our online EM-based algorithms offer a good trade-off between precision and speed when estimating parameters for mixture distributions in the context of random graphs.
|
Title: Higher coordination with less control - A result of information maximization in the sensorimotor loop
|
Abstract: This work presents a novel learning method in the context of embodied artificial intelligence and self-organization, which has as few assumptions and restrictions as possible about the world and the underlying model. The learning rule is derived from the principle of maximizing the predictive information in the sensorimotor loop. It is evaluated on robot chains of varying length with individually controlled, non-communicating segments. The comparison of the results shows that maximizing the predictive information per wheel leads to a higher coordinated behavior of the physically connected robots compared to a maximization per robot. Another focus of this paper is the analysis of the effect of the robot chain length on the overall behavior of the robots. It will be shown that longer chains with less capable controllers outperform those of shorter length and more complex controllers. The reason is found and discussed in the information-geometric interpretation of the learning process.
|
Title: Distributed Learning in Multi-Armed Bandit with Multiple Players
|
Abstract: We formulate and study a decentralized multi-armed bandit (MAB) problem. There are M distributed players competing for N independent arms. Each arm, when played, offers i.i.d. reward according to a distribution with an unknown parameter. At each time, each player chooses one arm to play without exchanging observations or any information with other players. Players choosing the same arm collide, and, depending on the collision model, either no one receives reward or the colliding players share the reward in an arbitrary way. We show that the minimum system regret of the decentralized MAB grows with time at the same logarithmic order as in the centralized counterpart where players act collectively as a single entity by exchanging observations and making decisions jointly. A decentralized policy is constructed to achieve this optimal order while ensuring fairness among players and without assuming any pre-agreement or information exchange among players. Based on a Time Division Fair Sharing (TDFS) of the M best arms, the proposed policy is constructed and its order optimality is proven under a general reward model. Furthermore, the basic structure of the TDFS policy can be used with any order-optimal single-player policy to achieve order optimality in the decentralized setting. We also establish a lower bound on the system regret growth rate for a general class of decentralized polices, to which the proposed policy belongs. This problem finds potential applications in cognitive radio networks, multi-channel communication systems, multi-agent systems, web search and advertising, and social networks.
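The collision-free core of TDFS can be sketched in a few lines (an illustrative reduction; the actual policy must also learn which M arms are best, which this sketch takes as given): the players cycle through the M best arms with distinct offsets, so no slot has a collision and every player gets every good arm equally often.

```python
def tdfs_schedule(num_players, best_arms, horizon):
    """Time Division Fair Sharing: in slot t, player i plays
    best_arms[(t + i) % M], so the M players share the M best arms
    round-robin with no collisions and equal long-run shares."""
    M = num_players
    return [[best_arms[(t + i) % M] for i in range(M)] for t in range(horizon)]
```

Because the offsets are distinct modulo M, each slot is a permutation of the M best arms, which is exactly the fairness-with-no-collision property the policy builds on.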
|