Title: Web-Based Expert System for Civil Service Regulations: RCSES
Abstract: The Internet and expert systems have offered new ways of sharing and distributing knowledge, but there is a lack of research in the area of web-based expert systems. This paper introduces the development of a web-based expert system for the regulations of civil service in the Kingdom of Saudi Arabia, named RCSES. It is the first system of its kind to address civil service regulations, and the first to be developed using a web-based approach. The proposed system covers 17 regulations of the civil service system. The different phases of developing RCSES are presented: knowledge acquisition and selection, ontology construction, and knowledge representation in XML format. The XML rule-based knowledge sources and the inference mechanisms were implemented using ASP.net. An interactive tool for entering the ontology and knowledge base, and for performing inference, was built; it makes it easy to use, modify, update, and extend the existing knowledge base. The knowledge was validated by experts in the domain of civil service regulations, and RCSES was tested, verified, and validated by different technical users and the development staff. RCSES is compared with other related web-based expert systems, and the comparison demonstrates its usability and high performance.
Title: A Binary Control Chart to Detect Small Jumps
Abstract: The classic Np chart gives a signal if the number of successes in a sequence of independent binary variables exceeds a control limit. Motivated by engineering applications in industrial image processing and, to some extent, financial statistics, we study a simple modification of this chart, which uses only the most recent observations. Our aim is to construct a control chart for detecting a shift of unknown size, allowing for an unknown distribution of the error terms. Simulation studies indicate that the proposed chart is superior in terms of out-of-control average run length when one is interested in the detection of very small shifts. We provide a (functional) central limit theorem under a change-point model with local alternatives which explains this unexpected and interesting behavior. Since real observations are often not independent, the question arises whether these results still hold true in the dependent case. Indeed, our asymptotic results work under the fairly general condition that the observations form a martingale difference array. This enlarges the applicability of our results considerably: firstly, to a large class of time series models, and, secondly, to locally dependent image data, as we demonstrate by an example.
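The windowed variant of the Np chart is simple enough to sketch in a few lines of Python. The sketch below signals when the success count among the most recent observations exceeds a limit; the `window` and `limit` values are illustrative choices, not the paper's calibrated control limits.

```python
def window_chart(xs, window=50, limit=35):
    """Signal when the number of successes among the `window` most
    recent binary observations exceeds `limit` (window and limit
    are illustrative, not calibrated control limits)."""
    count = 0
    for t, x in enumerate(xs):
        count += x
        if t >= window:
            count -= xs[t - window]  # drop the oldest observation
        if t >= window - 1 and count > limit:
            return t  # time of the first signal
    return None  # no signal

# A shift beginning at t = 500 is flagged 35 steps later:
print(window_chart([0] * 500 + [1] * 500))  # → 535
print(window_chart([0] * 500))              # → None
```

Because only the most recent `window` observations are kept, old in-control data cannot dilute a recent small shift, which is the intuition behind the improved out-of-control run length.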
Title: Sequentially Updated Residuals and Detection of Stationary Errors in Polynomial Regression Models
Abstract: The question whether a time series behaves as a random walk or as a stationary process is an important and delicate problem, particularly arising in financial statistics, econometrics, and engineering. This paper studies the problem of sequentially detecting that the error terms in a polynomial regression model no longer behave as a random walk but as a stationary process. We provide the asymptotic distribution theory for a monitoring procedure given by a control chart, i.e., a stopping time, which is related to a well-known unit root test statistic calculated from sequentially updated residuals. We provide a functional central limit theorem for the corresponding stochastic process which implies a central limit theorem for the control chart. The finite sample properties are investigated by a simulation study.
Title: Cheating for Problem Solving: A Genetic Algorithm with Social Interactions
Abstract: We propose a variation of the standard genetic algorithm that incorporates social interaction between the individuals in the population. Our goal is to understand the evolutionary role of social systems and their possible application as a new, non-genetic step in evolutionary algorithms. In biological populations, e.g., animals, human beings, and microorganisms, social interactions often affect the fitness of individuals. It is conceivable that the perturbation of fitness via social interactions is an evolutionary strategy to avoid becoming trapped in local optima, thus avoiding premature convergence of the population. We model the social interactions according to game theory. The population is, therefore, composed of cooperator and defector individuals whose interactions produce payoffs according to well-known game models (the prisoner's dilemma, the chicken game, and others). Our results on knapsack problems show, for some game models, a significant performance improvement compared to a standard genetic algorithm.
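The fitness-perturbation idea can be sketched as follows: each individual plays a game against a random opponent and a weighted payoff is added to its genetic fitness. The payoff matrix, the `weight` parameter, and the one-round pairing scheme below are assumptions for illustration, not the paper's exact setup.

```python
import random

# Prisoner's dilemma payoffs for the row player:
# C = cooperate, D = defect.
PD = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def social_fitness(base_fitness, strategies, weight=0.1):
    """Perturb genetic fitness with payoffs from pairwise games;
    `weight` (an assumed parameter) balances the two components."""
    n = len(base_fitness)
    payoff = [0.0] * n
    for i in range(n):
        j = random.randrange(n - 1)
        j += j >= i  # pick a random opponent distinct from i
        payoff[i] = PD[(strategies[i], strategies[j])]
    return [f + weight * p for f, p in zip(base_fitness, payoff)]

random.seed(1)
print(social_fitness([10.0, 12.0, 9.0, 11.0], ["C", "D", "C", "D"]))
```

The social payoff adds noise that depends on who meets whom, so two genetically identical individuals can end up with different effective fitness, which is precisely the diversity-preserving effect the abstract describes.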
Title: A New Method to Extract Dorsal Hand Vein Pattern using Quadratic Inference Function
Abstract: Among all biometrics, the dorsal hand vein pattern has lately been attracting the attention of researchers. Extensive research is being carried out on various techniques in the hope of finding an efficient one that can be applied to the dorsal hand vein pattern to improve its accuracy and matching time. One of the crucial steps in biometrics is the extraction of features. In this paper, we propose a method based on the quadratic inference function to extract features of the dorsal hand vein. The biometric system developed was tested on a database of 100 images. The false acceptance rate (FAR), false rejection rate (FRR), and the matching time were computed.
Title: A Topological derivative based image segmentation for sign language recognition system using isotropic filter
Abstract: The need for sign language is increasing radically, especially in the hearing-impaired community. Only a few research groups try to automatically recognize sign language from video, colored gloves, etc. Their approaches require a valid segmentation of the data that is used for training and of the data that is to be recognized. Recognition of a sign language image sequence is challenging because of the variety of hand shapes and hand motions. This paper proposes to apply a combination of image segmentation and restoration using topological derivatives to achieve high recognition accuracy. Image quality measures are considered here to differentiate the methods both subjectively and objectively. Experiments show that the additional use of restoration before segmenting the postures significantly improves the correct rate of hand detection, and that the discrete derivatives yield a high rate of discrimination between different static hand postures as well as between hand postures and the scene background. Ultimately, this research aims to contribute to the implementation of an automated sign language recognition system, established mainly for welfare purposes.
Title: A New Image Steganography Based On First Component Alteration Technique
Abstract: In this paper, a new image steganography scheme is proposed which is a kind of spatial domain technique. In order to hide secret data in a cover image, a first-component-alteration technique is used. Techniques used so far focus only on two or four bits of a pixel in an image (at most five bits at the edge of an image), which results in a lower peak signal-to-noise ratio and a high root mean square error. In this technique, the 8 bits of the blue component of each pixel are replaced with secret data bits. The proposed scheme can embed more data than previous schemes and shows better image quality. To evaluate the scheme, several experiments were performed, and the experimental results are compared with related previous works.
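A minimal sketch of the component-alteration idea, operating on a plain list of (R, G, B) tuples rather than an image file; storing one whole secret byte in each pixel's blue component is one reading of "the 8 bits of the blue component", and the helper names are hypothetical.

```python
def embed(pixels, data):
    """Hide `data` (bytes) by replacing the blue component of
    successive pixels, one secret byte per pixel (a sketch on a
    list of (R, G, B) tuples, not a real image container)."""
    if len(data) > len(pixels):
        raise ValueError("cover image too small for the payload")
    out = list(pixels)
    for i, byte in enumerate(data):
        r, g, _ = out[i]
        out[i] = (r, g, byte)  # all 8 blue bits carry secret data
    return out

def extract(pixels, length):
    """Recover `length` secret bytes from the blue components."""
    return bytes(pixels[i][2] for i in range(length))

cover = [(120, 200, 33), (10, 20, 30), (200, 100, 50), (5, 5, 5)]
stego = embed(cover, b"Hi")
print(extract(stego, 2))  # → b'Hi'
```

The red and green components are untouched, which is why the distortion stays confined to one channel; a real implementation would also have to record the payload length somewhere.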
Title: ICD 10 Based Medical Expert System Using Fuzzy Temporal Logic
Abstract: The medical diagnosis process involves many levels, and a considerable amount of time and money is invariably spent on the first level of diagnosis, usually made by the physician for every patient each time. Hence there is a need for a computer-based system which not only asks relevant questions of the patients but also aids the physician by giving a set of possible diseases inferred from the symptoms obtained. In this work, an ICD10-based medical expert system is presented that provides advice, information, and recommendations to the physician using fuzzy temporal logic. The knowledge base used in this system consists of facts about symptoms and rules on diseases. It also provides a fuzzy severity scale and a weight factor for each symptom and disease, which can vary with respect to time. The system generates the possible disease conditions based on a modified Euclidean metric using Elder's algorithm for effective clustering. The minimum similarity value is used as the decision parameter to identify a disease.
Title: An Improved Image Mining Technique For Brain Tumour Classification Using Efficient Classifier
Abstract: An improved image mining technique for brain tumor classification using pruned association rules with the MARI algorithm is presented in this paper. The proposed method makes use of association rule mining to classify CT scan brain images into three categories, namely normal, benign, and malignant. It combines low-level features extracted from images with high-level knowledge from specialists. The developed algorithm can assist physicians in efficient classification, using multiple keywords per image to improve accuracy. The experimental results on a prediagnosed database of brain images showed 96 percent sensitivity and 93 percent accuracy.
Title: Reversible jump Markov chain Monte Carlo and multi-model samplers
Abstract: To appear in the second edition of the MCMC handbook, S. P. Brooks, A. Gelman, G. Jones and X.-L. Meng (eds), Chapman & Hall.
Title: Likelihood-free Markov chain Monte Carlo
Abstract: To appear in the MCMC handbook, S. P. Brooks, A. Gelman, G. Jones and X.-L. Meng (eds), Chapman & Hall.
Title: An alternative marginal likelihood estimator for phylogenetic models
Abstract: Bayesian phylogenetic methods are generating noticeable enthusiasm in the field of molecular systematics. Many phylogenetic models are often at stake, and different approaches are used to compare them within a Bayesian framework. The Bayes factor, defined as the ratio of the marginal likelihoods of two competing models, plays a key role in Bayesian model selection. We focus on an alternative estimator of the marginal likelihood, whose computation is still a challenging problem. Several computational solutions have been proposed, none of which can be considered to outperform the others simultaneously in terms of simplicity of implementation, computational burden, and precision of the estimates. Practitioners and researchers, often led by available software, have so far favored the simplicity of the harmonic mean estimator (HM) and the arithmetic mean estimator (AM). However, it is known that the resulting estimates of the Bayesian evidence in favor of one model are biased and often inaccurate, to the point of having infinite variance, so that the reliability of the corresponding conclusions is doubtful. Our new implementation of the generalized harmonic mean (GHM) idea recycles MCMC simulations from the posterior and shares the computational simplicity of the original HM estimator but, unlike it, overcomes the infinite variance issue. The alternative estimator is applied to simulated phylogenetic data and produces fully satisfactory results, outperforming the simple estimators currently provided by most publicly available software.
Title: Cooperative Automated Worm Response and Detection Immune Algorithm
Abstract: The role of T-cells within the immune system is to confirm and assess anomalous situations and then either respond to or tolerate the source of the effect. To illustrate how these mechanisms can be harnessed to solve real-world problems, we present the blueprint of a T-cell inspired algorithm for computer security worm detection. We show how the three central T-cell processes, namely T-cell maturation, differentiation and proliferation, naturally map into this domain and further illustrate how such an algorithm fits into a complete immune inspired computer security system and framework.
Title: Computer Simulation Study of the Levy Flight Process
Abstract: Random walk simulation of the Levy flight shows a linear relation between the mean square displacement <r2> and time. We have analyzed different aspects of this linearity. It is shown that restricting the jump length to a maximum value (lm) affects the diffusion coefficient, although the coefficient remains constant for lm greater than 1464; this restriction therefore has no effect on the linearity. In addition, it is shown that the number of samples does not affect the results. We have demonstrated that the relation between the mean square displacement and time remains linear in a continuous space, while continuous variables merely reduce the diffusion coefficient. The results also imply that the movement of a Levy flight particle is similar to the case in which the particle moves at each time step with the average jump length <l>. Finally, it is shown that the non-linear relation of the Levy flight is recovered if we use the time average instead of the ensemble average. The difference between time-average and ensemble-average results indicates that the Levy distribution may be non-ergodic.
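A simulation of this kind can be sketched directly: draw jump lengths from a truncated power law, take random directions, and average the squared displacement over an ensemble of walkers. The exponent `alpha`, the cap `lmax`, and the walker count below are illustrative assumptions; the paper's exact jump-length distribution may differ.

```python
import math
import random

def levy_msd(steps, walkers=4000, alpha=2.0, lmax=1464.0):
    """Ensemble-averaged mean square displacement <r2>(t) for a
    2-D Levy flight whose jump lengths follow a truncated power
    law on [1, lmax] (alpha, lmax and walker count are assumed)."""
    random.seed(42)  # fixed seed keeps the sketch reproducible
    msd = [0.0] * steps
    for _ in range(walkers):
        x = y = 0.0
        for t in range(steps):
            # inverse-transform sample: density ~ l**-(alpha + 1)
            u = random.random()
            l = (1 - u * (1 - lmax ** -alpha)) ** (-1 / alpha)
            theta = random.uniform(0, 2 * math.pi)
            x += l * math.cos(theta)
            y += l * math.sin(theta)
            msd[t] += (x * x + y * y) / walkers
    return msd

msd = levy_msd(100)
# Linear growth of <r2> with time shows up as a ratio near
# 100/50 = 2 between the values at t = 100 and t = 50.
print(msd[99] / msd[49])
```

Because the cap `lmax` makes the second moment of the jump length finite, the ensemble-averaged <r2> grows linearly in time, which is the behaviour the abstract analyzes.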
Title: Comparing Simulation Output Accuracy of Discrete Event and Agent Based Models: A Quantitative Approach
Abstract: In our research we investigate the output accuracy of discrete event simulation models and agent based simulation models when studying human-centric complex systems. In this paper we focus on human reactive behaviour, as it is possible in both modelling approaches to implement human reactive behaviour in the model using standard methods. As a case study we have chosen the retail sector, in particular the operations of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation by modelling the reactive behaviour of staff and customers of the department. First, we carried out a validation experiment in which we compared the results from our models to the performance of the real system. This experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step, a multi-scenario experiment was carried out to study the behaviour of the models when they are used for the purpose of operational improvement. Overall we have found that, for our case study example, both discrete event simulation and agent based simulation have the same potential to support the investigation into the efficiency of implementing new management policies.
Title: Improved estimators for dispersion models with dispersion covariates
Abstract: In this paper we discuss improved estimators for the regression and dispersion parameters in an extended class of dispersion models (J\orgensen, 1996). This class extends the regular dispersion models by letting the dispersion parameter vary across observations, and contains the dispersion models as a particular case. General formulae for the second-order bias are obtained explicitly for dispersion models with dispersion covariates, generalizing previous results by Botter and Cordeiro (1998), Cordeiro and McCullagh (1991), Cordeiro and Vasconcellos (1999), and Paula (1992). The practical use of the formulae is that we can derive closed-form expressions for the second-order biases of the maximum likelihood estimators of the regression and dispersion parameters whenever the information matrix has a closed form. Various expressions for the second-order biases are given for special models. The formulae have advantages for numerical purposes because they require only a supplementary weighted linear regression. We also compare these bias-corrected estimators with two alternative estimators, based on bootstrap methods, which are also bias-free to second order. These estimators are compared by simulation.
Title: Skewness of maximum likelihood estimators in dispersion models
Abstract: We introduce dispersion models with a regression structure to extend the generalized linear models, the exponential family nonlinear models (Cordeiro and Paula, 1989), and the proper dispersion models (J\orgensen, 1997a). We provide a matrix expression for the skewness of the maximum likelihood estimators of the regression parameters in dispersion models. The formula is suitable for computer implementation and can be applied to several important submodels discussed in the literature. Expressions for the skewness of the maximum likelihood estimators of the precision and dispersion parameters are also derived. In particular, our results extend previous formulas obtained by Cordeiro and Cordeiro (2001) and Cavalcanti et al. (2009). A simulation study is performed to show the practical importance of our results.
Title: DCA for Bot Detection
Abstract: Ensuring the security of computers is a non-trivial task, with many techniques used by malicious users to compromise these systems. In recent years a new threat has emerged in the form of networks of hijacked zombie machines used to perform complex distributed attacks such as denial of service and to obtain sensitive data such as password information. These zombie machines are said to be infected with a 'bot' - a malicious piece of software which is installed on a host machine and is controlled by a remote attacker, termed the 'botmaster of a botnet'. In this work, we use the biologically inspired Dendritic Cell Algorithm (DCA) to detect the existence of a single bot on a compromised host machine. The DCA is an immune-inspired algorithm based on an abstract model of the behaviour of the dendritic cells of the human body. The basis of anomaly detection performed by the DCA is facilitated using the correlation of behavioural attributes such as keylogging and packet flooding behaviour. The results of the application of the DCA to the detection of a single bot show that the algorithm is a successful technique for the detection of such malicious software without responding to normally running programs.
Title: Biological Inspiration for Artificial Immune Systems
Abstract: Artificial immune systems (AISs) to date have generally been inspired by naive biological metaphors. This has limited the effectiveness of these systems. In this position paper, two ways in which AISs could be made more biologically realistic are discussed. We propose that AISs should draw their inspiration from organisms which possess only innate immune systems, and that AISs should employ systemic models of the immune system to structure their overall design. An outline of plant and invertebrate immune systems is presented, and the impact that more biologically realistic AISs could have on a number of contemporary research areas is also discussed.
Title: Syllable Analysis to Build a Dictation System in Telugu language
Abstract: In recent decades, speech-interactive systems have gained increasing importance. To develop a dictation system like Dragon for Indian languages, it is most important to adapt the system to a speaker with minimum training. In this paper we focus on the importance of creating a speech database at the level of syllable units and on identifying the minimum text to be considered while training any speech recognition system. Systems have been developed for continuous speech recognition in English and in a few Indian languages such as Hindi and Tamil. This paper gives the statistical details of syllables in Telugu and their use in minimizing the search space during speech recognition. The minimum set of words that covers the maximum number of syllables is identified. This word list can be used to prepare a small text for collecting speech samples while training the dictation system. The results are plotted for the frequency of syllables and the number of syllables in each word. This approach is applied to the CIIL Mysore text corpus, which comprises 3 million words.
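Finding a small word list that covers the syllable inventory is an instance of set cover, for which a greedy heuristic is the standard sketch. The abstract does not specify the selection procedure, so the greedy strategy below is one plausible approach, and the toy syllabified words are hypothetical placeholders, not Telugu data.

```python
def greedy_word_list(words_to_syllables, target_coverage=1.0):
    """Greedily pick words until the chosen set covers the target
    fraction of all syllables (a set-cover sketch; the real
    selection procedure is not described in the abstract)."""
    all_syl = set().union(*words_to_syllables.values())
    covered, chosen = set(), []
    while len(covered) < target_coverage * len(all_syl):
        # pick the word adding the most not-yet-covered syllables
        word = max(words_to_syllables,
                   key=lambda w: len(words_to_syllables[w] - covered))
        gain = words_to_syllables[word] - covered
        if not gain:
            break
        chosen.append(word)
        covered |= gain
    return chosen

words = {  # hypothetical syllabifications for illustration only
    "banana": {"ba", "na"},
    "nabab":  {"na", "bab"},
    "cocoa":  {"co", "a"},
    "taco":   {"ta", "co"},
}
print(greedy_word_list(words))  # → ['banana', 'cocoa', 'nabab', 'taco']
```

The greedy rule does not guarantee a minimum-size list in general, but it is a common and effective baseline for coverage problems of this kind.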
Title: Speech Recognition by Machine, A Review
Abstract: This paper presents a brief survey of Automatic Speech Recognition (ASR) and discusses the major themes and advances made in the past 60 years of research, so as to provide a technological perspective and an appreciation of the fundamental progress that has been accomplished in this important area of speech communication. After years of research and development, the accuracy of automatic speech recognition remains one of the important research challenges (e.g., variations of context, speakers, and environment). The design of a speech recognition system requires careful attention to the following issues: definition of the various types of speech classes, speech representation, feature extraction techniques, speech classifiers, databases, and performance evaluation. The problems existing in ASR and the various techniques developed by researchers to solve them are presented in chronological order. The authors hope that this work will be a contribution to the area of speech recognition. The objective of this review paper is to summarize and compare some of the well-known methods used in various stages of a speech recognition system and to identify the research topics and applications which are at the forefront of this exciting and challenging field.
Title: Application of a Fuzzy Programming Technique to Production Planning in the Textile Industry
Abstract: Many engineering optimization problems can be considered as linear programming problems where all or some of the parameters involved are linguistic in nature. These can only be quantified using fuzzy sets. The aim of this paper is to solve a fuzzy linear programming problem in which the parameters involved are fuzzy quantities with logistic membership functions. To explore the applicability of the method a numerical example is considered to determine the monthly production planning quotas and profit of a home textile group.
Title: The Application of Mamdani Fuzzy Model for Auto Zoom Function of a Digital Camera
Abstract: The Mamdani fuzzy model is an important technique in Computational Intelligence (CI). This paper presents an implementation of a supervised learning method based on membership function training in the context of Mamdani fuzzy models. Specifically, the auto zoom function of a digital camera is modelled using the Mamdani technique. The performance of the control method is verified through a series of simulations, and numerical results are provided as illustrations.
Title: Statistical tests for whether a given set of independent, identically distributed draws does not come from a specified probability density
Abstract: We discuss several tests for whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
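The "plain fact" in the last sentence translates directly into a test statistic: the significance of an observed draw x is the probability that a fresh draw from the density lands where the density is no larger than at x. The Monte Carlo sketch below estimates that probability for a standard normal; the function name and the sample size are assumptions, not the paper's construction.

```python
import math
import random

def low_density_pvalue(x, pdf, sampler, n=100_000):
    """Monte Carlo estimate of P(pdf(X) <= pdf(x)) for X ~ pdf:
    the chance that a fresh draw lands where the density is no
    larger than at the observed point (a sketch of the test idea;
    the sample size n is an arbitrary choice)."""
    random.seed(7)  # fixed seed keeps the sketch reproducible
    fx = pdf(x)
    hits = sum(pdf(sampler()) <= fx for _ in range(n))
    return hits / n

# Standard normal example: a draw far in the tail is implausible
# under the model, one near the mode is not.
pdf = lambda t: math.exp(-t * t / 2) / math.sqrt(2 * math.pi)
sampler = lambda: random.gauss(0, 1)
print(low_density_pvalue(3.5, pdf, sampler))  # tiny (about 5e-4)
print(low_density_pvalue(0.2, pdf, sampler))  # large (about 0.84)
```

Unlike a Kolmogorov-Smirnov statistic, this quantity reacts directly to the density value at the observed point, so a draw in a low-density valley is flagged even when the cumulative distribution function smooths the valley over.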
Title: A Little More, a Lot Better: Improving Path Quality by a Simple Path Merging Algorithm
Abstract: Sampling-based motion planners are an effective means for generating collision-free motion paths. However, the quality of these motion paths (with respect to quality measures such as path length, clearance, smoothness or energy) is often notoriously low, especially in high-dimensional configuration spaces. We introduce a simple algorithm for merging an arbitrary number of input motion paths into a hybrid output path of superior quality, for a broad and general formulation of path quality. Our approach is based on the observation that the quality of certain sub-paths within each solution may be higher than the quality of the entire path. A dynamic-programming algorithm, which we recently developed for comparing and clustering multiple motion paths, reduces the running time of the merging algorithm significantly. We tested our algorithm in motion-planning problems with up to 12 degrees of freedom. We show that our algorithm is able to merge a handful of input paths produced by several different motion planners to produce output paths of much higher quality.
Title: Dendritic Cells for Real-Time Anomaly Detection
Abstract: Dendritic Cells (DCs) are innate immune system cells which have the power to activate or suppress the immune system. The behaviour of human DCs is abstracted to form an algorithm suitable for anomaly detection. We test this algorithm on the real-time problem of port scan detection. Our results show a significant difference in artificial DC behaviour for an outgoing port scan when compared to behaviour for normal processes.
Title: Dendritic Cells for Anomaly Detection
Abstract: Artificial immune systems, more specifically the negative selection algorithm, have previously been applied to intrusion detection. The aim of this research is to develop an intrusion detection system based on a novel concept in immunology, the Danger Theory. Dendritic Cells (DCs) are antigen presenting cells and key to the activation of the human immune system; they collect signals from the host tissue and correlate these signals with proteins known as antigens. In algorithmic terms, individual DCs perform multi-sensor data fusion based on time windows. The whole population of DCs asynchronously correlates the fused signals with a secondary data stream. The behaviour of human DCs is abstracted to form the DC Algorithm (DCA), which is implemented using an immune-inspired framework, libtissue. This system is used to detect context switching on a basic machine learning dataset and to detect outgoing portscans in real time. Experimental results show a significant difference between an outgoing portscan and normal traffic.
Title: An Explicit Nonlinear Mapping for Manifold Learning
Abstract: Manifold learning is a hot research topic in the field of computer science and has many applications in the real world. A main drawback of manifold learning methods is, however, that there are no explicit mappings from the input data manifold to the output embedding. This prohibits the application of manifold learning methods in many practical problems such as classification and target detection. Previously, in order to provide explicit mappings for manifold learning methods, many approaches have been proposed to obtain an approximate explicit mapping under the assumption that there exists a linear projection between the high-dimensional data samples and their low-dimensional embedding. However, this linearity assumption may be too restrictive. In this paper, an explicit nonlinear mapping is proposed for manifold learning, based on the assumption that there exists a polynomial mapping between the high-dimensional data samples and their low-dimensional representations. As far as we know, this is the first time that an explicit nonlinear mapping for manifold learning has been given. In particular, we apply this to the method of Locally Linear Embedding (LLE) and derive an explicit nonlinear manifold learning algorithm, named Neighborhood Preserving Polynomial Embedding (NPPE). Experimental results on both synthetic and real-world data show that the proposed mapping is much more effective in preserving the local neighborhood information and the nonlinear geometry of the high-dimensional data samples than previous work.
Title: Sparsity-accuracy trade-off in MKL
Abstract: We empirically investigate the best trade-off between sparse and uniformly-weighted multiple kernel learning (MKL) using the elastic-net regularization on real and simulated datasets. We find that the best trade-off parameter depends not only on the sparsity of the true kernel-weight spectrum but also on the linear dependence among kernels and the number of samples.
Title: Analytical shape determination of fiber-like objects with Virtual Image Correlation
Abstract: This paper reports a method allowing for the determination of the shape of deformed fiber-like objects. Compared to existing methods, it provides analytical results including the local slope and curvature which are of first importance, for instance, in beam mechanics. The presented VIC (Virtual Image Correlation) method consists in looking for the best correlation between the image of the fiber-like object and a virtual beam image, using an algorithm close to the Digital Image Correlation method developed in experimental solid mechanics. The computation only involves the part of the image in the vicinity of the fiber: the method is thus insensitive to the picture background and the computational cost remains low. Two examples are reported: the first proves the precision of the method, the second its ability to identify a complex shape with multiple loops.
Title: Detecting Botnets Through Log Correlation
Abstract: Botnets, which consist of thousands of compromised machines, can pose significant threats to other systems by launching Distributed Denial of Service (DDoS) attacks, keylogging, and backdoors. In response to these threats, new effective techniques are needed to detect the presence of botnets. In this paper, we use an interception technique to monitor Windows Application Programming Interface (API) function calls made by communication applications and store these calls with their arguments in log files. Our algorithm detects botnets by monitoring abnormal activity, correlating the changes in log file sizes from different hosts.
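The correlation step can be sketched with plain Pearson correlation of log-size growth across hosts. The log sizes below are made-up illustrative values, and correlating size deltas is one simple reading of "correlating the changes in log file sizes", not the paper's exact algorithm.

```python
def deltas(sizes):
    """Per-interval growth of a log file sampled at fixed times."""
    return [b - a for a, b in zip(sizes, sizes[1:])]

def pearson(a, b):
    """Plain Pearson correlation coefficient of two sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Made-up log sizes (bytes) sampled at the same instants on two
# hosts; synchronized bursts hint at coordinated bot activity,
# since bots on different hosts respond to the same commands.
host_a = [100, 105, 400, 410, 900, 905]
host_b = [80, 85, 360, 370, 820, 826]
print(pearson(deltas(host_a), deltas(host_b)) > 0.9)  # → True
```

Independent, legitimately running applications would show largely uncorrelated log growth, so a high cross-host correlation is the anomaly signal.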
Title: Relaxation Penalties and Priors for Plausible Modeling of Nonidentified Bias Sources
Abstract: In designed experiments and surveys, known laws or design features provide checks on the most relevant aspects of a model and identify the target parameters. In contrast, in most observational studies in the health and social sciences, the primary study data do not identify and may not even bound target parameters. Discrepancies between target and analogous identified parameters (biases) are then of paramount concern, which forces a major shift in modeling strategies. Conventional approaches are based on conditional testing of equality constraints, which correspond to implausible point-mass priors. When these constraints are not identified by available data, however, no such testing is possible. In response, implausible constraints can be relaxed into penalty functions derived from plausible prior distributions. The resulting models can be fit within familiar full or partial likelihood frameworks. The absence of identification renders all analyses part of a sensitivity analysis. In this view, results from single models are merely examples of what might be plausibly inferred. Nonetheless, just one plausible inference may suffice to demonstrate inherent limitations of the data. Points are illustrated with misclassified data from a study of sudden infant death syndrome. Extensions to confounding, selection bias and more complex data structures are outlined.
Title: Longitudinal Data with Follow-up Truncated by Death: Match the Analysis Method to Research Aims
Abstract: Diverse analysis approaches have been proposed to distinguish data missing due to death from nonresponse, and to summarize trajectories of longitudinal data truncated by death. We demonstrate how these analysis approaches arise from factorizations of the distribution of longitudinal data and survival information. Models are illustrated using cognitive functioning data for older adults. For unconditional models, deaths do not occur, deaths are independent of the longitudinal response, or the unconditional longitudinal response is averaged over the survival distribution. Unconditional models, such as random effects models fit to unbalanced data, may implicitly impute data beyond the time of death. Fully conditional models stratify the longitudinal response trajectory by time of death. Fully conditional models are effective for describing individual trajectories, in terms of either aging (age, or years from baseline) or dying (years from death). Causal models (principal stratification) as currently applied are fully conditional models, since group differences at one timepoint are described for a cohort that will survive past a later timepoint. Partly conditional models summarize the longitudinal response in the dynamic cohort of survivors. Partly conditional models are serial cross-sectional snapshots of the response, reflecting the average response in survivors at a given timepoint rather than individual trajectories. Joint models of survival and longitudinal response describe the evolving health status of the entire cohort. Researchers using longitudinal data should consider which method of accommodating deaths is consistent with research aims, and use analysis methods accordingly.
Title: Kernel machines with two layers and multiple kernel learning