2022 Zambian census : The Zambia Statistics Agency is mandated to conduct the Census of Population and Housing (CPH) every 10 years as per Statistics Act No. 13 of 2018. The Census was supposed to be conducted in 2020 but was postponed to November 2021 due to funding challenges. In November 2021, the census failed to t...
2022 Zambian census : The Zambia Statistics Agency conducted the sixth Census of Population and Housing. The enumeration and data collection exercise was carried out from 18 August to 14 September 2022 by 45,000 enumerators and supervisors.
2022 Zambian census : By 2022, Zambia's population had increased to a total of 19,610,769, of which 10,007,713 were female and 9,603,056 were male. == References ==
Abelson's paradox : Abelson's paradox is an applied statistics paradox identified by Robert P. Abelson. The paradox pertains to a possible paradoxical relationship between the magnitude of the r2 (i.e., coefficient of determination) effect size and its practical meaning. Abelson's example was obtained from the analysis...
Abelson's paradox : List of paradoxes == References ==
Abundance estimation : Abundance estimation comprises all statistical methods for estimating the number of individuals in a population. In ecology, this may be anything from estimating the number of daisies in a field to estimating the number of blue whales in the ocean.
Abundance estimation : Distance Sampling: Estimating Abundance of Biological Populations – S. T. Buckland, D. R. Anderson, K. P. Burnham, J. L. Laake Estimating Abundance of African Wildlife: An Aid to Adaptive Management – Hugo Jachmann Advanced Distance Sampling: Estimating Abundance of Biological Populations Geostat...
Accuracy paradox : The accuracy paradox is the paradoxical finding that accuracy is not a good metric for evaluating classification models in predictive analytics. This is because a simple model may have a high level of accuracy but be too crude to be useful. For example, if the incidence of category A is dominant, bein...
Accuracy paradox : For example, a city of 1 million people has ten terrorists. A profiling system results in the following confusion matrix: Even though the accuracy is (10 + 999000)/1000000 ≈ 99.9%, 990 out of the 1000 positive predictions are incorrect. The precision of 10/(10 + 990) = 1% reveals its poor performance...
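A minimal Python sketch of the arithmetic above, using the counts from the example (variable names are illustrative):

    # Confusion-matrix counts from the example: 1,000,000 people, 10 terrorists,
    # and 1,000 positive predictions of which 990 are wrong.
    tp = 10        # terrorists correctly flagged
    fp = 990       # innocents incorrectly flagged
    fn = 0         # terrorists missed
    tn = 999_000   # innocents correctly cleared

    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total   # (10 + 999000) / 1000000 ≈ 0.999
    precision = tp / (tp + fp)     # 10 / (10 + 990) = 0.01
    print(accuracy, precision)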
Accuracy paradox : Kubat, M. (2000). Addressing the Curse of Imbalanced Training Sets: One-Sided Selection. Fourteenth International Conference on Machine Learning.
Accuracy paradox : False positive paradox == References ==
Aggregate pattern : An Aggregate pattern can refer to concepts in either statistics or computer programming. Both uses simplify complexity into smaller, simpler parts.
Aggregate pattern : An aggregate pattern is an important statistical concept in many fields that rely on statistics to predict the behavior of large groups, based on the tendencies of subgroups to consistently behave in a certain way. It is particularly useful in sociology, economics, psychology, and criminology.
Aggregate pattern : In Design Patterns, an aggregate is not a design pattern but rather refers to an object such as a list, vector, or generator which provides an interface for creating iterators. The following example code is in Python. Python hides essentially all of the details using the iterator protocol. Confusing...
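The example code referred to above is cut off; a minimal Python sketch of the idea, with illustrative names (an aggregate object exposing the iterator protocol so clients never touch its internal list):

    class Team:
        """Aggregate: stores members and provides an interface for iterators."""
        def __init__(self):
            self._members = []

        def add(self, name):
            self._members.append(name)

        def __iter__(self):
            # The iterator protocol hides the storage details from callers.
            return iter(self._members)

    team = Team()
    team.add("Ada")
    team.add("Grace")
    for member in team:   # uses the iterator returned by __iter__
        print(member)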
Aggregate pattern : Visitor pattern Template class Facade pattern Type safety Functional programming == References ==
Analysis of molecular variance : Analysis of molecular variance (AMOVA) is a statistical model for the molecular variation in a single species, typically biological. The name and model are inspired by ANOVA. The method was developed by Laurent Excoffier, Peter Smouse and Joseph Quattro at Rutgers University in 1992. S...
Analysis of molecular variance : Arlequin 3 website Online AMOVA Calculation for Y-STR Data Info-Gen website GenAIEx website
Analysis of rhythmic variance : In statistics, analysis of rhythmic variance (ANORVA) is a method for detecting rhythms in biological time series, published by Peter Celec (Biol Res. 2004, 37(4 Suppl A):777–82). It is a procedure for detecting cyclic variations in biological time series and quantification of their prob...
Analysis of rhythmic variance : Analysis of rhythmic variance--ANORVA. A new simple method for detecting rhythms in biological time series. Analysis of Rhythmic Variance
Andrews plot : In data visualization, an Andrews plot or Andrews curve is a way to visualize structure in high-dimensional data. It is basically a rolled-down, non-integer version of the Kent–Kiviat radar chart, or a smoothed version of a parallel coordinate plot. It is named after the statistician David F. Andrews. ...
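A short sketch of how such a plot is drawn in practice, assuming pandas' built-in helper andrews_curves (each row x = (x1, x2, ...) is mapped to the curve f_x(t) = x1/√2 + x2 sin t + x3 cos t + x4 sin 2t + ...; data illustrative):

    import pandas as pd
    import matplotlib.pyplot as plt
    from pandas.plotting import andrews_curves

    # Tiny illustrative dataset: each row becomes one curve,
    # and rows from the same class are drawn in the same colour.
    df = pd.DataFrame({"x1": [1.0, 0.9, 5.0],
                       "x2": [2.0, 2.1, 7.0],
                       "x3": [0.5, 0.4, 3.0],
                       "group": ["g1", "g1", "g2"]})
    andrews_curves(df, "group")   # "group" is the class column
    plt.show()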
Antecedent variable : In statistics and social sciences, an antecedent variable is a variable that can help to explain the apparent relationship (or part of the relationship) between other variables that are nominally in a cause and effect relationship. In a regression analysis, an antecedent variable would be one t...
Antecedent variable : Path analysis (statistics) Latent variable Intervening variable Confounding variable
Antecedent variable : Olobatuyi, M. E. (2006). "Definition of Basic Terms and Concepts". A User's Guide to Path Analysis. University Press of America. pp. 21–52. ISBN 0-7618-3230-0.
Area chart : An area chart or area graph graphically displays quantitative data. It is based on the line chart. The area between the axis and the line is commonly emphasized with colors, textures, and hatchings. Commonly one compares two or more quantities with an area chart.
Area chart : William Playfair is usually credited with inventing the area charts as well as the line, bar, and pie charts. His book The Commercial and Political Atlas, published in 1786, contained a number of time-series graphs, including Interest of the National Debt from the Revolution and Chart of all the Imports an...
Area chart : Area charts are used to represent cumulated totals using numbers or percentages (stacked area charts in this case) over time. Use the area chart for showing trends over time among related attributes. The area chart is like the plot chart except that the area below the plotted line is filled in with color t...
Area chart : Area charts which use vertical and horizontal lines to connect the data points in a series forming a step-like progression are called step-area charts. Area charts in which data points are connected by smooth curves instead of straight lines are called spline-area charts. Stacked area charts in which the a...
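A minimal matplotlib sketch of a stacked area chart as described above (data illustrative):

    import matplotlib.pyplot as plt

    years = [2019, 2020, 2021, 2022]
    series_a = [3, 4, 6, 7]
    series_b = [1, 2, 2, 3]

    # Each band's thickness is one series; the top edge is the cumulated total.
    plt.stackplot(years, series_a, series_b, labels=["A", "B"])
    plt.legend(loc="upper left")
    plt.show()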
Area compatibility factor : In survival analysis, the area compatibility factor, F, is used in indirect standardisation of population mortality rates. F = \frac{\sum_x {}^{s}E^{c}_{x,t}\,{}^{s}m_{x,t} \big/ \sum_x {}^{s}E^{c}_{x,t}}{\sum_x E^{c}_{x,t}\,{}^{s}m_{x,t} \big/ \sum_x E^{c}_{x,t}} where: {}^{s}E^{c}_{x,t} is the standardised centr...
Armitage–Doll multistage model of carcinogenesis : The Armitage–Doll model is a statistical model of carcinogenesis, proposed in 1954 by Peter Armitage and Richard Doll, in which a series of discrete mutations result in cancer. The original paper has recently been reprinted with a set of commentary articles.
Armitage–Doll multistage model of carcinogenesis : The rate of incidence and mortality from a wide variety of common cancers follows a power law: someone's risk of developing a cancer increases with a power of their age. The model is very simple, and reads \mathrm{rate} = \frac{N\,p_1 p_2 p_3 \cdots p_r}{(r-1)!}\, t^{r-1} ...
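A minimal Python sketch of the formula above (parameter values illustrative, not from the paper):

    from math import factorial

    def armitage_doll_rate(t, p, N=1.0):
        """rate = N * p1*...*pr * t**(r-1) / (r-1)! for r mutation steps."""
        r = len(p)
        prod = 1.0
        for p_i in p:
            prod *= p_i
        return N * prod * t ** (r - 1) / factorial(r - 1)

    # Risk grows as the (r-1)-th power of age, here with r = 6 steps
    for age in (40, 50, 60):
        print(age, armitage_doll_rate(age, [1e-3] * 6))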
Armitage–Doll multistage model of carcinogenesis : This was some of the earliest strong evidence that cancer was the result of an accumulation of mutations. With their 1954 paper, Armitage and Doll began a line of research that led to Knudson's two-hit hypothesis and thus the discovery of tumour suppressor genes.
Armitage–Doll multistage model of carcinogenesis : Steven A Frank (2004) "Commentary: Mathematical models of cancer progression and epidemiology in the age of high throughput genomics", Int. J. Epidemiol. 33(6): 1179-1181 doi:10.1093/ije/dyh222 Suresh H Moolgavkar (2004) "Commentary: Fifty years of the multistage model...
Artificial precision : In numerical mathematics, artificial precision is a source of error that occurs when a numerical value or semantic is expressed with more precision than was initially provided from measurement or user input. For example, a person enters their birthday as the date 1984-01-01 but it is stored in a ...
Artificial precision : false precision accuracy and precision significant figures
Artificial precision : Smith, N. J. J. (2008). "Worldly Vagueness and Semantic Indeterminacy". Vagueness and Degrees of Truth. pp. 277–316. doi:10.1093/acprof:oso/9780199233007.003.0007. ISBN 9780199233007.
Association of Statisticians of American Religious Bodies : The Association of Statisticians of American Religious Bodies (ASARB) is an American non-profit organization that brings together statisticians from various religious groups in the United States, with the aim of compiling accurate statistics regarding all such...
Average daily rate : Average Daily Rate (commonly referred to as ADR) is a statistical unit that is often used in the lodging industry. The number represents the average rental income per paid occupied room in a given time period. ADR along with the property's occupancy are the foundations for the property's financial ...
Average daily rate : ADR can vary significantly due to external factors like seasonal demand, local events, or economic conditions. Understanding these variables can help hotel management make more informed pricing decisions.
Average daily rate : ADR is calculated by dividing the rooms revenue earned by the number of rooms sold, with house-use rooms and complimentary rooms excluded from the denominator.
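A minimal sketch of the calculation (figures illustrative):

    def average_daily_rate(rooms_revenue, rooms_sold):
        """ADR = rooms revenue / rooms sold, where house-use and
        complimentary rooms have already been excluded from the count."""
        return rooms_revenue / rooms_sold

    # e.g. $12,000 of room revenue across 100 paid occupied rooms
    print(average_daily_rate(12_000, 100))   # 120.0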
Backus–Gilbert method : In mathematics, the Backus–Gilbert method, also known as the optimally localized average (OLA) method, is named for its discoverers, geophysicists George E. Backus and James Freeman Gilbert. It is a regularization method for obtaining meaningful solutions to ill-posed inverse problems. Where othe...
Backus–Gilbert method : Backus, G.E., and Gilbert, F. 1968, "The Resolving power of Gross Earth Data", Geophysical Journal of the Royal Astronomical Society, vol. 16, pp. 169–205. Backus, G.E., and Gilbert, F. 1970, "Uniqueness in the Inversion of inaccurate Gross Earth Data", Philosophical Transactions of the Royal So...
Barber–Johnson diagram : A Barber–Johnson diagram is a method of presenting hospital statistics combining four different variables in a unique graph, introduced in 1973. The method constructs a scattergram where length of stay, turnover interval, discharges, and deaths per available bed are combined. These four variabl...
Bartlett's method : In time series analysis, Bartlett's method (also known as the method of averaged periodograms), is used for estimating power spectra. It provides a way to reduce the variance of the periodogram in exchange for a reduction of resolution, compared to standard periodograms. A final estimate of the spec...
Bartlett's method : Bartlett's method consists of the following steps: (1) the original N-point data segment is split up into K non-overlapping data segments, each of length M; (2) for each segment, compute the periodogram by computing the discrete Fourier transform (the DFT version which does not divide by M), then computing the...
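A minimal NumPy sketch of these steps; the final averaging step is assumed from the truncated text above:

    import numpy as np

    def bartlett_psd(x, K):
        """Average the periodograms of K non-overlapping segments of x."""
        N = len(x)
        M = N // K                                  # segment length
        segments = np.reshape(x[:K * M], (K, M))
        # Periodogram of each segment: |DFT|^2 / M (the DFT itself is not divided by M)
        periodograms = np.abs(np.fft.fft(segments, axis=1)) ** 2 / M
        return periodograms.mean(axis=0)            # final estimate: the average

    rng = np.random.default_rng(0)
    x = np.sin(2 * np.pi * 0.1 * np.arange(1024)) + rng.standard_normal(1024)
    psd = bartlett_psd(x, K=8)                      # 8 segments of length 128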
Bartlett's method : The Welch method: a modified version of Bartlett's method in which the portions of the series contributing to each periodogram are allowed to overlap. Periodogram smoothing.
Bartlett's method : Proakis, John G.; Manolakis, Dimitri G. (1996), Digital Signal Processing: Principles, Algorithms and Applications (3 ed.), Pearson Education, pp. 910–911, ISBN 0-13-394289-9
Basic statistical unit (Norway) : The basic statistical unit (Norwegian: Grunnkrets) is a type of statistical unit used by Statistics Norway to provide stable and coherent geographical units for regional statistics in Norway. Basic statistical units are subdivisions of municipalities (they never include land in more th...
Basic statistical unit (Norway) : "New figures for basic units". Statistics Norway. May 30, 2007. Archived from the original on 29 December 2008. Retrieved 2008-12-08.
Carmen Batanero : Carmen Batanero is a Spanish statistics educator, and a Senior Lecturer in the Mathematics Department at the University of Granada, Spain. She is known as an advocate for statistics education. Batanero is a lifetime member of the International Association for Statistical Education, and served as the a...
Carmen Batanero : Batanero, Carmen; Burrill, Gail; Reading, Chris (2011). Teaching Statistics in School Mathematics-Challenges for Teaching and Teacher Education. Springer. ISBN 978-94-007-1131-0. Batanero, Carmen; Borovcnik, Manfred (2016). Statistics and Probability in High School. Sense Publishers. ISBN 978-94-630-0...
Bayes error rate : In statistical classification, Bayes error rate is the lowest possible error rate for any classifier of a random outcome (into, for example, one of two categories) and is analogous to the irreducible error. A number of approaches to the estimation of the Bayes error rate exist. One method seeks to ob...
Bayes error rate : In terms of machine learning and pattern classification, the labels of a set of random observations can be divided into 2 or more classes. Each observation is called an instance and the class it belongs to is the label. The Bayes error rate of the data distribution is the probability an instance is m...
Bayes error rate : A proof that the Bayes error rate is indeed the minimum possible, and that the Bayes classifier is therefore optimal, may be found on the Wikipedia page Bayes classifier.
Bayes error rate : A plug-in rule uses an estimate of the posterior probability η to form a classification rule. Given an estimate \tilde{\eta}, the excess Bayes error rate of the associated classifier is bounded above by 2\,\mathbb{E}\left[\,\left|\eta(X)-\tilde{\eta}(X)\right|\,\right]. To see this, note that the excess Bayes error is e...
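A small Monte Carlo sketch of the bound, assuming a known one-dimensional η and a deliberately perturbed plug-in estimate (all choices illustrative):

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, 100_000)

    eta = 1 / (1 + np.exp(-X))               # true P(Y=1 | X), a logistic curve
    eta_hat = 1 / (1 + np.exp(-(X - 0.5)))   # shifted, imperfect estimate

    bayes = eta > 0.5                        # Bayes classifier
    plug_in = eta_hat > 0.5                  # plug-in classifier

    # Excess error: extra risk paid wherever the two rules disagree
    excess = np.mean(np.abs(2 * eta - 1) * (bayes != plug_in))
    bound = 2 * np.mean(np.abs(eta - eta_hat))
    print(excess, bound, excess <= bound)    # the bound should hold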
Bayes error rate : Naive Bayes classifier == References ==
Bayesian average : A Bayesian average is a method of estimating the mean of a population using outside information, especially a pre-existing belief, which is factored into the calculation. This is a central feature of Bayesian interpretation. This is useful when the available data set is small. Calculating the Bayesia...
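The line above is cut off; one common formulation (not necessarily the article's exact one) adds a prior mean m with weight C to the observed values:

    def bayesian_average(values, prior_mean, prior_weight):
        """(C*m + sum(x)) / (C + n): with few data points the prior mean
        dominates; with many, the ordinary sample mean takes over."""
        n = len(values)
        return (prior_weight * prior_mean + sum(values)) / (prior_weight + n)

    # A product with two 5-star ratings, shrunk toward a site-wide mean of 3.5
    print(bayesian_average([5, 5], prior_mean=3.5, prior_weight=10))   # 3.75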
Bayesian average : Additive smoothing
Bayesian average : Yang, Xiao; Zhang, Zhaoxin (2013). "Combining prestige and relevance ranking for personalized recommendation". Proceedings of the 22nd ACM international conference on Conference on information & knowledge management - CIKM '13. pp. 1877–1880. doi:10.1145/2505515.2507885. ISBN 9781450322638. S2CID 144...
Bayesian inference using Gibbs sampling : Bayesian inference using Gibbs sampling (BUGS) is a statistical software package for performing Bayesian inference using Markov chain Monte Carlo (MCMC) methods. It was developed by David Spiegelhalter at the Medical Research Council Biostatistics Unit in Cambridge in 1989 and released...
Bayesian inference using Gibbs sampling : Spike and slab variable selection Bayesian structural time series
Bayesian survival analysis : Survival analysis is normally carried out using parametric, semi-parametric, or non-parametric models to estimate the survival rate in clinical research. Recently, however, Bayesian models have also been used to estimate the survival rate, due to their ability to handle design and analysi...
Bellman filter : The Bellman filter is an algorithm that estimates the value sequence of hidden states in a state-space model. It is a generalization of the Kalman filter, allowing for nonlinearity in both the state and observation equations. The principle behind the Bellman filter is an approximation of the maximum a ...
Berkson error model : The Berkson error model is a description of random error (or misclassification) in measurement. Unlike classical error, Berkson error causes little or no bias in the measurement. It was proposed by Joseph Berkson in an article entitled “Are there two regressions?,” published in 1950. An example of...
Berkson error model : Buonaccorsi, John P. (2010). Measurement Error: Models, Methods, and Applications. CRC Press. pp. 76–78. ISBN 978-1-4200-6658-6. Carroll, R. J.; Ruppert, D.; Stefanski, L. A. (2006). Measurement Error in Nonlinear Models (Second ed.). London: Chapman & Hall. pp. 26–32. ISBN 1-4200-1013-1.
Binary regression : In statistics, specifically regression analysis, a binary regression estimates a relationship between one or more explanatory variables and a single output binary variable. Generally the probability of the two alternatives is modeled, instead of simply outputting a single value, as in linear regress...
Binary regression : Binary regression is principally applied either for prediction (binary classification), or for estimating the association between the explanatory variables and the output. In economics, binary regressions are used to model binary choice.
Binary regression : Binary regression models can be interpreted as latent variable models, together with a measurement model; or as probabilistic models, directly modeling the probability.
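A minimal sketch of fitting a binary regression, using scikit-learn's logistic regression as one common choice of model (data illustrative):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # One explanatory variable, one binary output variable
    X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
    y = np.array([0, 0, 0, 1, 1, 1])

    model = LogisticRegression().fit(X, y)
    # The model returns a probability for each of the two alternatives,
    # rather than a single fitted value as in linear regression.
    print(model.predict_proba([[3.5]]))   # [[P(y=0), P(y=1)]]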
Binary regression : Generalized linear model § Binary data Fractional model == References ==
Blockmodeling linked networks : Blockmodeling linked networks is an approach in blockmodeling for analysing linked networks. Such an approach is based on the generalized multilevel blockmodeling approach. The main objective of this approach is to achieve clustering of the nodes from all involved sets, while at the...
Box–Cox distribution : In statistics, the Box–Cox distribution (also known as the power-normal distribution) is the distribution of a random variable X for which the Box–Cox transformation on X follows a truncated normal distribution. It is a continuous probability distribution having probability density function (pdf)...
Box–Cox distribution : ƒ = 1 gives a truncated normal distribution.
Box–Cox distribution : Freeman, Jade; Reza Modarres. "Properties of the Power-Normal Distribution" (PDF). U.S. Environmental Protection Agency.
Box's M test : Box's M test is a multivariate statistical test used to check the equality of multiple variance-covariance matrices. The test is commonly used to test the assumption of homogeneity of variances and covariances in MANOVA and linear discriminant analysis. It is named after George E. P. Box, who first discu...
Box's M test : Bartlett's test Levene's test == References ==
Brunner Munzel Test : In statistics, the Brunner Munzel test (also called the generalized Wilcoxon test) is a nonparametric test of the null hypothesis that, for randomly selected values X and Y from two populations, the probability of X being greater than Y is equal to the probability of Y being greater than X. It is ...
Brunner Munzel Test : All the observations from both groups are independent of each other; the responses are at least ordinal (i.e., one can at least say, of any two observations, which is the greater); the null hypothesis H0 is that the probability of an observation from population X exceeding an observation fr...
Brunner Munzel Test : The Brunner Munzel test is available in the following packages R: brunnermunzel, lawstat, rankFD (function rank.two.samples()) Python (programming language): scipy.stats.brunnermunzel jamovi: bmtest == References ==
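A minimal sketch with the SciPy implementation listed above (sample data illustrative):

    from scipy.stats import brunnermunzel

    x = [1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 2, 4, 1, 1]
    y = [3, 3, 4, 3, 1, 2, 3, 1, 1, 5, 4]

    # Tests H0: P(X > Y) == P(Y > X), without assuming equal variances
    stat, p = brunnermunzel(x, y)
    print(stat, p)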
Canonical correspondence analysis : In multivariate analysis, canonical correspondence analysis (CCA) is an ordination technique that determines axes from the response data as a unimodal combination of measured predictors. CCA is commonly used in ecology in order to extract gradients that drive the composition of ecolo...
Canonical correspondence analysis : CCA was developed in 1986 by Cajo ter Braak and implemented in the program CANOCO, an extension of DECORANA. To date, CCA is one of the most popular multivariate methods in ecology, despite the availability of contemporary alternatives. CCA was originally derived and implemented usin...
Canonical correspondence analysis : The requirements of a CCA are that the samples are random and independent, the data are categorical, and the independent variables are consistent within the sample site and error-free. The original publication states the need for equal species tolerances, equal species maxi...
Canonical correspondence analysis : Canonical correlation analysis (CANCOR) == References ==
Chain linking : Chain linking is a statistical method, defined by the Organisation for Economic Co-operation and Development as: Joining together two indices that overlap in one period by rescaling one of them to make its value equal to that of the other in the same period, thus combining them into a single time series. ...
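A minimal sketch of the rescaling (index values illustrative): the new index is multiplied by a factor that makes its value in the overlap period equal to the old index's value there, and the two series are then joined:

    old_index = {2018: 100.0, 2019: 104.0, 2020: 106.0}   # overlap period: 2020
    new_index = {2020: 100.0, 2021: 103.0, 2022: 108.0}

    factor = old_index[2020] / new_index[2020]            # 1.06
    chained = dict(old_index)
    for year, value in new_index.items():
        if year not in chained:
            chained[year] = value * factor

    print(chained)   # 2021 -> 109.18, 2022 -> 114.48 (approximately)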
Cheeger bound : In mathematics, the Cheeger bound is a bound of the second largest eigenvalue of the transition matrix of a finite-state, discrete-time, reversible stationary Markov chain. It can be seen as a special case of Cheeger inequalities in expander graphs. Let X be a finite set and let K ( x , y ) be the tra...
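A small brute-force sketch, assuming the usual two-sided statement of the bound for reversible chains (1 − 2Φ ≤ λ₂ ≤ 1 − Φ²/2, where Φ is the conductance); the chain below is illustrative:

    import itertools
    import numpy as np

    # Lazy random walk on a 4-cycle: symmetric, hence reversible w.r.t. uniform pi
    K = np.array([[0.50, 0.25, 0.00, 0.25],
                  [0.25, 0.50, 0.25, 0.00],
                  [0.00, 0.25, 0.50, 0.25],
                  [0.25, 0.00, 0.25, 0.50]])
    pi = np.full(4, 0.25)

    # Conductance: min over sets S with pi(S) <= 1/2 of Q(S, S^c) / pi(S)
    states = range(4)
    phi = min(
        sum(pi[a] * K[a, b] for a in S for b in states if b not in S)
        / pi[list(S)].sum()
        for r in range(1, 4)
        for S in itertools.combinations(states, r)
        if pi[list(S)].sum() <= 0.5
    )

    lam2 = sorted(np.linalg.eigvalsh(K))[-2]   # second largest eigenvalue
    print(1 - 2 * phi, "<=", lam2, "<=", 1 - phi ** 2 / 2)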
Cheeger bound : Stochastic matrix Cheeger constant Conductance == References ==
Chernoff's distribution : In probability theory, Chernoff's distribution, named after Herman Chernoff, is the probability distribution of the random variable Z = \operatorname{argmax}_{s \in \mathbb{R}} \left( W(s) - s^2 \right), where W is a "two-sided" Wiener process (or two-sided "Brownian motion") satisfying W(0) = 0. If V(a, c) =...
Chernoff's distribution : Groeneboom, Lalley and Temme state that the first investigation of this distribution was probably by Chernoff in 1964, who studied the behavior of a certain estimator of a mode. In his paper, Chernoff characterized the distribution via an analytic representation through the heat equation w...
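A crude simulation sketch of the argmax definition above (discretizing the two-sided Brownian motion on a grid; all numerical choices illustrative):

    import numpy as np

    rng = np.random.default_rng(2)
    dt = 0.001
    s = np.arange(-5, 5, dt)                  # grid for the two-sided process

    def sample_Z():
        # Glue two independent one-sided Brownian motions together at 0
        n = len(s) // 2
        right = np.cumsum(rng.standard_normal(n)) * np.sqrt(dt)
        left = np.cumsum(rng.standard_normal(n)) * np.sqrt(dt)
        W = np.concatenate([left[::-1], right])
        return s[np.argmax(W - s ** 2)]       # argmax of W(s) - s^2

    draws = np.array([sample_Z() for _ in range(2000)])
    print(draws.mean(), draws.std())          # roughly symmetric around 0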
China microcensus : The China microcensus (全国1%人口抽样调查) is an intercensal survey to measure the population, in between official censuses. It is conducted every year that ends in 5. In 2015, the survey began on November 1 at midnight. Data are broken down to at least the municipal level, and includes residency (hukou) an...
Church of Scotland Yearbook : The Church of Scotland Yearbook (known informally as the Red Book because of its red binding) is a collection of statistical data published annually by the Church of Scotland. It was first published in 1886, and has been published annually ever since. A new free version is sent to every mi...
Code officiel géographique : The Code officiel géographique (English: Official geographic code) is a document listing the INSEE codes assigned to French geographical entities.
Cohort effect : The term cohort effect is used in social science to describe shared characteristics over time among individuals who are grouped by a shared temporal experience, such as year of birth, or common life experience, such as time of exposure to radiation. Researchers evaluate this phenomenon using a cohort an...
Cohort effect : Political socialization Socialization Cohort
Cohort effect : Cohort Effects on Earnings Profiles Cohort effect in Lung Function among Smokers Archived 2006-07-14 at the Wayback Machine
Combinatorial data analysis : In statistics, combinatorial data analysis (CDA) is the study of data sets where the order in which objects are arranged is important. CDA can be used either to determine how well a given combinatorial construct reflects the observed data, or to search for a suitable combinatorial construc...
Combinatorial data analysis : Cluster analysis Geometric data analysis Structured data analysis (statistics) Seriation (statistics) == References ==
Common-method variance : In applied statistics, (e.g., applied to the social sciences and psychometrics), common-method variance (CMV) is the spurious "variance that is attributable to the measurement method rather than to the constructs the measures are assumed to represent" or equivalently as "systematic error varian...
Complete class theorem : The complete class theorems are a class of theorems in decision theory. They establish that all admissible decision rules are equivalent to the Bayesian decision rule for some utility function and some prior distribution (or for the limit of a sequence of prior distributions). Thus, for every de...
Complex Wishart distribution : In statistics, the complex Wishart distribution is a complex version of the Wishart distribution. It is the distribution of n times the sample Hermitian covariance matrix of n zero-mean independent Gaussian random variables. It has support for p × p Hermitian positive definite matrices...
Complex Wishart distribution : The probability distribution of the eigenvalues of the complex Hermitian Wishart distribution is given by, for example, James and Edelman. For a p × p matrix with ν ≥ p degrees of freedom we have f(\lambda_1, \ldots, \lambda_p) = \tilde{K}_{\nu,p} \exp\left(-\frac{1}{2}\sum_{i=1}^{p}\lambda_i\right) \prod_{i=1}^{p}\lambda_i^{\nu-p} \prod_{i<j}(\lambda...
Component analysis (statistics) : Component analysis is the analysis of two or more independent variables which comprise a treatment modality. It is also known as a dismantling study. The chief purpose of the component analysis is to identify the component which is efficacious in changing behavior, if a singular compon...
Composite measure : Composite measures in statistics and research design are measurements of variables based on multiple data items. An example of a composite measure is an IQ test, which gives a single score based on a series of responses to various questions. Three common composite measur...