A graph of a normal bell curve showing statistics used in standardized testing assessment. The normal distribution, also called the Gaussian distribution, is an important family of continuous probability distributions applicable in many fields. The scales include standard deviations, cumulative percentages, percentile equivalents, Z-scores, T-scores, standard nines, and percentages in standard nines.

Statistics is a mathematical science pertaining to the collection, analysis, interpretation or explanation, and presentation of data. It is applicable to a wide variety of academic disciplines, from the natural and social sciences to the humanities, and to government and business.

Statistical methods can be used to summarize or describe a collection of data; this is called descriptive statistics. In addition, patterns in the data may be modeled in a way that accounts for randomness and uncertainty in the observations, and then used to draw inferences about the process or population being studied; this is called inferential statistics. Both descriptive and inferential statistics comprise applied statistics. There is also a discipline called mathematical statistics, which is concerned with the theoretical basis of the subject.

The word statistics is also the plural of statistic (singular), which refers to the result of applying a statistical algorithm to a set of data, as in economic statistics, crime statistics, etc.

## History

Main article: History of statistics

"Five men, Conring, Achenwall, Süssmilch, Graunt and Petty have been honored by different writers as the founder of statistics," claims one source (Willcox, Walter (1938). The Founder of Statistics. Review of the International Statistical Institute 5(4): 321–328).

Some scholars pinpoint the origin of statistics to 1662, with the publication of "Observations on the Bills of Mortality" by John Graunt. Early applications of statistical thinking revolved around the needs of states to base policy on demographic and economic data. The scope of the discipline broadened in the early 19th century to include the collection and analysis of data in general. Today, statistics is widely employed in government, business, and the natural and social sciences.

Because of its empirical roots and its applications, statistics is generally considered not to be a subfield of pure mathematics, but rather a distinct branch of applied mathematics. Its mathematical foundations were laid in the 17th century with the development of probability theory by Pascal and Fermat; probability theory arose from the study of games of chance. The method of least squares was first described by Carl Friedrich Gauss around 1794. The use of modern computers has expedited large-scale statistical computation, and has also made possible new methods that would be impractical to perform manually.

## Overview

In applying statistics to a scientific, industrial, or societal problem, one begins with a process or population to be studied. This might be a population of people in a country, of crystal grains in a rock, or of goods manufactured by a particular factory during a given period. It may instead be a process observed at various times; data collected about this kind of "population" constitute what is called a time series.

For practical reasons, rather than compiling data about an entire population, one usually studies a chosen subset of the population, called a sample. Data are collected about the sample in an observational or experimental setting. The data are then subjected to statistical analysis, which serves two related purposes: description and inference.

• Descriptive statistics can be used to summarize the data, either numerically or graphically, to describe the sample. Basic examples of numerical descriptors include the mean and standard deviation. Graphical summarizations include various kinds of charts and graphs.
• Inferential statistics is used to model patterns in the data, accounting for randomness and drawing inferences about the larger population. These inferences may take the form of answers to yes/no questions (hypothesis testing), estimates of numerical characteristics (estimation), descriptions of association (correlation), or modeling of relationships (regression). Other modeling techniques include ANOVA, time series analysis, and data mining.
 “… it is only the manipulation of uncertainty that interests us. We are not concerned with the matter that is uncertain. Thus we do not study the mechanism of rain; only whether it will rain.” (Dennis Lindley, "The Philosophy of Statistics", The Statistician, 2000)
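As a minimal sketch of the two purposes above, assuming a hypothetical sample of blood-pressure readings (all numbers invented for illustration):

```python
import math
import statistics

# Hypothetical sample: systolic blood pressure readings (mmHg) from 10 patients.
sample = [118, 125, 131, 122, 140, 128, 119, 135, 127, 124]

# Descriptive statistics: numerically summarize the sample itself.
mean = statistics.mean(sample)
stdev = statistics.stdev(sample)          # sample standard deviation (n - 1)

# Inferential statistics: estimate a characteristic of the larger population.
# A 95% confidence interval for the population mean, using the normal
# approximation (z = 1.96) as a rough sketch; a sample this small would
# normally call for Student's t distribution instead.
se = stdev / math.sqrt(len(sample))
ci = (mean - 1.96 * se, mean + 1.96 * se)

print(f"mean = {mean:.1f}, stdev = {stdev:.1f}")
print(f"95% CI for population mean: ({ci[0]:.1f}, {ci[1]:.1f})")
```

The mean and standard deviation describe the sample itself; the confidence interval is an inference about the wider population from which the sample was drawn.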

The concept of correlation is particularly noteworthy. Statistical analysis of a data set may reveal that two variables (that is, two properties of the population under consideration) tend to vary together, as if they were connected. For example, a study of annual income and age of death among people might find that poor people tend to have shorter lives than affluent people. The two variables are said to be correlated (a positive correlation in this case). However, one cannot immediately infer the existence of a causal relationship between the two variables. (See Correlation does not imply causation.) The correlated phenomena could be caused by a third, previously unconsidered phenomenon, called a lurking variable or confounding variable.
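The income-and-longevity example can be made concrete with a short sketch; the paired values below are hypothetical, chosen only to produce a positive correlation:

```python
import math

# Hypothetical paired data: annual income (k$) and age at death (years)
# for eight individuals. Illustrative numbers only.
income = [18, 25, 32, 40, 55, 70, 85, 110]
age_at_death = [66, 69, 71, 74, 76, 79, 80, 84]

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(income, age_at_death)
print(f"r = {r:.3f}")   # strongly positive, but says nothing about causation
```

A value of r near +1 indicates a strong positive association, yet the computation alone cannot distinguish a causal link from a lurking variable that drives both quantities.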

If the sample is representative of the population, then inferences and conclusions made from the sample can be extended to the population as a whole. A major problem lies in determining the extent to which the chosen sample is representative. Statistics offers methods to estimate and correct for randomness in the sample and in the data collection procedure, as well as methods for designing robust experiments in the first place. (See experimental design.)

The fundamental mathematical concept employed in understanding such randomness is probability. Mathematical statistics (also called statistical theory) is the branch of applied mathematics that uses probability theory and analysis to examine the theoretical basis of statistics.
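A minimal illustration of probability as the model of randomness: simulating a fair coin shows the observed frequency of heads settling near the underlying probability of 0.5 as the number of trials grows (the law of large numbers).

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Estimate the probability of heads on a fair coin by simulation.
# As the number of trials grows, the observed frequency settles near 0.5.
for n in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9} flips: observed frequency = {heads / n:.4f}")
```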

The use of any statistical method is valid only when the system or population under consideration satisfies the basic mathematical assumptions of the method. Misuse of statistics can produce subtle but serious errors in description and interpretation: subtle in the sense that even experienced professionals sometimes make such errors, serious in the sense that they may affect, for instance, social policy, medical practice, and the reliability of structures such as bridges. Even when statistics is correctly applied, the results can be difficult for the non-expert to interpret. For example, the statistical significance of a trend in the data, which measures the extent to which the trend could be caused by random variation in the sample, may not agree with one's intuitive sense of its significance. The set of basic statistical skills (and skepticism) needed by people to deal with information in their everyday lives is referred to as statistical literacy.

## Statistical methods

### Experimental and observational studies

A common goal for a statistical research project is to investigate causality, and in particular to draw a conclusion about the effect of changes in the values of predictors or independent variables on response or dependent variables. There are two major types of causal statistical studies: experimental studies and observational studies. In both types, the effect of differences in an independent variable (or variables) on the behavior of the dependent variable is observed. The difference between the two types lies in how the study is actually conducted. Each can be very effective.

An experimental study involves taking measurements of the system under study, manipulating the system, and then taking additional measurements using the same procedure to determine if the manipulation has modified the values of the measurements. In contrast, an observational study does not involve experimental manipulation. Instead, data are gathered and correlations between predictors and response are investigated.

An example of an experimental study is the famous Hawthorne study, which attempted to test changes to the working environment at the Hawthorne plant of the Western Electric Company. The researchers were interested in determining whether increased illumination would increase the productivity of the assembly-line workers. They first measured the productivity in the plant, then modified the illumination in an area of the plant and checked whether the changes in illumination affected productivity. It turned out that productivity indeed improved (under the experimental conditions). (See Hawthorne effect.) However, the study is heavily criticized today for errors in experimental procedures, specifically for the lack of a control group and blindedness.

An example of an observational study is one that explores the correlation between smoking and lung cancer. This type of study typically uses a survey to collect observations about the area of interest and then performs statistical analysis. In this case, the researchers would collect observations of both smokers and non-smokers, perhaps through a case-control study, and then look at the number of cases of lung cancer in each group.

The basic steps of an experiment are:

1. Planning the research, including determining information sources, research subject selection, and ethical considerations for the proposed research and method.
2. Design of experiments, concentrating on the system model and the interaction of independent and dependent variables.
3. Summarizing a collection of observations to feature their commonality by suppressing details (descriptive statistics).
4. Reaching consensus about what the observations tell about the world being observed (statistical inference).
5. Documenting and presenting the results of the study.
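Steps 2–4 can be sketched as a toy simulated experiment. The scenario below is hypothetical (an illumination manipulation loosely modeled on the Hawthorne example), with all productivity figures generated rather than measured:

```python
import random
import statistics

random.seed(1)  # reproducible sketch

# Step 2 (design): one independent variable (illumination: normal vs increased),
# one dependent variable (units assembled per worker per day). All numbers
# below are simulated, purely for illustration.
control   = [random.gauss(50, 5) for _ in range(30)]  # normal lighting
treatment = [random.gauss(53, 5) for _ in range(30)]  # increased lighting

# Step 3 (descriptive statistics): summarize each group.
print(f"control:   mean={statistics.mean(control):.1f}  sd={statistics.stdev(control):.1f}")
print(f"treatment: mean={statistics.mean(treatment):.1f}  sd={statistics.stdev(treatment):.1f}")

# Step 4 (statistical inference): a crude two-sample comparison. A real
# analysis would use a formal t-test; here we just report the difference
# in group means together with its standard error.
diff = statistics.mean(treatment) - statistics.mean(control)
se = (statistics.variance(control) / 30 + statistics.variance(treatment) / 30) ** 0.5
print(f"difference in means = {diff:.2f} (SE = {se:.2f})")
```

A difference several times larger than its standard error would suggest the manipulation had a real effect; step 5 would then document the design, data, and conclusion.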

### Levels of measurement

See: Stanley Stevens' "Scales of measurement" (1946): nominal, ordinal, interval, ratio

There are four types of measurements, or levels of measurement, used in statistics: nominal, ordinal, interval, and ratio. They have different degrees of usefulness in statistical research. Ratio measurements have both a defined zero value and defined distances between different measurements; they provide the greatest flexibility in the statistical methods that can be used to analyze the data. Interval measurements have meaningful distances between measurements but no meaningful zero value (as with IQ measurements or temperature measurements in Fahrenheit). Ordinal measurements have imprecise differences between consecutive values but a meaningful order to those values. Nominal measurements have no meaningful rank order among values.

Since variables conforming only to nominal or ordinal measurements cannot be reasonably measured numerically, they are sometimes grouped together as categorical variables, whereas ratio and interval measurements are grouped together as quantitative or continuous variables due to their numerical nature.
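The scheme above can be encoded as a small lookup showing which common summary statistics are meaningful at each level; the example variables are illustrative choices, not part of Stevens' original paper:

```python
# Stevens' four levels, each with an illustrative variable and the
# properties that determine which summaries make sense.
LEVELS = {
    "nominal":  {"example": "blood type",       "order": False, "distance": False, "true_zero": False},
    "ordinal":  {"example": "pain scale 1-10",  "order": True,  "distance": False, "true_zero": False},
    "interval": {"example": "temperature in F", "order": True,  "distance": True,  "true_zero": False},
    "ratio":    {"example": "height in cm",     "order": True,  "distance": True,  "true_zero": True},
}

def valid_summaries(level):
    """Which common summaries are meaningful at a given level."""
    props = LEVELS[level]
    summaries = ["mode"]                       # counting categories always works
    if props["order"]:
        summaries.append("median")             # needs rank order
    if props["distance"]:
        summaries.append("mean")               # needs meaningful differences
    if props["true_zero"]:
        summaries.append("ratio statements")   # e.g. "twice as tall"
    return summaries

for level in LEVELS:
    print(f"{level:>8}: {', '.join(valid_summaries(level))}")
```

Each level inherits the summaries of the levels below it, which is why ratio data permit the widest range of statistical methods.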

### Statistical techniques

Some well-known statistical tests and procedures for research observations are:

• Student's t-test
• chi-square test
• analysis of variance (ANOVA)
• Mann-Whitney U test
• regression analysis
• factor analysis
• Pearson product-moment correlation coefficient
• Spearman's rank correlation coefficient
• time series analysis

## Specialized disciplines

Some fields of inquiry use applied statistics so extensively that they have specialized terminology. These disciplines include:

• Actuarial science
• Biostatistics
• Business statistics
• Chemometrics (the analysis of chemical data)
• Data mining (applying statistics and pattern recognition to discover knowledge from data)
• Demography
• Economic statistics
• Energy statistics
• Engineering statistics
• Epidemiology
• Geographic and spatial statistics
• Image processing
• Multivariate statistics
• Psychological statistics
• Quality control
• Reliability engineering
• Social statistics
• Statistical modeling
• Statistical surveys
• Structured data analysis
• Survival analysis
• Statistics in sport (particularly baseball and cricket)

Statistics is also a key tool in business and manufacturing. It is used to understand measurement-system variability, to control processes (as in statistical process control, or SPC), to summarize data, and to make data-driven decisions. In these roles, it is a key tool, and perhaps the only reliable tool.

## Statistical computing

The rapid and sustained increases in computing power starting in the second half of the 20th century have had a substantial impact on the practice of statistical science. Early statistical models were almost always from the class of linear models, but powerful computers, coupled with suitable numerical algorithms, caused an increased interest in nonlinear models (especially neural networks and decision trees) as well as the creation of new types, such as generalized linear models and multilevel models.

Increased computing power has also led to the growing popularity of computationally intensive methods based on resampling, such as permutation tests and the bootstrap, while techniques such as Gibbs sampling have made Bayesian methods more feasible. The computer revolution has implications for the future of statistics, with a new emphasis on "experimental" and "empirical" statistics. A large number of both general- and special-purpose statistical software packages are now available.
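The bootstrap mentioned above is simple enough to sketch directly. Assuming a small hypothetical data set, the percentile bootstrap estimates the uncertainty of a statistic (here the median) by resampling the data with replacement many times:

```python
import random
import statistics

random.seed(0)  # reproducible sketch

# Hypothetical measurements; the bootstrap makes no assumption about
# their underlying distribution.
data = [4.1, 5.6, 7.2, 3.8, 6.0, 9.5, 5.1, 4.8, 6.7, 8.2]

B = 5000
medians = []
for _ in range(B):
    resample = random.choices(data, k=len(data))  # draw with replacement
    medians.append(statistics.median(resample))

medians.sort()
# Percentile bootstrap 95% confidence interval for the median.
lo, hi = medians[int(0.025 * B)], medians[int(0.975 * B)]
print(f"median = {statistics.median(data):.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```

This is exactly the kind of computation that is trivial on a modern machine and was impractical before cheap computing power.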

## Misuse

Main article: Misuse of statistics

There is a general perception that statistical knowledge is all too frequently intentionally misused, by finding ways to interpret only the data that are favorable to the presenter. A famous saying attributed to Benjamin Disraeli is, "There are three kinds of lies: lies, damned lies, and statistics"; and Harvard President Lawrence Lowell wrote in 1909 that statistics, "like veal pies, are good if you know the person that made them, and are sure of the ingredients".

If various studies appear to contradict one another, the public may come to distrust such studies. For example, one study may suggest that a given diet or activity raises blood pressure, while another may suggest that it lowers blood pressure. The discrepancy can arise from subtle variations in experimental design, such as differences in the patient groups or research protocols, that are not easily understood by the non-expert. (Media reports sometimes omit this vital contextual information entirely.)

By choosing (or rejecting, or modifying) a certain sample, results can be manipulated. Such manipulations need not be malicious or devious; they can arise from unintentional biases of the researcher. The graphs used to summarize data can also be misleading.

Deeper criticisms come from the fact that the hypothesis-testing approach, widely used and in many cases required by law or regulation, forces one hypothesis (the null hypothesis) to be "favored", and can also seem to exaggerate the importance of minor differences in large studies. A difference that is highly statistically significant can still be of no practical significance. (See criticism of hypothesis testing and controversy over the null hypothesis.)

One response involves giving greater emphasis to the p-value than to simply reporting whether a hypothesis is rejected at the given level of significance. The p-value, however, does not indicate the size of the effect. Another increasingly common approach is to report confidence intervals. Although these are produced from the same calculations as those of hypothesis tests or p-values, they describe both the size of the effect and the uncertainty surrounding it.
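The distinction between statistical and practical significance can be shown with invented numbers: a tiny difference between two group means, measured on a huge sample, yields a minuscule p-value even though the effect itself is negligible. The confidence interval makes this visible where the p-value does not.

```python
import math

def two_sided_p_from_z(z):
    """Two-sided p-value for a standard-normal test statistic."""
    phi = 0.5 * (1 + math.erf(abs(z) / math.sqrt(2)))  # normal CDF at |z|
    return 2 * (1 - phi)

# Hypothetical scenario: a difference of 0.2 units between two group means
# (each group's standard deviation = 10), measured on a million subjects
# per group.
n, diff, sd = 1_000_000, 0.2, 10.0
se = sd * math.sqrt(2 / n)            # SE of a difference of two means
z = diff / se
p = two_sided_p_from_z(z)

ci = (diff - 1.96 * se, diff + 1.96 * se)
print(f"z = {z:.1f}, p = {p:.2e}")    # far below 0.05: "significant"
print(f"95% CI for the difference: ({ci[0]:.3f}, {ci[1]:.3f})")
```

The interval excludes zero (hence the significance) but also shows the effect is around 0.2 units against a spread of 10, which may be of no practical importance at all.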