# Mean

There are several kinds of means in various branches of mathematics (especially statistics).

For a data set, the arithmetic mean, also called the mathematical expectation or average, is the central value of a discrete set of numbers: specifically, the sum of the values divided by the number of values. The arithmetic mean of a set of numbers x1, x2, ..., xn is typically denoted by ${\displaystyle {\bar {x}}}$, pronounced "x bar". If the data set is based on a series of observations obtained by sampling from a statistical population, the arithmetic mean is called the sample mean (denoted ${\displaystyle {\bar {x}}}$) to distinguish it from the mean of the underlying distribution, the population mean (denoted ${\displaystyle \mu }$ or ${\displaystyle \mu _{x}}$, pronounced "mew" /ˈmjuː/).[1]

In probability and statistics, the population mean, or expected value, is a measure of the central tendency either of a probability distribution or of the random variable characterized by that distribution.[2] In the case of a discrete probability distribution of a random variable X, the mean is equal to the sum over every possible value weighted by the probability of that value; that is, it is computed by taking the product of each possible value x of X and its probability p(x), and then adding all these products together, giving ${\displaystyle \mu =\sum xp(x)}$.[3] An analogous formula applies to the case of a continuous probability distribution. Not every probability distribution has a defined mean; see the Cauchy distribution for an example. Moreover, for some distributions the mean is infinite.

For a finite population, the population mean of a property is equal to the arithmetic mean of the given property while considering every member of the population. For example, the population mean height is equal to the sum of the heights of every individual divided by the total number of individuals. The sample mean may differ from the population mean, especially for small samples. The law of large numbers dictates that the larger the size of the sample, the more likely it is that the sample mean will be close to the population mean.[4]

Outside probability and statistics, a wide range of other notions of "mean" are often used in geometry and analysis; examples are given below.

## Types of mean

### Pythagorean means

#### Arithmetic mean (AM)

The arithmetic mean (or simply mean) of a sample ${\displaystyle x_{1},x_{2},\ldots ,x_{n}}$, usually denoted by ${\displaystyle {\bar {x}}}$, is the sum of the sampled values divided by the number of items in the sample:

${\displaystyle {\bar {x}}={\frac {1}{n}}\left(\sum _{i=1}^{n}{x_{i}}\right)={\frac {x_{1}+x_{2}+\cdots +x_{n}}{n}}}$

For example, the arithmetic mean of five values: 4, 36, 45, 50, 75 is:

${\displaystyle {\frac {4+36+45+50+75}{5}}={\frac {210}{5}}=42.}$
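The same computation can be sketched in a few lines of Python; the helper name `arithmetic_mean` is just for illustration:

```python
# Arithmetic mean: sum of the values divided by how many there are.
def arithmetic_mean(values):
    return sum(values) / len(values)

print(arithmetic_mean([4, 36, 45, 50, 75]))  # 42.0
```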

#### Geometric mean (GM)

The geometric mean is an average that is useful for sets of positive numbers that are interpreted according to their product and not their sum (as is the case with the arithmetic mean); e.g., rates of growth.

${\displaystyle {\bar {x}}=\left(\prod _{i=1}^{n}{x_{i}}\right)^{\frac {1}{n}}=\left(x_{1}x_{2}\cdots x_{n}\right)^{\frac {1}{n}}}$

For example, the geometric mean of five values: 4, 36, 45, 50, 75 is:

${\displaystyle (4\times 36\times 45\times 50\times 75)^{\frac {1}{5}}={\sqrt[{5}]{24\;300\;000}}=30.}$
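A minimal Python sketch of the same calculation (the function name `geometric_mean` is illustrative; floating-point rounding may make the result differ from 30 in the last digits):

```python
import math

# Geometric mean: nth root of the product of n positive values.
def geometric_mean(values):
    return math.prod(values) ** (1 / len(values))

print(geometric_mean([4, 36, 45, 50, 75]))  # ~30
```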

#### Harmonic mean (HM)

The harmonic mean is an average which is useful for sets of numbers which are defined in relation to some unit, for example speed (distance per unit of time).

${\displaystyle {\bar {x}}=n\left(\sum _{i=1}^{n}{\frac {1}{x_{i}}}\right)^{-1}}$

For example, the harmonic mean of the five values 4, 36, 45, 50, 75 is:

${\displaystyle {\frac {5}{{\tfrac {1}{4}}+{\tfrac {1}{36}}+{\tfrac {1}{45}}+{\tfrac {1}{50}}+{\tfrac {1}{75}}}}={\frac {5}{\;{\tfrac {1}{3}}\;}}=15.}$
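In code, the harmonic mean follows the formula directly; a small Python sketch (helper name is illustrative):

```python
# Harmonic mean: n divided by the sum of the reciprocals.
def harmonic_mean(values):
    return len(values) / sum(1 / x for x in values)

print(harmonic_mean([4, 36, 45, 50, 75]))  # ~15
```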

#### Relationship between AM, GM, and HM

AM, GM, and HM satisfy these inequalities:

${\displaystyle \mathrm {AM} \geq \mathrm {GM} \geq \mathrm {HM} \,}$

Equality holds if and only if all the elements of the given sample are equal.
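The inequality chain can be spot-checked numerically; the sketch below, with illustrative helper names, verifies it on the example data and on random positive samples:

```python
import math
import random

def am(xs): return sum(xs) / len(xs)
def gm(xs): return math.prod(xs) ** (1 / len(xs))
def hm(xs): return len(xs) / sum(1 / x for x in xs)

# Spot-check AM >= GM >= HM on random positive samples.
random.seed(0)
for _ in range(1_000):
    xs = [random.uniform(0.1, 100.0) for _ in range(5)]
    assert am(xs) >= gm(xs) >= hm(xs)
print("AM >= GM >= HM held for 1000 random samples")
```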

### Statistical location

*Figure: Comparison of the arithmetic mean, median and mode of two skewed (log-normal) distributions.*

*Figure: Geometric visualization of the mode, median and mean of an arbitrary probability density function.[5]*

In descriptive statistics, the mean may be confused with the median, mode or mid-range, as any of these may be called an "average" (more formally, a measure of central tendency). The mean of a set of observations is the arithmetic average of the values; however, for skewed distributions, the mean is not necessarily the same as the middle value (median), or the most likely value (mode). For example, mean income is typically skewed upwards by a small number of people with very large incomes, so that the majority have an income lower than the mean. By contrast, the median income is the level at which half the population is below and half is above. The mode income is the most likely income and favors the larger number of people with lower incomes. While the median and mode are often more intuitive measures for such skewed data, many skewed distributions are in fact best described by their mean, including the exponential and Poisson distributions.

### Mean of a probability distribution

The mean of a probability distribution is the long-run arithmetic average value of a random variable having that distribution. In this context, it is also known as the expected value. For a discrete probability distribution, the mean is given by ${\displaystyle \textstyle \sum xP(x)}$, where the sum is taken over all possible values of the random variable and ${\displaystyle P(x)}$ is the probability mass function. For a continuous distribution, the mean is ${\displaystyle \textstyle \int _{-\infty }^{\infty }xf(x)\,dx}$, where ${\displaystyle f(x)}$ is the probability density function. In all cases, including those in which the distribution is neither discrete nor continuous, the mean is the Lebesgue integral of the random variable with respect to its probability measure. The mean need not exist or be finite; for some probability distributions the mean is infinite (+∞ or −∞), while others have no mean.
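For the discrete case, the weighted sum can be computed directly; a sketch using a fair six-sided die as the example distribution:

```python
# Mean of a discrete distribution: sum of x * P(x) over the support.
# Example: a fair six-sided die, P(x) = 1/6 for x in 1..6.
pmf = {x: 1 / 6 for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())
print(mu)  # ~3.5
```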

### Generalized means

#### Power mean

The generalized mean, also known as the power mean or Hölder mean, is an abstraction of the quadratic, arithmetic, geometric and harmonic means. It is defined for a set of n positive numbers xi by

${\displaystyle {\bar {x}}(m)=\left({\frac {1}{n}}\sum _{i=1}^{n}x_{i}^{m}\right)^{\frac {1}{m}}}$

By choosing different values for the parameter m, the following types of means are obtained:

- ${\displaystyle m\rightarrow \infty }$: maximum of ${\displaystyle x_{i}}$
- ${\displaystyle m=2}$: quadratic mean
- ${\displaystyle m=1}$: arithmetic mean
- ${\displaystyle m\rightarrow 0}$: geometric mean
- ${\displaystyle m=-1}$: harmonic mean
- ${\displaystyle m\rightarrow -\infty }$: minimum of ${\displaystyle x_{i}}$
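A short Python sketch of the power mean (the function name is illustrative; the limiting cases m → 0 and m → ±∞ are not handled here):

```python
def power_mean(values, m):
    # Hoelder (power) mean with exponent m; m=1 is arithmetic, m=-1 harmonic.
    return (sum(x ** m for x in values) / len(values)) ** (1 / m)

data = [4, 36, 45, 50, 75]
print(power_mean(data, 1))   # arithmetic mean: 42.0
print(power_mean(data, -1))  # harmonic mean: ~15
print(power_mean(data, 2))   # quadratic mean, larger than the arithmetic mean
```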

#### ƒ-mean

This can be generalized further as the generalized ƒ-mean

${\displaystyle {\bar {x}}=f^{-1}\left({{\frac {1}{n}}\sum _{i=1}^{n}{f\left(x_{i}\right)}}\right)}$

and again a suitable choice of an invertible ƒ will give

- ${\displaystyle f(x)=x}$: arithmetic mean
- ${\displaystyle f(x)={\frac {1}{x}}}$: harmonic mean
- ${\displaystyle f(x)=x^{m}}$: power mean
- ${\displaystyle f(x)=\ln(x)}$: geometric mean
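The ƒ-mean translates directly into code: apply ƒ, average, then invert. A sketch (function names are illustrative) recovering two of the cases above:

```python
import math

def f_mean(values, f, f_inv):
    # Generalized f-mean: transform each value, average, then invert.
    return f_inv(sum(f(x) for x in values) / len(values))

data = [4, 36, 45, 50, 75]
print(f_mean(data, math.log, math.exp))        # geometric mean, ~30
print(f_mean(data, lambda x: x, lambda y: y))  # arithmetic mean, 42.0
```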

### Weighted arithmetic mean

The weighted arithmetic mean (or weighted average) is used if one wants to combine average values from samples of the same population with different sample sizes:

${\displaystyle {\bar {x}}={\frac {\sum _{i=1}^{n}{w_{i}x_{i}}}{\sum _{i=1}^{n}w_{i}}}.}$

The weights ${\displaystyle w_{i}}$ represent the sizes of the different samples. In other applications, they represent a measure for the reliability of the influence upon the mean by the respective values.
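A minimal sketch of the weighted mean (names are illustrative); here the weights play the role of sample sizes:

```python
def weighted_mean(values, weights):
    # Weighted arithmetic mean; weights might be sample sizes.
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

# Combining the means of two samples of sizes 3 and 7:
print(weighted_mean([10.0, 20.0], [3, 7]))  # 17.0
```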

### Truncated mean

Sometimes a set of numbers might contain outliers, i.e., data values which are much lower or much higher than the others. Often, outliers are erroneous data caused by artifacts. In this case, one can use a truncated mean. It involves discarding given parts of the data at the top or the bottom end, typically an equal amount at each end and then taking the arithmetic mean of the remaining data. The number of values removed is indicated as a percentage of the total number of values.
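The procedure above can be sketched in Python (the helper name and trimming convention are illustrative; `proportion` is the fraction removed from each end):

```python
def truncated_mean(values, proportion):
    # Discard `proportion` of the values from each end, then average the rest.
    xs = sorted(values)
    k = int(len(xs) * proportion)
    trimmed = xs[k:len(xs) - k] if k else xs
    return sum(trimmed) / len(trimmed)

# A 10% trimmed mean discards the outlier 1000 (and the minimum, 1):
print(truncated_mean([1, 2, 3, 4, 5, 6, 7, 8, 9, 1000], 0.10))  # 5.5
```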

### Interquartile mean

The interquartile mean is a specific example of a truncated mean. It is simply the arithmetic mean after removing the lowest and the highest quarter of values.

${\displaystyle {\bar {x}}={\frac {2}{n}}\;\sum _{i={\frac {n}{4}}+1}^{{\frac {3}{4}}n}\!\!x_{i}}$

assuming the values have been ordered; it is thus a specific example of a weighted mean with a particular set of weights.
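A sketch of the interquartile mean, assuming for simplicity that the number of values is divisible by 4 (the helper name is illustrative):

```python
def interquartile_mean(values):
    # Arithmetic mean of the middle half of the sorted values
    # (assumes n is divisible by 4 for simplicity).
    xs = sorted(values)
    q = len(xs) // 4
    middle = xs[q:len(xs) - q]
    return sum(middle) / len(middle)

print(interquartile_mean([1, 3, 4, 5, 6, 6, 7, 7, 8, 8, 9, 38]))  # 6.5
```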

### Mean of a function

In some circumstances mathematicians may calculate a mean of an infinite (even an uncountable) set of values. This can happen when calculating the mean value ${\displaystyle y_{\text{ave}}}$ of a function ${\displaystyle f(x)}$. Intuitively this can be thought of as calculating the area under a section of a curve and then dividing by the length of that section. This can be done crudely by counting squares on graph paper or more precisely by integration. The integration formula is written as:

${\displaystyle y_{\text{ave}}(a,b)={\frac {1}{b-a}}\int \limits _{a}^{b}\!f(x)\,dx}$

Care must be taken to make sure that the integral converges. But the mean may be finite even if the function itself tends to infinity at some points.
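When the integral has no closed form, the mean value can be approximated numerically; a sketch using the midpoint rule (function name and step count are illustrative):

```python
def mean_value(f, a, b, n=100_000):
    # Approximate (1/(b-a)) * integral of f from a to b with the midpoint rule.
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) / n

# The mean of f(x) = x^2 over [0, 1] is exactly 1/3.
print(mean_value(lambda x: x * x, 0.0, 1.0))  # ~0.3333
```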

### Mean of angles and cyclical quantities

Angles, times of day, and other cyclical quantities require modular arithmetic to add and otherwise combine numbers. In all these situations, there is not necessarily a unique mean. For example, the times an hour before and after midnight are equidistant to both midnight and noon. It is also possible that no mean exists: consider a color wheel, where there is no mean to the set of all colors. In these situations, one must decide which mean is most useful, either by adjusting the values before averaging or by using a specialized approach for the mean of circular quantities.
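One standard specialized approach is the circular mean: represent each angle as a unit vector, average the vectors, and take the angle of the resultant. A sketch in degrees (the function name is illustrative):

```python
import math

def circular_mean_deg(angles):
    # Average the corresponding unit vectors, then take the resultant's angle.
    s = sum(math.sin(math.radians(a)) for a in angles)
    c = sum(math.cos(math.radians(a)) for a in angles)
    return math.degrees(math.atan2(s, c)) % 360

print(circular_mean_deg([0, 90]))    # ~45
print(circular_mean_deg([350, 10]))  # ~0 (or ~360), not the naive 180
```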

### Fréchet mean

The Fréchet mean gives a manner for determining the "center" of a mass distribution on a surface or, more generally, Riemannian manifold. Unlike many other means, the Fréchet mean is defined on a space whose elements cannot necessarily be added together or multiplied by scalars. It is sometimes also known as the Karcher mean (named after Hermann Karcher).

## Distribution of the sample mean

The arithmetic mean of a population, or population mean, is often denoted μ. The sample mean ${\displaystyle {\bar {x}}}$ (the arithmetic mean of a sample of values drawn from the population) makes a good estimator of the population mean, as its expected value is equal to the population mean (that is, it is an unbiased estimator). The sample mean is a random variable, not a constant, since its calculated value will randomly differ depending on which members of the population are sampled, and consequently it will have its own distribution. For a random sample of n independent observations, the expected value of the sample mean is

${\displaystyle \operatorname {E} [{\bar {x}}]=\mu }$

and the variance of the sample mean is

${\displaystyle \operatorname {var} ({\bar {x}})={\frac {\sigma ^{2}}{n}}.}$

If the population is normally distributed, then the sample mean is normally distributed:

${\displaystyle {\bar {x}}\thicksim N\left(\mu ,{\frac {\sigma ^{2}}{n}}\right).}$

If the population is not normally distributed, the sample mean is nonetheless approximately normally distributed if n is large and σ²/n is finite. This follows from the central limit theorem.
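These two facts can be illustrated by simulation; the sketch below (parameter values are arbitrary) draws many samples from a normal population and checks that the sample means scatter around μ with spread close to σ/√n:

```python
import random
import statistics

# Simulation sketch: sample means scatter around mu with spread sigma/sqrt(n).
random.seed(1)
mu, sigma, n = 50.0, 10.0, 100

means = [
    statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
    for _ in range(2_000)
]
print(statistics.fmean(means))  # close to mu = 50
print(statistics.stdev(means))  # close to sigma / sqrt(n) = 1.0
```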


## References

1. ^ Underhill, L.G.; Bradfield, D. (1998). Introstat. Juta and Company Ltd. ISBN 0-7021-3838-X. p. 181.
2. ^ Feller, William (1950). Introduction to Probability Theory and its Applications, Vol I. Wiley. p. 221. ISBN 0471257087.
3. ^ Elementary Statistics by Robert R. Johnson and Patricia J. Kuby, p. 279
4. ^ Schaum's Outline of Theory and Problems of Probability by Seymour Lipschutz and Marc Lipson, p. 141
5. ^ "AP Statistics Review - Density Curves and the Normal Distributions". Retrieved 16 March 2015.
## Amplitude

The amplitude of a periodic variable is a measure of its change over a single period (such as time or spatial period). There are various definitions of amplitude (see below), which are all functions of the magnitude of the difference between the variable's extreme values. In older texts the phase is sometimes called the amplitude.

## Arithmetic mean

In mathematics and statistics, the arithmetic mean (stress on the third syllable of "arithmetic"), or simply the mean or average when the context is clear, is the sum of a collection of numbers divided by the count of numbers in the collection. The collection is often a set of results of an experiment or an observational study, or frequently a set of results from a survey. The term "arithmetic mean" is preferred in some contexts in mathematics and statistics because it helps distinguish it from other means, such as the geometric mean and the harmonic mean.

In addition to mathematics and statistics, the arithmetic mean is used frequently in many diverse fields such as economics, anthropology, and history, and it is used in almost every academic field to some extent. For example, per capita income is the arithmetic average income of a nation's population.

While the arithmetic mean is often used to report central tendencies, it is not a robust statistic, meaning that it is greatly influenced by outliers (values that are very much larger or smaller than most of the values). Notably, for skewed distributions, such as the distribution of income for which a few people's incomes are substantially greater than most people's, the arithmetic mean may not coincide with one's notion of "middle", and robust statistics, such as the median, may be a better description of central tendency.

## Earth

Earth is the third planet from the Sun, and the only astronomical object known to harbor life. According to radiometric dating and other sources of evidence, Earth formed over 4.5 billion years ago. Earth's gravity interacts with other objects in space, especially the Sun and the Moon, Earth's only natural satellite. Earth orbits around the Sun in 365.26 days, a period known as an Earth year. During this time, Earth rotates about its axis about 366.26 times. Earth's axis of rotation is tilted with respect to its orbital plane, producing seasons on Earth. The gravitational interaction between Earth and the Moon causes tides, stabilizes Earth's orientation on its axis, and gradually slows its rotation. Earth is the densest planet in the Solar System and the largest and most massive of the four terrestrial planets.

Earth's lithosphere is divided into several rigid tectonic plates that migrate across the surface over many millions of years. About 71% of Earth's surface is covered with water, mostly by oceans. The remaining 29% is land consisting of continents and islands that together contain many lakes, rivers and other sources of water that contribute to the hydrosphere. The majority of Earth's polar regions are covered in ice, including the Antarctic ice sheet and the sea ice of the Arctic ice pack. Earth's interior remains active with a solid iron inner core, a liquid outer core that generates the Earth's magnetic field, and a convecting mantle that drives plate tectonics.

Within the first billion years of Earth's history, life appeared in the oceans and began to affect the Earth's atmosphere and surface, leading to the proliferation of aerobic and anaerobic organisms. Some geological evidence indicates that life may have arisen as early as 4.1 billion years ago. Since then, the combination of Earth's distance from the Sun, physical properties, and geological history have allowed life to evolve and thrive. In the history of the Earth, biodiversity has gone through long periods of expansion, occasionally punctuated by mass extinction events. Over 99% of all species that ever lived on Earth are extinct. Estimates of the number of species on Earth today vary widely; most species have not been described. Over 7.6 billion humans live on Earth and depend on its biosphere and natural resources for their survival. Humans have developed diverse societies and cultures; politically, the world has about 200 sovereign states.

## Geometric mean

In mathematics, the geometric mean is a mean or average, which indicates the central tendency or typical value of a set of numbers by using the product of their values (as opposed to the arithmetic mean which uses their sum). The geometric mean is defined as the nth root of the product of n numbers, i.e., for a set of numbers x1, x2, ..., xn, the geometric mean is defined as

${\displaystyle \left(\prod _{i=1}^{n}x_{i}\right)^{\frac {1}{n}}={\sqrt[{n}]{x_{1}x_{2}\cdots x_{n}}}}$

For instance, the geometric mean of two numbers, say 2 and 8, is just the square root of their product, that is, ${\displaystyle {\sqrt {2\cdot 8}}=4}$. As another example, the geometric mean of the three numbers 4, 1, and 1/32 is the cube root of their product (1/8), which is 1/2, that is, ${\displaystyle {\sqrt[{3}]{4\cdot 1\cdot 1/32}}=1/2}$.

A geometric mean is often used when comparing different items—finding a single "figure of merit" for these items—when each item has multiple properties that have different numeric ranges. For example, the geometric mean can give a meaningful value to compare two companies which are each rated at 0 to 5 for their environmental sustainability, and are rated at 0 to 100 for their financial viability. If an arithmetic mean were used instead of a geometric mean, the financial viability would have greater weight because its numeric range is larger. That is, a small percentage change in the financial rating (e.g. going from 80 to 90) makes a much larger difference in the arithmetic mean than a large percentage change in environmental sustainability (e.g. going from 2 to 5). The use of a geometric mean normalizes the differently-ranged values, meaning a given percentage change in any of the properties has the same effect on the geometric mean. So, a 20% change in environmental sustainability from 4 to 4.8 has the same effect on the geometric mean as a 20% change in financial viability from 60 to 72.

The geometric mean can be understood in terms of geometry. The geometric mean of two numbers, ${\displaystyle a}$ and ${\displaystyle b}$, is the length of one side of a square whose area is equal to the area of a rectangle with sides of lengths ${\displaystyle a}$ and ${\displaystyle b}$. Similarly, the geometric mean of three numbers, ${\displaystyle a}$, ${\displaystyle b}$, and ${\displaystyle c}$, is the length of one edge of a cube whose volume is the same as that of a cuboid with sides whose lengths are equal to the three given numbers.

The geometric mean applies only to positive numbers. It is also often used for a set of numbers whose values are meant to be multiplied together or are exponential in nature, such as data on the growth of the human population or interest rates of a financial investment.

The geometric mean is also one of the three classical Pythagorean means, together with the aforementioned arithmetic mean and the harmonic mean. For all positive data sets containing at least one pair of unequal values, the harmonic mean is always the least of the three means, while the arithmetic mean is always the greatest of the three and the geometric mean is always in between (see Inequality of arithmetic and geometric means.)

## Golden ratio

In mathematics, two quantities are in the golden ratio if their ratio is the same as the ratio of their sum to the larger of the two quantities. The figure on the right illustrates the geometric relationship. Expressed algebraically, for quantities a and b with a > b > 0,

${\displaystyle {\frac {a+b}{a}}={\frac {a}{b}}\ {\stackrel {\text{def}}{=}}\ \varphi ,}$

where the Greek letter phi (${\displaystyle \varphi }$ or ${\displaystyle \phi }$) represents the golden ratio. It is an irrational number that is a solution to the quadratic equation ${\displaystyle x^{2}-x-1=0}$, with a value of:

${\displaystyle \varphi ={\frac {1+{\sqrt {5}}}{2}}=1.6180339887\ldots .}$
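The defining quadratic can be verified numerically; a short sketch computing φ and checking that it satisfies φ² = φ + 1:

```python
import math

# The golden ratio as the positive root of x^2 - x - 1 = 0.
phi = (1 + math.sqrt(5)) / 2
print(phi)                        # ~1.6180339887
print(abs(phi ** 2 - (phi + 1)))  # ~0, since phi^2 = phi + 1
```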

The golden ratio is also called the golden mean or golden section (Latin: sectio aurea). Other names include extreme and mean ratio, medial section, divine proportion, divine section (Latin: sectio divina), golden proportion, golden cut, and golden number.

Mathematicians since Euclid have studied the properties of the golden ratio, including its appearance in the dimensions of a regular pentagon and in a golden rectangle, which may be cut into a square and a smaller rectangle with the same aspect ratio. The golden ratio has also been used to analyze the proportions of natural objects as well as man-made systems such as financial markets, in some cases based on dubious fits to data. The golden ratio appears in some patterns in nature, including the spiral arrangement of leaves and other plant parts.

Some twentieth-century artists and architects, including Le Corbusier and Salvador Dalí, have proportioned their works to approximate the golden ratio—especially in the form of the golden rectangle, in which the ratio of the longer side to the shorter is the golden ratio—believing this proportion to be aesthetically pleasing.

## Greenwich Mean Time

Greenwich Mean Time (GMT) is the mean solar time at the Royal Observatory in Greenwich, London, reckoned from midnight. At different times in the past, it has been calculated in different ways, including being calculated from noon; as a consequence, it cannot be used to specify a precise time unless a context is given.

English speakers often use GMT as a synonym for Coordinated Universal Time (UTC). For navigation, it is considered equivalent to UT1 (the modern form of mean solar time at 0° longitude), but this meaning can differ from UTC by up to 0.9 s; the term GMT should thus not be used for technical purposes.

Because of Earth's uneven speed in its elliptical orbit and its axial tilt, noon (12:00:00) GMT is rarely the exact moment the sun crosses the Greenwich meridian and reaches its highest point in the sky there. This event may occur up to 16 minutes before or after noon GMT, a discrepancy calculated by the equation of time. Noon GMT is the annual average (i.e. "mean") moment of this event, which accounts for the word "mean" in "Greenwich Mean Time".

Originally, astronomers considered a GMT day to start at noon, while for almost everyone else it started at midnight. To avoid confusion, the name Universal Time was introduced to denote GMT as counted from midnight. Astronomers preferred the old convention to simplify their observational data, so that each night was logged under a single calendar date. Today Universal Time usually refers to UTC or UT1.

The term "GMT" is especially used by bodies connected with the United Kingdom, such as the BBC World Service, the Royal Navy, the Met Office and others, particularly in Arab countries, such as the Middle East Broadcasting Centre and OSN. It is a term commonly used in the United Kingdom and countries of the Commonwealth, including Australia, New Zealand, South Africa, India, Pakistan, Bangladesh and Malaysia; and in many other countries of the Eastern Hemisphere.

## Mean Girls

Mean Girls is an American teen comedy film directed by Mark Waters, written by Tina Fey, and released in April 2004. The film, which stars Lindsay Lohan, Rachel McAdams, Tim Meadows, Ana Gasteyer, Amy Poehler and Fey, is partially based on Rosalind Wiseman's 2002 non-fiction self-help book, Queen Bees and Wannabes, which describes female high school social cliques and the damaging effects they can have on girls. Fey also drew from her own experience at Upper Darby High School as an inspiration for some of the concepts in the film. The movie introduced Amanda Seyfried in her film debut.

Saturday Night Live creator Lorne Michaels produced the film; Tina Fey, screenwriter and co-star of the film, was a long-term cast member and writer for SNL. Although set in Evanston, Illinois (a Chicago suburb), the film was mostly shot in Toronto, Canada. The film marks Lohan's second collaboration with director Waters, the first being Freaky Friday, released a year earlier.

Released on April 30, 2004, the film grossed \$129 million worldwide and developed a cult following. A direct-to-video sequel, Mean Girls 2, premiered on ABC Family (now Freeform) on January 23, 2011. The musical adaptation of Mean Girls premiered on Broadway in March 2018.

## Metres above sea level

Metres above mean sea level (MAMSL) or simply metres above sea level (MASL or m a.s.l.) is a standard metric measurement in metres of vertical distance (height, elevation or altitude) of a location in reference to a historic mean sea level taken as a vertical datum. Mean sea levels are affected by climate change and other factors and change over time. For this and other reasons, recorded measurements of elevation above sea level might differ from the actual elevation of a given location over sea level at a given moment.

## Mode (statistics)

The mode of a set of data values is the value that appears most often. If X is a discrete random variable, the mode is the value x (i.e., X = x) at which the probability mass function takes its maximum value. In other words, it is the value that is most likely to be sampled.

Like the statistical mean and median, the mode is a way of expressing, in a (usually) single number, important information about a random variable or a population. The numerical value of the mode is the same as that of the mean and median in a normal distribution, and it may be very different in highly skewed distributions.

The mode is not necessarily unique to a given discrete distribution, since the probability mass function may take the same maximum value at several points x1, x2, etc. The most extreme case occurs in uniform distributions, where all values occur equally frequently.

When the probability density function of a continuous distribution has multiple local maxima it is common to refer to all of the local maxima as modes of the distribution. Such a continuous distribution is called multimodal (as opposed to unimodal). A mode of a continuous probability distribution is often considered to be any value x at which its probability density function has a locally maximum value, so any peak is a mode.

In symmetric unimodal distributions, such as the normal distribution, the mean (if defined), median and mode all coincide. For samples, if it is known that they are drawn from a symmetric unimodal distribution, the sample mean can be used as an estimate of the population mode.
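For a finite sample, all modes can be found by counting frequencies; a sketch (the helper name is illustrative) that returns every value attaining the maximum count:

```python
from collections import Counter

def modes(values):
    # All values attaining the maximum frequency; the mode need not be unique.
    counts = Counter(values)
    top = max(counts.values())
    return sorted(v for v, c in counts.items() if c == top)

print(modes([1, 2, 2, 3, 3, 4]))  # [2, 3], a bimodal sample
```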

## Normal distribution

In probability theory, the normal (or Gaussian or Gauss or Laplace–Gauss) distribution is a very common continuous probability distribution. Normal distributions are important in statistics and are often used in the natural and social sciences to represent real-valued random variables whose distributions are not known. A random variable with a Gaussian distribution is said to be normally distributed and is called a normal deviate.

The normal distribution is useful because of the central limit theorem. In its most general form, under some conditions (which include finite variance), it states that averages of samples of observations of random variables independently drawn from independent distributions converge in distribution to the normal, that is, they become normally distributed when the number of observations is sufficiently large. Physical quantities that are expected to be the sum of many independent processes (such as measurement errors) often have distributions that are nearly normal. Moreover, many results and methods (such as propagation of uncertainty and least squares parameter fitting) can be derived analytically in explicit form when the relevant variables are normally distributed.

The normal distribution is sometimes informally called the bell curve. However, many other distributions are bell-shaped (such as the Cauchy, Student's t-, and logistic distributions).

The probability density of the normal distribution is

${\displaystyle f(x\mid \mu ,\sigma ^{2})={\frac {1}{\sqrt {2\pi \sigma ^{2}}}}e^{-{\frac {(x-\mu )^{2}}{2\sigma ^{2}}}}}$

where ${\displaystyle \mu }$ is the mean or expectation of the distribution and ${\displaystyle \sigma }$ is its standard deviation.
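The density can be evaluated directly from the formula; a sketch (function name is illustrative) checking the standard-normal peak value 1/√(2π) ≈ 0.3989 at the mean:

```python
import math

def normal_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2) evaluated at x.
    coeff = 1.0 / math.sqrt(2.0 * math.pi * sigma ** 2)
    return coeff * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

print(normal_pdf(0.0, 0.0, 1.0))  # ~0.3989, the standard-normal peak
```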

## Root mean square

In mathematics and its applications, the root mean square (RMS or rms) is defined as the square root of the mean square (the arithmetic mean of the squares of a set of numbers).

The RMS is also known as the quadratic mean and is a particular case of the generalized mean with exponent 2. RMS can also be defined for a continuously varying function in terms of an integral of the squares of the instantaneous values during a cycle.

For alternating electric current, RMS is equal to the value of the direct current that would produce the same average power dissipation in a resistive load.

In estimation theory, the root mean square error of an estimator is a measure of the imperfection of the fit of the estimator to the data.
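The definition translates into a one-line computation; a sketch (helper name is illustrative):

```python
import math

def rms(values):
    # Root mean square: square root of the mean of the squares.
    return math.sqrt(sum(x * x for x in values) / len(values))

print(rms([1, 2, 3, 4, 5]))  # sqrt(11), ~3.3166
```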

## Sea level

Mean sea level (MSL) (often shortened to sea level) is an average level of the surface of one or more of Earth's oceans from which heights such as elevation may be measured. MSL is a type of vertical datum – a standardised geodetic datum – that is used, for example, as a chart datum in cartography and marine navigation, or, in aviation, as the standard sea level at which atmospheric pressure is measured to calibrate altitude and, consequently, aircraft flight levels. A common and relatively straightforward mean sea-level standard is the midpoint between a mean low and mean high tide at a particular location.

Sea levels can be affected by many factors and are known to have varied greatly over geological time scales. However 20th century and current millennium sea level rise is caused by global warming, and careful measurement of variations in MSL can offer insights into ongoing climate change.

The term above sea level generally refers to above mean sea level (AMSL).

## Standard deviation

In statistics, the standard deviation (SD, also represented by the lower case Greek letter sigma σ or the Latin letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data values. A low standard deviation indicates that the data points tend to be close to the mean (also called the expected value) of the set, while a high standard deviation indicates that the data points are spread out over a wider range of values.

The standard deviation of a random variable, statistical population, data set, or probability distribution is the square root of its variance. It is algebraically simpler, though in practice less robust, than the average absolute deviation.

A useful property of the standard deviation is that, unlike the variance, it is expressed in the same units as the data.
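The definition (square root of the variance) can be sketched directly; the `sample` flag applies Bessel's n−1 correction mentioned later in connection with unbiased estimation (helper name and flag are illustrative):

```python
import math

def std_dev(values, sample=False):
    # Square root of the variance; sample=True uses the n-1 (Bessel) divisor.
    n = len(values)
    mu = sum(values) / n
    ss = sum((x - mu) ** 2 for x in values)
    return math.sqrt(ss / (n - 1 if sample else n))

data = [2, 4, 4, 4, 5, 5, 7, 9]
print(std_dev(data))  # 2.0 (population standard deviation)
```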

In addition to expressing the variability of a population, the standard deviation is commonly used to measure confidence in statistical conclusions. For example, the margin of error in polling data is determined by calculating the expected standard deviation in the results if the same poll were to be conducted multiple times. This derivation of a standard deviation is often called the "standard error" of the estimate or "standard error of the mean" when referring to a mean. It is computed as the standard deviation of all the means that would be computed from that population if an infinite number of samples were drawn and a mean for each sample were computed.

The standard deviation of a population and the standard error of a statistic derived from that population (such as the mean) are quite different but related: they differ by a factor of the inverse of the square root of the number of observations. The reported margin of error of a poll is computed from the standard error of the mean (or, equivalently, from the product of the standard deviation of the population and the inverse of the square root of the sample size) and is typically about twice the standard deviation, the half-width of a 95 percent confidence interval.

In science, many researchers report the standard deviation of experimental data, and only effects that fall much farther than two standard deviations away from what would have been expected are considered statistically significant—normal random error or variation in the measurements is in this way distinguished from likely genuine effects or associations. The standard deviation is also important in finance, where the standard deviation on the rate of return on an investment is a measure of the volatility of the investment.

When only a sample of data from a population is available, the term standard deviation of the sample or sample standard deviation can refer to either the above-mentioned quantity as applied to those data or to a modified quantity that is an unbiased estimate of the population standard deviation (the standard deviation of the entire population).
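The two sample quantities mentioned above differ only in their divisor: dividing the sum of squared deviations by n gives the uncorrected estimate, while dividing by n − 1 (Bessel's correction) gives the estimate whose square is unbiased for the population variance. Python's standard library exposes both; the data values here are a hypothetical sample.

```python
import statistics

sample = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # hypothetical sample

pop_sd = statistics.pstdev(sample)   # divides by n: treats the data as the whole population
samp_sd = statistics.stdev(sample)   # divides by n - 1: Bessel-corrected sample estimate
```

The corrected estimate is always slightly larger, with the difference shrinking as the sample grows.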

Standard error

The standard error (SE) of a statistic (usually an estimate of a parameter) is the standard deviation of its sampling distribution or an estimate of that standard deviation. If the parameter or the statistic is the mean, it is called the standard error of the mean (SEM).

The sampling distribution of a population mean is generated by repeated sampling and recording of the means obtained. This forms a distribution of different means, and this distribution has its own mean and variance. Mathematically, the variance of the sampling distribution obtained is equal to the variance of the population divided by the sample size. This is because as the sample size increases, sample means cluster more closely around the population mean.

Therefore, the relationship between the standard error and the standard deviation is such that, for a given sample size, the standard error equals the standard deviation divided by the square root of the sample size. In other words, the standard error of the mean is a measure of the dispersion of sample means around the population mean.
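The relationship just described can be sketched directly; the sample values and the 1.96 multiplier for an approximate 95 percent interval are illustrative assumptions.

```python
import math
import statistics

sample = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # hypothetical sample
n = len(sample)

sd = statistics.stdev(sample)        # sample standard deviation (n - 1 divisor)
sem = sd / math.sqrt(n)              # standard error of the mean

# Approximate half-width of a 95% confidence interval for the mean:
margin_95 = 1.96 * sem
```

Because the standard error shrinks like 1/√n, quadrupling the sample size halves the standard error of the mean.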

In regression analysis, the term "standard error" refers either to the square root of the reduced chi-squared statistic or the standard error for a particular regression coefficient (as used in, e.g., confidence intervals).

Standard score

In statistics, the standard score is the signed fractional number of standard deviations by which the value of an observation or data point is above the mean value of what is being observed or measured. Observed values above the mean have positive standard scores, while values below the mean have negative standard scores.

It is calculated by subtracting the population mean from an individual raw score and then dividing the difference by the population standard deviation. It is a dimensionless quantity. This conversion process is called standardizing or normalizing (however, "normalizing" can refer to many types of ratios; see normalization for more).
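The calculation above is a one-liner; the numeric values below (a population mean of 70 and standard deviation of 10) are hypothetical.

```python
def z_score(x, mu, sigma):
    """Standard score: signed number of standard deviations x lies from the mean mu."""
    return (x - mu) / sigma

# With an assumed population mean of 70 and standard deviation of 10:
z_score(85, 70, 10)   # -> 1.5  (above the mean: positive)
z_score(55, 70, 10)   # -> -1.5 (below the mean: negative)
```

Because both numerator and denominator carry the units of the data, the result is dimensionless, which is what makes standard scores comparable across differently scaled measurements.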

Standard scores are also called z-values, z-scores, normal scores, and standardized variables. They are most frequently used to compare an observation to a theoretical deviate, such as a standard normal deviate.

Computing a z-score requires knowing the mean and standard deviation of the complete population to which a data point belongs; if one only has a sample of observations from the population, then the analogous computation with sample mean and sample standard deviation yields the t-statistic.

Variance

In probability theory and statistics, variance is the expectation of the squared deviation of a random variable from its mean. Informally, it measures how far a set of (random) numbers is spread out from its average value. Variance has a central role in statistics, where some ideas that use it include descriptive statistics, statistical inference, hypothesis testing, goodness of fit, and Monte Carlo sampling. Variance is an important tool in the sciences, where statistical analysis of data is common. The variance is the square of the standard deviation, the second central moment of a distribution, and the covariance of the random variable with itself, and it is often represented by ${\displaystyle \sigma ^{2}}$, ${\displaystyle s^{2}}$, or ${\displaystyle \operatorname {Var} (X)}$.
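For a discrete distribution, the definition ${\displaystyle \operatorname {Var}(X)=\sum (x-\mu )^{2}p(x)}$ can be computed directly; a fair six-sided die serves as a standard worked example.

```python
# Fair six-sided die: values 1..6, each with probability 1/6.
values = range(1, 7)
p = 1 / 6

mu = sum(x * p for x in values)               # mean: 3.5
var = sum((x - mu) ** 2 * p for x in values)  # variance: 35/12 ≈ 2.9167
sd = var ** 0.5                               # standard deviation ≈ 1.7078
```

The same sums with ${\displaystyle p(x)}$ replaced by a probability density and the sums by integrals give the continuous case.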

Weighted arithmetic mean

The weighted arithmetic mean is similar to an ordinary arithmetic mean (the most common type of average), except that instead of each of the data points contributing equally to the final average, some data points contribute more than others. The notion of weighted mean plays a role in descriptive statistics and also occurs in a more general form in several other areas of mathematics.

If all the weights are equal, then the weighted mean is the same as the arithmetic mean. While weighted means generally behave in a similar fashion to arithmetic means, they do have a few counterintuitive properties, as captured for instance in Simpson's paradox.
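A minimal sketch of the weighted mean, using hypothetical grade values, shows that equal weights recover the ordinary arithmetic mean:

```python
def weighted_mean(values, weights):
    """Sum of value * weight, normalized by the total weight."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

grades = [80.0, 90.0]                  # hypothetical data points

weighted_mean(grades, [1, 1])          # equal weights: ordinary mean, 85.0
weighted_mean(grades, [1, 3])          # second value weighted 3x: 87.5
```

With unequal weights the result is pulled toward the more heavily weighted values, which is exactly the mechanism behind Simpson's-paradox-style reversals when groups of different sizes are combined.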

This page is based on a Wikipedia article written by its contributors.
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.