The Average

The average or the mean of a finite set of numbers is, well, the average. For example, the average of the (multiset of) numbers \{2,3,4,4,5,7,11,12\} is given by:

\text{average}=\frac{2+3+4+4+5+7+11+12}{8}=\frac{48}{8}=6.

When we have some real-valued variable (a variable with real number values), for example the heights of the students in a class, that we know all about (i.e. we have the data or statistics of the variable), we can define its average or mean.

Definition

Let x be a real-valued variable with data \{x_1,x_2,\dots,x_n\}. The average or mean of x, denoted by \bar{x}, is defined by:

\bar{x}=\frac{x_1+x_2+\cdots+x_n}{n}=\frac{\sum_{i=1}^nx_i}{n}.
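To make this concrete, here is a short Python sketch (the function name mean is my own) that computes \bar{x} for the data from the first example:

def mean(data):
    # \bar{x}: sum the observations and divide by how many there are.
    return sum(data) / len(data)

data = [2, 3, 4, 4, 5, 7, 11, 12]
print(mean(data))  # 6.0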

The Expected Value

The aim of this note is to explain what the expected value is, so I will give just one example of it and then from this write a definition. The expected value refers to the expected value of a real-valued random variable X (I'm not sure whether LC Project Maths uses this term, but it should). A random variable X is a variable whose outputs are random. For the purposes of this piece we will assume that a random variable X takes values in a finite set \{x_1,x_2,\dots,x_n\}. We say that X takes the value x_i with probability p_i (I'm assuming we know the basics of probability).

Example

Let X denote the outcome of a roll of a die. The possible values of X are \{1,2,3,4,5,6\}=\{x_1,x_2,x_3,x_4,x_5,x_6\}. The probability of rolling a two, say, is 1/6=p_2; and indeed p_i=1/6 for all x_i. We calculate the expectation of X, E(X), as

E(X)=p_1x_1+p_2x_2+p_3x_3+p_4x_4+p_5x_5+p_6x_6

=\frac{1}{6}(1+2+3+4+5+6)=3.5.

In this case the expected value of X is the same as the average of the numbers on the die. Is this a coincidence?

Definition

Suppose that X is a random variable taking values in \{x_1,x_2,\dots,x_n\} with probabilities \{p_1,\dots,p_n\}. Then the expected value of X is defined by:

E(X)=p_1x_1+p_2x_2+\cdots+p_nx_n=\sum_{i=1}^np_ix_i.
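As a quick check of this definition in Python (a minimal sketch; the function name expected_value is mine), applied to the die example above:

def expected_value(values, probabilities):
    # E(X) = p_1*x_1 + p_2*x_2 + ... + p_n*x_n
    return sum(p * x for p, x in zip(probabilities, values))

values = [1, 2, 3, 4, 5, 6]
probabilities = [1/6] * 6
print(expected_value(values, probabilities))  # 3.5 (up to floating-point rounding)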

Is there a link? Empirical probability vs a priori probability

Empirical Probability

We know the statistics of a random variable: we know all the data. Suppose the data is given by S=\{x_1,x_2,\dots,x_n\}. Then we can define the probability of the single event A that X=x_k as

P(A)=P(X=x_k)=\frac{\text{number of instances of }x_k \text{ in }S}{n}.

That is, the statistics inform the probabilities.

\text{STATISTICS }\rightarrow\text{ PROBABILITIES}.
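In Python this empirical probability is just counting (a sketch, reusing the data from the first example):

from collections import Counter

def empirical_probability(S, x_k):
    # P(X = x_k) = (number of instances of x_k in S) / n
    return Counter(S)[x_k] / len(S)

S = [2, 3, 4, 4, 5, 7, 11, 12]
print(empirical_probability(S, 4))  # 2/8 = 0.25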

A priori Probability

In an a priori (roughly "beforehand") view of probability we claim that we know the probabilities without knowing any data. Good examples are coin flipping, dice rolling, card shuffling and lottery games. Using probability we can predict what the statistics will be. For example, we know that when flipping a coin we will get a head about half of the time.

So in this picture we have that:

\text{PROBABILITY }\rightarrow\text{ STATISTICS}.
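A quick simulation of coin flipping shows this prediction at work (a sketch; the 10,000 flips is an arbitrary choice):

import random

flips = [random.choice(["H", "T"]) for _ in range(10000)]
print(flips.count("H") / len(flips))  # typically close to 0.5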

In the empirical view we have the average or mean. How does the a priori picture tell us about the average? Well, the expected value is the a priori picture's prediction of what the average will be! For expected value, read expected average (i.e. if we take a series of measurements of X, say x=\{x_1,x_2,\dots,x_m\}, then E(X)\approx\bar{x}).
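This claim can be tested with a simulation (again a sketch, with an arbitrary number of rolls): roll a die many times and the average of the outcomes should be close to E(X)=3.5.

import random

rolls = [random.randint(1, 6) for _ in range(10000)]
print(sum(rolls) / len(rolls))  # typically close to 3.5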

A Justification

Starting in the empirical picture, let x be a variable which is observed to take m distinct values \{x_1,x_2,\dots,x_m\} with frequencies \{f_1,f_2,\dots,f_m\} (i.e. the outcome x_i occurred f_i times). The total number of measurements is then f_1+f_2+\cdots+f_m=\sum_i f_i=:N. So, in the notation of the section on empirical probability,

S=\{\underbrace{x_1,x_1,\dots,x_1}_{f_1\text{ times}},\underbrace{x_2,\dots,x_2}_{f_2\text{ times}},\dots,\underbrace{x_m,\dots,x_m}_{f_m\text{ times}}\}.

Therefore, the average of x is:

\bar{x}=\frac{f_1x_1+f_2x_2+\cdots+f_mx_m}{f_1+f_2+\cdots+f_m}=\frac{\sum_i f_ix_i}{\sum_i f_i}.

Now assign empirical probabilities to x and view it as a random variable X (here N=\sum_i f_i):

p_k=P(X=x_k)=\frac{\text{number of instances of }x_k \text{ in }S}{N}=\frac{f_k}{\sum_i f_i}.

What about E(X)? Well, from the definition:

E(X)=p_1x_1+p_2x_2+\cdots+p_mx_m

=\frac{f_1}{\sum_if_i}x_1+\frac{f_2}{\sum_if_i}x_2+\cdots+\frac{f_m}{\sum_if_i}x_m

=\frac{f_1x_1+\cdots+f_mx_m}{\sum_if_i}=\bar{x} \bullet
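The identity \bar{x}=E(X) can also be verified numerically; in this sketch the values and frequencies are made up purely for illustration:

# Distinct values and their observed frequencies (illustrative numbers).
values = [1, 2, 5]
freqs = [3, 5, 2]
N = sum(freqs)

# Average computed directly from the frequencies.
x_bar = sum(f * x for f, x in zip(freqs, values)) / N

# Expected value using the empirical probabilities p_k = f_k / N.
probs = [f / N for f in freqs]
E_X = sum(p * x for p, x in zip(probs, values))

print(x_bar, E_X)  # both equal 2.3 (up to floating-point rounding)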

So, in conclusion, we can think of the expected value of X as the expected average of X: were we to take a number of measurements of X and average the outcomes, we would expect to get approximately E(X).
