## The Average

The *average* or the *mean* of a finite set of numbers is, well, the average. For example, the average of the numbers $2$, $4$ and $9$ is given by:

$$\frac{2+4+9}{3}=5.$$

When we have some real-valued variable (a variable with real number values), for example the heights of the students in a class, that we know all about — i.e. we have the data or *statistics* of the variable — we can define its average or mean.

### Definition

Let $X$ be a real-valued variable with data $\{x_1, x_2, \dots, x_N\}$. The *average* or *mean* of $X$, denoted by $\bar{X}$, is defined by:

$$\bar{X} = \frac{x_1 + x_2 + \cdots + x_N}{N} = \frac{1}{N}\sum_{i=1}^N x_i.$$
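As a quick illustration, the definition can be computed directly (a sketch — the heights below are invented numbers, not data from this note):

```python
# A sketch of the definition above: the mean of data {x_1, ..., x_N}
# is the sum of the values divided by N.
# The heights (in cm) are invented for illustration.
heights = [160, 172, 168, 181, 165, 174]

mean = sum(heights) / len(heights)
print(mean)  # 170.0
```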

## The Expected Value

The aim of this note is to explain what the expected value is, so I will just give one example of it and then from this write a definition. The expected value refers to the expected value of a real-valued *random variable* $X$ (I'm not sure whether LC Project Maths uses this term, but it should). A random variable is a variable whose outputs are random. For the purposes of this piece we will assume that a random variable takes values in a finite set $\{x_1, x_2, \dots, x_n\}$. We say that $X$ takes the value $x_i$ with probability $p_i$ (*I'm assuming we know the basics of probability*).

### Example

Let $X$ denote the outcome of a roll of a dice. The possible values of $X$ are $\{1, 2, 3, 4, 5, 6\}$. The *probability of* rolling a two, say, is $1/6$; and indeed $P(X = i) = 1/6$ for all $i = 1, \dots, 6$. We calculate the expectation of $X$, $\mathbb{E}(X)$, as

$$\mathbb{E}(X) = 1\cdot\frac{1}{6} + 2\cdot\frac{1}{6} + 3\cdot\frac{1}{6} + 4\cdot\frac{1}{6} + 5\cdot\frac{1}{6} + 6\cdot\frac{1}{6} = \frac{21}{6} = 3.5.$$

In this case the expected value of $X$ is the same as the average number on the dice. Is this a coincidence?
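As a sanity check, the calculation above can be reproduced exactly with Python's `fractions` module (exact arithmetic avoids any floating-point doubt):

```python
from fractions import Fraction

# E(X) = 1*(1/6) + 2*(1/6) + ... + 6*(1/6), as in the example.
p = Fraction(1, 6)
expectation = sum(i * p for i in range(1, 7))
print(expectation)  # 7/2, i.e. 3.5

# The average of the numbers on the faces:
average = sum(range(1, 7)) / 6
print(average)  # 3.5
```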

### Definition

Suppose that $X$ is a random variable taking values in $\{x_1, x_2, \dots, x_n\}$ with probabilities $\{p_1, p_2, \dots, p_n\}$. Then the *expected value* of $X$ is defined by:

$$\mathbb{E}(X) = p_1 x_1 + p_2 x_2 + \cdots + p_n x_n = \sum_{i=1}^n p_i x_i.$$
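The definition translates directly into a short function. This is a sketch; the function name and the biased-coin example are my own, not from the note:

```python
def expected_value(values, probs):
    # E(X) = p_1*x_1 + ... + p_n*x_n, as in the definition above.
    assert abs(sum(probs) - 1) < 1e-9, "probabilities must sum to 1"
    return sum(p * x for x, p in zip(values, probs))

# A biased coin paying out 1 for heads (probability 0.75) and 0 for tails:
print(expected_value([1, 0], [0.75, 0.25]))  # 0.75
```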

## Is there a link? Empirical probability vs *a priori* probabilities

### Empirical Probability

We know the statistics of a random variable $X$ — we know all the data. Suppose the data is given by $\{x_1, x_2, \dots, x_N\}$. Then we can define the probability of the single event that $X = x$ as

$$P(X = x) = \frac{\#\{i : x_i = x\}}{N},$$

the proportion of the data points equal to $x$.

That is, the statistics inform the probability:

$$\text{statistics} \Rightarrow \text{probabilities}.$$
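Here is one way to compute empirical probabilities from data (a sketch — the sample data is invented):

```python
from collections import Counter

# P(X = x) = (number of data points equal to x) / N.
data = [2, 3, 3, 5, 6, 3, 2, 1, 4, 3]  # made-up sample of dice rolls
N = len(data)
counts = Counter(data)

def empirical_prob(x):
    return counts[x] / N

print(empirical_prob(3))  # 4 out of the 10 rolls were a 3, so 0.4
```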

### *A priori* Probability

In an *a priori* (roughly "beforehand") view of probability we claim that we know the probabilities without knowing any data. Good examples are coin-flipping, dice-rolling, card-shuffling and lottery games. Using probability we can predict what the statistics will be. For example, we know that when coin-flipping we will get a head about half of the time.
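The coin-flipping claim can be tested by simulation (a sketch, with a fixed seed so the run is reproducible):

```python
import random

random.seed(1)  # fixed seed: the same flips on every run

# A priori we claim P(heads) = 1/2 before seeing any data;
# simulating many flips, the observed proportion should be about half.
flips = [random.choice("HT") for _ in range(10_000)]
proportion_heads = flips.count("H") / len(flips)
print(proportion_heads)  # about 0.5
```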

So in this picture we have that:

$$\text{probabilities} \Rightarrow \text{statistics}.$$

In the empirical view we have the average or mean. How does the *a priori* picture tell us about the average? Well, the expected value is the *a priori* picture's prediction of what the average will be! For expected value read *expected average* (i.e. if we take a series of measurements of $X$, say $x_1, x_2, \dots, x_N$, then $\bar{X} \approx \mathbb{E}(X)$).
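This prediction can be illustrated by simulating many dice rolls and comparing the average with $\mathbb{E}(X) = 3.5$ (a sketch; seed fixed for reproducibility):

```python
import random

random.seed(42)  # fixed seed for a reproducible run

# Roll a fair dice many times; the a priori prediction is that the
# average of the outcomes should be close to E(X) = 3.5.
rolls = [random.randint(1, 6) for _ in range(100_000)]
average = sum(rolls) / len(rolls)
print(average)  # close to 3.5
```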

### A Justification

Starting in the empirical picture, let $X$ be a variable which is observed to take (distinct) values $\{x_1, x_2, \dots, x_n\}$ with frequencies $\{f_1, f_2, \dots, f_n\}$ (i.e. the outcome $x_i$ occurred $f_i$ times). Now the total number of measurements is $N = f_1 + f_2 + \cdots + f_n$ (i.e. $x_1$ occurs $f_1$ times, $x_2$ occurs $f_2$ times, …, $x_n$ occurs $f_n$ times, so in total we have $N$ measurements). So in the notation of the section on empirical probability

$$P(X = x_i) = \frac{f_i}{N}.$$

Therefore, the average of $X$ is:

$$\bar{X} = \frac{f_1 x_1 + f_2 x_2 + \cdots + f_n x_n}{N} = \frac{1}{N}\sum_{i=1}^n f_i x_i.$$

Now assign empirical probabilities to $X$ and view it as a random variable (here $p_i = f_i/N$):

$$\bar{X} = \sum_{i=1}^n \frac{f_i}{N}\,x_i = \sum_{i=1}^n p_i x_i.$$

What about $\mathbb{E}(X)$? Well, from the definition:

$$\mathbb{E}(X) = \sum_{i=1}^n p_i x_i = \bar{X}.$$

So, in conclusion, we can think of the expected value of $X$ as the *expected average* of $X$, were we to take a number of measurements of $X$ and take the average of these outcomes.
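The identity derived above ($\bar{X} = \sum_i p_i x_i$ with $p_i = f_i/N$) can be checked numerically on any data set; a sketch with invented data:

```python
from collections import Counter

data = [1, 1, 2, 2, 2, 5, 5, 7]  # invented measurements
N = len(data)

# The mean computed directly...
mean = sum(data) / N

# ...and via empirical probabilities p_i = f_i / N:
freqs = Counter(data)  # f_i for each distinct value x_i
via_probs = sum((f / N) * x for x, f in freqs.items())

print(mean, via_probs)  # the two agree: 3.125 3.125
```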
