[Maths Class Notes] on Mean and Variance of Random Variable Pdf for Exam

Statistics is an important branch of Mathematics with many practical applications. In statistics, it is important to know the mean and variance of random variables. Let us try to understand what random variables are, how they arise, and how to do calculations with them. A random variable is a variable that assigns a numerical value to each outcome of a random experiment. Random variables are mainly divided into discrete and continuous random variables. A random variable is called a continuous random variable when it can take infinitely many values; here the value of X is not limited to integer values. Most of our real-life applications make use of continuous random variables. For example, experiments involving variables such as height, weight, and age deal with continuous random variables.

In this article, we cover the basics of random variables and aim to explain the ideas of mean and variance and their applications in detail.

Probability Distributions

A probability distribution helps us make sense of large amounts of collected data by plotting it against a random variable. Depending on the nature of the random variable, the probability distribution also changes. Some of the important probability distributions are the binomial distribution, the Poisson distribution, the Gaussian distribution, etc. These are classified as continuous or discrete distributions based on the random variable.

The function plotted in these probability distributions is called a Probability Distribution Function, or P.D.F. Each PDF is characterized by a mean and a variance. Mean is often used synonymously with average, though its meaning may vary slightly according to the nature of the random variable. Variance is the spread of the curve, or in other words, the deviation of the data from the mean value.

Mean of Discrete Random Variables

In the case of a discrete random variable, the mean is a weighted average. To understand this, let us discuss a couple of examples.

Let us take our previous coin-tossing example. Here the experiment is repeated n times, and if asked to predict the most probable or mean outcome, we would answer that there is a 50 percent chance of obtaining either a head or a tail. This is because the probabilities of the two outcomes are equal, or in other words, each outcome carries equal weight.

Now consider the following example:

Suppose a person is playing poker. The possible outcomes are to lose 2 coins, break even, gain 2 coins, or gain 5 coins. Given the probability distribution below, what will the mean outcome be?

Outcome, X:    -2      0      2      5
Probability:   0.30    0.40   0.20   0.10

Here we have to bring in the concept of a weighted average. Suppose we have n measured values, with each value having a different probability. The sum of those values multiplied by their respective probabilities is called the mean or weighted average. For example, suppose an experiment yields three energies E1, E2, and E3 of values 0, 5, and 3 units with probabilities \(\frac{1}{5}\), \(\frac{3}{5}\), and \(\frac{1}{5}\) respectively. The mean (average) energy = 0 × \(\frac{1}{5}\) + 5 × \(\frac{3}{5}\) + 3 × \(\frac{1}{5}\) = \(\frac{18}{5}\) units.
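As a quick check of the arithmetic above, here is a minimal Python sketch (the names values and probabilities are illustrative, not part of the notes) that computes the same weighted average:

```python
# Minimal sketch: weighted average (mean) of a discrete set of values.
values = [0, 5, 3]               # the three energies E1, E2, E3 (in units)
probabilities = [1/5, 3/5, 1/5]  # their respective probabilities (sum to 1)

mean = sum(v * p for v, p in zip(values, probabilities))
print(mean)  # 3.6, i.e. 18/5 units
```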

Now, returning to our earlier example of the poker game:

The mean outcome, or expectation value, of the poker game is

E(X) = (-2)(0.30) + (0)(0.40) + (2)(0.20) + (5)(0.10) = 0.3 coins

Hence let us write a formula for the mean of a discrete random variable.

Suppose \(p_i\) is the probability of obtaining the value \(x_i\). Then

E(X) = \(\sum_{i} x_i p_i\)
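The short Python sketch below applies this formula to the poker table above (the variable names are my own). It also computes the variance of the same distribution using the discrete analogue of the integral given in the next section, \(\mathrm{Var}(X) = \sum_{i} (x_i - E(X))^2 p_i\):

```python
# Minimal sketch: mean and variance of a discrete random variable,
# using the outcomes and probabilities from the poker example above.
x = [-2, 0, 2, 5]
p = [0.30, 0.40, 0.20, 0.10]

mean = sum(xi * pi for xi, pi in zip(x, p))                    # E(X) = sum of x_i * p_i
variance = sum((xi - mean) ** 2 * pi for xi, pi in zip(x, p))  # discrete analogue of Var(X)

print(mean)      # approximately 0.3 coins
print(variance)  # approximately 4.41
```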

Mean and Variance of Continuous Random Variable

When our data are continuous, the corresponding random variable and probability distribution are also continuous. The principles of mean and variance remain the same. However, we cannot use the same formulas: when the discrete variable becomes continuous, the summation becomes an integration. Hence our formulas for the mean and variance take the form

E(X) = \(\int_{-\infty}^{\infty} x f_X(x)\,dx\), where \(f_X(x)\) is the probability density function.

Var(X) = \(\int_{-\infty}^{\infty} (x - E(X))^2 f_X(x)\,dx\)
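The following is a minimal numerical sketch of these two integrals, assuming SciPy is available; the exponential density with rate λ = 2 is chosen purely as an illustration (its exact mean and variance are 1/λ = 0.5 and 1/λ² = 0.25):

```python
# Minimal sketch: mean and variance of a continuous random variable,
# computed by numerically integrating the two formulas above.
import numpy as np
from scipy.integrate import quad

lam = 2.0  # rate of an illustrative exponential density

def pdf(x):
    # Probability density function f_X(x) of an Exponential(lam) variable;
    # it is zero for x < 0, so the integrals below start at 0.
    return lam * np.exp(-lam * x)

# E(X) = integral of x * f_X(x) dx
mean, _ = quad(lambda x: x * pdf(x), 0, np.inf)

# Var(X) = integral of (x - E(X))^2 * f_X(x) dx
variance, _ = quad(lambda x: (x - mean) ** 2 * pdf(x), 0, np.inf)

print(mean)      # ~0.5,  exact value 1/lam
print(variance)  # ~0.25, exact value 1/lam**2
```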

Here the most important term is the probability density function. It contains all the information about our system; even the shape of the distribution can vary based on the probability density function.

It is difficult to study all the continuous distributions under one heading, as each of them has its own specific characteristics and properties and requires individual attention.
