Suppose that $X$ is a discrete random variable; then the probability mass function of $X$ is defined by the function $p_X(x) = P(X = x)$.
Bernoulli Random Variable
We say that a random variable $X$ is a Bernoulli random variable iff there is some $p \in [0, 1]$ such that $P(X = 1) = p$ and $P(X = 0) = 1 - p$; we denote this symbolically as $X \sim \operatorname{Ber}(p)$.
Expected Value of a Bernoulli Random Variable
Suppose $X \sim \operatorname{Ber}(p)$, then $E[X] = 0 \cdot (1 - p) + 1 \cdot p = p$.
Binomial Distribution
The random variable $X$ is said to have a binomial distribution on $n$ trials, with probability of success $p$ per trial, iff $X = \sum_{i=1}^{n} X_i$ where the $X_i \sim \operatorname{Ber}(p)$ are independent. We write $X \sim \operatorname{Bin}(n, p)$.
Expected Value of the Binomial Distribution
Suppose that $X \sim \operatorname{Bin}(n, p)$, then $E[X] = \sum_{i=1}^{n} E[X_i] = np$.
Probability Mass Function of the Binomial Distribution
Suppose $X \sim \operatorname{Bin}(n, p)$, then for any $k \in \{0, \ldots, n\}$ we have $P(X = k) = \binom{n}{k} p^k (1 - p)^{n - k}$.
Note that $P(X = k)$ represents the probability of getting exactly $k$ successes in $n$ independent Bernoulli trials, each with the same success probability $p$.
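As a sanity check, here's a small Python sketch (names like `binom_pmf` are my own) comparing this PMF against a Monte Carlo simulation of $n$ independent Bernoulli trials:

```python
import math
import random

def binom_pmf(n, k, p):
    """P(X = k) for X ~ Bin(n, p), via the closed-form PMF."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Monte Carlo check: realize X as a sum of n independent Bernoulli(p) trials.
random.seed(0)
n, p, trials = 10, 0.3, 200_000
counts = [0] * (n + 1)
for _ in range(trials):
    x = sum(1 for _ in range(n) if random.random() < p)
    counts[x] += 1

for k in range(n + 1):
    # each empirical frequency should be close to the PMF value
    assert abs(counts[k] / trials - binom_pmf(n, k, p)) < 0.01
```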
Coin Counting
A coin is tossed 12 times and 7 heads occur.
Determine the probability that the 7th head occurs on the 12th toss, given that there are exactly 7 heads in these 12 tosses.
Now determine the probability that the 6th head occurs on the 9th toss, given that there are exactly 7 heads in the first 12 tosses.
Finally, determine the probability that the 2nd head occurs on the 4th toss and the 6th head occurs on the 9th toss, given that there are exactly 7 heads in the first 12 tosses.
Let $A$ be the event that the 7th head is on the 12th toss, and $B$ the event that there are exactly 7 heads in these 12 tosses. In this case our goal is to determine $P(A \mid B)$; note that the event $B$ is equal to $\{X = 7\}$ where $X \sim \operatorname{Bin}(12, \frac{1}{2})$, and thus $P(B) = \binom{12}{7} \left(\frac{1}{2}\right)^{12}$.
Let's now try to compute $P(A \cap B)$, which is the probability of getting the 7th head on the 12th toss and exactly 7 heads in those 12 tosses. Note that getting the 7th head on the 12th toss means that in the first 11 tosses exactly 6 heads occurred in some order, and then, independently of that, the 12th toss is a head, so that $P(A \cap B) = \binom{11}{6} \left(\frac{1}{2}\right)^{11} \cdot \frac{1}{2}$.
Now that we've computed $P(A \cap B)$ along with $P(B)$, we divide them to get the answer: $P(A \mid B) = \frac{\binom{11}{6} (\frac{1}{2})^{12}}{\binom{12}{7} (\frac{1}{2})^{12}} = \frac{462}{792} = \frac{7}{12}$.
Let's take a similar approach, where this time $A$ and $B$ are the events that the 6th head is on the 9th toss and that there are exactly 7 heads in these 12 tosses, respectively, so that our goal is to compute $P(A \mid B)$. Recall that we can compute that via $\frac{P(A \cap B)}{P(B)}$, and we know what $P(B)$ is from our previous answer, as that was $\binom{12}{7} (\frac{1}{2})^{12}$; therefore we just need a way to compute $P(A \cap B)$.
$P(A \cap B)$ is the probability that the 6th head is on the 9th toss and there are exactly 7 heads in 12 tosses. We can break this up into parts: we have to get exactly 5 heads in the first 8 tosses, then land another head on the 9th toss, and then get exactly 1 head in the last 3 tosses; therefore this probability is given by $P(A \cap B) = \binom{8}{5} \left(\frac{1}{2}\right)^{8} \cdot \frac{1}{2} \cdot \binom{3}{1} \left(\frac{1}{2}\right)^{3}$.
Similarly to the previous two questions, we use the definition of conditional probability to write the answer as a fraction; we only need to focus on the numerator, which is the probability that the 2nd head occurs on the 4th toss, the 6th head occurs on the 9th toss, and there are exactly 7 heads in the first 12 tosses. It can be handled similarly to the above by splitting it into separate binomial computations: exactly 1 head in the first 3 tosses, a head on the 4th, exactly 3 heads in tosses 5 through 8, a head on the 9th, and exactly 1 head in the last 3 tosses.
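Since there are only $2^{12}$ equally likely toss sequences, all three conditional probabilities can be checked by brute-force enumeration; a Python sketch (helper names are my own):

```python
from fractions import Fraction
from itertools import product
from math import comb

# Enumerate all 2^12 equally likely toss sequences (1 = head).
seqs = list(product((0, 1), repeat=12))

def heads_positions(seq):
    return [i + 1 for i, x in enumerate(seq) if x == 1]

B = [s for s in seqs if sum(s) == 7]          # exactly 7 heads in 12 tosses

def cond_prob(event):
    """P(event | B) by direct counting; every sequence has probability 2^-12."""
    return Fraction(sum(1 for s in B if event(heads_positions(s))), len(B))

pa = cond_prob(lambda h: h[6] == 12)               # 7th head on 12th toss
pb = cond_prob(lambda h: h[5] == 9)                # 6th head on 9th toss
pc = cond_prob(lambda h: h[1] == 4 and h[5] == 9)  # 2nd on 4th and 6th on 9th

# The counting formulas from the text agree with brute force.
assert pa == Fraction(comb(11, 6), comb(12, 7))
assert pb == Fraction(comb(8, 5) * comb(3, 1), comb(12, 7))
assert pc == Fraction(comb(3, 1) * comb(4, 3) * comb(3, 1), comb(12, 7))
```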
Consider flipping a coin that has probability $p$ of coming up heads and $1 - p$ of coming up tails; then the random variable $X$ which evaluates to 1 if the coin is heads, and 0 if the coin is tails, has the Bernoulli distribution $\operatorname{Ber}(p)$.
Poisson Distribution
The random variable $X$ is said to have a Poisson distribution with average number of successes $\lambda > 0$ iff $P(X = k) = e^{-\lambda} \frac{\lambda^k}{k!}$ where $k \in \{0, 1, 2, \ldots\}$. We write $X \sim \operatorname{Pois}(\lambda)$.
Poisson as a Limit of Binomial
For any $n \in \mathbb{N}$ suppose we have some $p_n \in [0, 1]$, then define $X_n \sim \operatorname{Bin}(n, p_n)$. Moreover suppose we have some $k \in \{0, 1, 2, \ldots\}$. If $\lim_{n \to \infty} n p_n$ exists, denote it by $\lambda$, and we have: $\lim_{n \to \infty} P(X_n = k) = e^{-\lambda} \frac{\lambda^k}{k!}$.
Note that $P(X_n = k) = \binom{n}{k} p_n^k (1 - p_n)^{n - k} = \frac{n!}{(n - k)! \, n^k} \cdot \frac{(n p_n)^k}{k!} \cdot \left(1 - \frac{n p_n}{n}\right)^{n - k}$. As $n \to \infty$ we see that $\frac{n!}{(n - k)! \, n^k} \to 1$ and $\left(1 - \frac{n p_n}{n}\right)^{n - k} \to e^{-\lambda}$, where $n p_n \to \lambda$, therefore as $n \to \infty$ we have that $P(X_n = k) \to \frac{\lambda^k}{k!} e^{-\lambda}$.
This also allows us to compute a binomial with a large $n$ by approximating it with a Poisson distribution of rate $\lambda = np$.
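A quick Python comparison of $\operatorname{Bin}(1000, 0.003)$ against $\operatorname{Pois}(3)$ (the function names and the example parameters are my own):

```python
import math

def binom_pmf(n, k, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

# Bin(1000, 0.003) has n*p = 3, so Pois(3) should be a close approximation.
n, p = 1000, 0.003
lam = n * p
for k in range(10):
    # pointwise difference is tiny for every k
    assert abs(binom_pmf(n, k, p) - poisson_pmf(lam, k)) < 1e-3
```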
Number of Trials until Success as a Random Variable
Suppose we have $X_1, X_2, X_3, \ldots$ such that the $X_i \sim \operatorname{Ber}(p)$ are IID; then we may recursively define the function $T_k$ as $T_1 = \min\{n \in \mathbb{N} : X_n = 1\}$ and, for any $k \geq 2$, $T_k = \min\{n > T_{k-1} : X_n = 1\}$, where $T_k$ is said to count the number of trials required up to and including the $k$-th success.
To best understand the above definition, consider the sequence $(X_1, X_2, X_3, \ldots)$, which may be of the form $(0, 0, 1, 0, 1, 1, \ldots)$, and note that $T_1 = 3$; to determine $T_2$ we have $T_2 = \min\{n > 3 : X_n = 1\} = 5$, and so on. In this way the formula skips past the previous instances to find the $k$-th instance.
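The recursive definition can be sketched directly in Python (the function name is my own), scanning the sequence past each previous success to find the next one:

```python
def success_times(seq, K):
    """Given a 0/1 sequence, return [T_1, ..., T_K], where T_k is the
    (1-based) index of the k-th success, following the recursive definition:
    T_1 = min{n : X_n = 1} and T_k = min{n > T_{k-1} : X_n = 1}."""
    times, prev = [], 0
    for _ in range(K):
        n = prev + 1
        while seq[n - 1] != 1:      # scan past the previous success
            n += 1
        times.append(n)
        prev = n
    return times

# The example sequence from above: successes at positions 3, 5 and 6.
assert success_times((0, 0, 1, 0, 1, 1), 3) == [3, 5, 6]
```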
Equivalence
For any $k, n \in \mathbb{N}$ we have: $\{T_k \leq n\} = \left\{\sum_{i=1}^{n} X_i \geq k\right\}$, that is, the $k$-th success happens within $n$ trials iff the first $n$ trials contain at least $k$ successes.
Negative Binomial
The random variable $X$ is said to have a negative binomial distribution on $k$ successes, each with probability $p$, iff $X = T_k$, and we denote this as $X \sim \operatorname{NBin}(k, p)$.
Probability Mass Function of the Negative Binomial Distribution
Suppose that $X \sim \operatorname{NBin}(k, p)$ for $k \in \mathbb{N}$; then for any $n \geq k$, $P(X = n) = \binom{n - 1}{k - 1} p^k (1 - p)^{n - k}$.
Geometric Distribution
We say that the random variable $X$ has a geometric distribution, denoted by $X \sim \operatorname{Geo}(p)$ where $p \in (0, 1]$, when $X \sim \operatorname{NBin}(1, p)$; that is, $P(X = n) = p (1 - p)^{n - 1}$.
Large Number of Tossed Coins
A coin is tossed 1000 statistically independent times and exactly 3 heads occur.
If we toss it 1000 more times, and let $X$ denote the number of heads we might get this time, estimate the probability that $X$ is anywhere between 1 and 5. [hint: estimate $p \approx \frac{3}{1000}$]
Suppose now we toss the coin $n$ times, and let $N$ denote the number of heads in those $n$ tosses and $T_1$ the random number of trials until we obtain the first head. How large does $n$ have to be so that $P(T_1 > n) \leq \frac{1}{2}$?
Now determine the probability that the 3rd head occurs between 750 and 1250 tosses.
This situation is perfectly modelled by the binomial distribution ($X \sim \operatorname{Bin}(1000, \frac{3}{1000})$); it's just that the binomial distribution is hard to compute for large values of $n$. Therefore we can employ an approximation using the Poisson distribution, as discussed previously: we want to know $P(1 \leq X \leq 5) = \sum_{k=1}^{5} P(X = k)$, and we can approximate each such value using $P(X = k) \approx e^{-\lambda} \frac{\lambda^k}{k!}$ with $\lambda = np = 3$.
...
Therefore, by adding all these numbers together, we obtain an estimate of the probability of $X$ being anywhere between 1 and 5.
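Carrying out the estimate in Python (assuming, per the hint, $p = \frac{3}{1000}$ so $\lambda = 3$), and comparing it against the exact binomial sum:

```python
import math

# Estimated head probability from the first 1000 tosses: p = 3/1000, lambda = n*p = 3.
n, p = 1000, 3 / 1000
lam = n * p

def poisson_pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

estimate = sum(poisson_pmf(k) for k in range(1, 6))   # Poisson estimate of P(1 <= X <= 5)
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(1, 6))

assert abs(estimate - exact) < 2e-3                   # the approximation is tight
```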
We want to determine a value for $n$ such that $P(T_1 > n) \leq \frac{1}{2}$; to understand this concretely, if $n = 5$ then $P(T_1 > 5)$ asks "what is the probability of getting our first head after 5 tosses of the coin?" Since $T_1 \sim \operatorname{Geo}(p)$ we have $P(T_1 > n) = (1 - p)^n$, so with $p = \frac{3}{1000}$ we need $(0.997)^n \leq \frac{1}{2}$, that is $n \geq \frac{\ln 2}{-\ln(0.997)} \approx 230.7$, hence $n = 231$.
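Assuming $T_1 \sim \operatorname{Geo}(p)$ with $p = \frac{3}{1000}$, the tail is $P(T_1 > n) = (1 - p)^n$, and the smallest $n$ with $(1 - p)^n \leq \frac{1}{2}$ can be computed directly:

```python
import math

p = 3 / 1000                      # estimated head probability
# For T_1 ~ Geo(p), P(T_1 > n) = (1 - p)^n, so we need the smallest n
# with (0.997)^n <= 1/2.
n = math.ceil(math.log(2) / -math.log(1 - p))

# n works, and n - 1 does not, so n really is the smallest such value.
assert (1 - p) ** n <= 0.5 < (1 - p) ** (n - 1)
```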
Poisson Process
Suppose that $(N(t))_{t \geq 0}$ is a Poisson process with rate $\lambda$, and let $T_n$ denote the time of the $n$-th arrival. Determine the following.
and .
and .
.
Recall that $E[T_n] = \frac{n}{\lambda}$, as this is the formula for the expected value of the gamma distribution: if a random variable $X \sim \operatorname{Gamma}(n, \lambda)$ then we have $E[X] = \frac{n}{\lambda}$, and $T_n \sim \operatorname{Gamma}(n, \lambda)$. Since the sequence of interarrival times $T_i - T_{i-1}$ (with $T_0 = 0$) is IID $\operatorname{Exp}(\lambda)$ (this is part of the definition of a Poisson process: the waiting times between consecutive arrivals are independent exponentials), we can say $T_n = \sum_{i=1}^{n} (T_i - T_{i-1})$, and therefore $E[T_n] = \sum_{i=1}^{n} \frac{1}{\lambda} = \frac{n}{\lambda}$.
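A Monte Carlo sketch of this (with assumed values $\lambda = 2$ and $n = 5$): summing $n$ exponential interarrival times and checking that the sample mean of $T_n$ is near $n/\lambda$:

```python
import random

# T_n as the sum of n IID Exp(lam) interarrival times; E[T_n] should be n/lam.
random.seed(2)
lam, n, runs = 2.0, 5, 100_000
total = 0.0
for _ in range(runs):
    total += sum(random.expovariate(lam) for _ in range(n))
mean_Tn = total / runs

assert abs(mean_Tn - n / lam) < 0.02    # n/lam = 2.5
```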
Fundamental Properties of Covariance
Let $X, Y, Z$ be random variables and $a, b \in \mathbb{R}$; then we have $\operatorname{Cov}(X, Y) = \operatorname{Cov}(Y, X)$, $\operatorname{Cov}(aX + bY, Z) = a \operatorname{Cov}(X, Z) + b \operatorname{Cov}(Y, Z)$, and $\operatorname{Cov}(X, X) = \operatorname{Var}(X)$.
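These properties are algebraic identities, so they also hold exactly (up to floating-point rounding) for sample covariances; a small Python check (the `cov` helper and the sample data are my own):

```python
import random

def cov(xs, ys):
    """Population-style sample covariance of two equal-length samples."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

random.seed(3)
N = 1000
X = [random.gauss(0, 1) for _ in range(N)]
Y = [random.gauss(0, 1) for _ in range(N)]
Z = [random.gauss(0, 1) for _ in range(N)]
a, b = 2.0, -3.0
aXbY = [a * x + b * y for x, y in zip(X, Y)]

assert abs(cov(X, Y) - cov(Y, X)) < 1e-12                           # symmetry
assert abs(cov(aXbY, Z) - (a * cov(X, Z) + b * cov(Y, Z))) < 1e-9   # bilinearity
var_X = sum((x - sum(X) / N) ** 2 for x in X) / N
assert abs(cov(X, X) - var_X) < 1e-12                               # Cov(X, X) = Var(X)
```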
Covariance Equations
For any $\mathbb{R}$-valued $X$ and $Y$ and any scalars $a$ and $b$, consider the simple difference $D = Y - (aX + b)$. But, if $E[D] = 0$ and $\operatorname{Cov}(X, D) = 0$, this will determine $a$ and $b$.
Assuming that $E[D] = 0$ and $\operatorname{Cov}(X, D) = 0$, where $D = Y - (aX + b)$, determine two simultaneous equations, and thus obtain the unique $a$ and $b$ such that both conditions hold.
Now, supposing that the values of $E[X]$, $E[Y]$, $\operatorname{Var}(X)$, $\operatorname{Var}(Y)$ and $\operatorname{Cov}(X, Y)$ are given, determine $a$ and $b$.
For the assumptions in b), determine the correlation coefficient $\rho(X, Y)$.
Before doing anything else we consider the following, with $D = Y - (aX + b)$: $\operatorname{Cov}(X, D) = \operatorname{Cov}(X, Y) - a \operatorname{Cov}(X, X) = \operatorname{Cov}(X, Y) - a \operatorname{Var}(X)$. We did this because we knew that the properties of covariance would result in some expression where some of the sub-expressions are the same as the ones we know information about. One thing that we require is that $\operatorname{Cov}(X, D) = 0$; we see that $\operatorname{Cov}(X, Y) - a \operatorname{Var}(X) = 0$, so we conclude that $a = \frac{\operatorname{Cov}(X, Y)}{\operatorname{Var}(X)}$. We also need that $E[D] = 0$, that is to say $E[Y] - a E[X] - b = 0$, in which we have $a$ by the above, therefore we can now deduce that $b = E[Y] - a E[X]$; substituting the given values then yields the specific $a$ and $b$.
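A Python sketch on synthetic data (the linear model for $Y$ is an assumption made just for illustration): computing $a$ and $b$ from these formulas and checking that the residual $D$ has mean zero and zero covariance with $X$:

```python
import random

random.seed(4)
N = 50_000
X = [random.gauss(1.0, 2.0) for _ in range(N)]
# Hypothetical linear relationship plus noise, chosen only for illustration.
Y = [0.5 * x + 1.0 + random.gauss(0, 1) for x in X]

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

# a = Cov(X, Y)/Var(X) and b = E[Y] - a E[X], from the two simultaneous equations.
a = cov(X, Y) / cov(X, X)
b = mean(Y) - a * mean(X)

D = [y - (a * x + b) for x, y in zip(X, Y)]
assert abs(mean(D)) < 1e-6      # E[D] = 0 by construction
assert abs(cov(X, D)) < 1e-6    # Cov(X, D) = 0 by construction
```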
Sum of Bernoulli
Suppose that $X_i \sim \operatorname{Ber}(p)$ IID for $i \in \{1, \ldots, n\}$; determine the following.
Recall that if $X = \sum_{i=1}^{n} X_i$ then we know that $X \sim \operatorname{Bin}(n, p)$. Therefore we can use the binomial formula when trying to compute $P(X = j)$ for some integer $j$. In our specific question we note that the given events are independent (they are determined by disjoint collections of the independent $X_i$), and thus their joint probability is the product of their individual probabilities.
To determine this quantity we first use the definition of conditional probability and then employ the same strategy as above.
Geometric Expectation
Suppose that $X \sim \operatorname{Geo}(p)$.
For the variable $X$, obtain $E[X]$ and $\operatorname{Var}(X)$.
Recall that for $X \sim \operatorname{Geo}(p)$ we know that $P(X = n) = p (1 - p)^{n - 1}$, and an easy way to remember that is that Geo is the distribution of how many trials are required up to and including the first success. Now let's try to compute the expected value: $E[X] = \sum_{n=1}^{\infty} n p (1 - p)^{n - 1} = \frac{1}{p}$.
Taking a look at the variance, we have $\operatorname{Var}(X) = E[X^2] - E[X]^2 = \frac{2 - p}{p^2} - \frac{1}{p^2} = \frac{1 - p}{p^2}$.
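A Monte Carlo check of both formulas in Python, with an assumed $p = 0.2$:

```python
import random

# Check E[X] = 1/p and Var(X) = (1 - p)/p^2 for X ~ Geo(p), with p = 0.2.
random.seed(5)
p, runs = 0.2, 200_000
samples = []
for _ in range(runs):
    n = 1
    while random.random() >= p:     # keep tossing until the first success
        n += 1
    samples.append(n)

mean = sum(samples) / runs
var = sum((x - mean) ** 2 for x in samples) / runs
assert abs(mean - 1 / p) < 0.05             # E[X] = 1/p = 5
assert abs(var - (1 - p) / p**2) < 0.5      # Var(X) = (1-p)/p^2 = 20
```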
Poisson Quotient
Suppose that $(N(t))_{t \geq 0}$ is a Poisson process where we know the rate $\lambda$, and let $s < t$ be fixed times.
Observe that, conditional on $N(t) = n$, the random variable $N(s)$ satisfies $N(s) \mid N(t) = n \sim \operatorname{Bin}\left(n, \frac{s}{t}\right)$, meaning that each of the $n$ arrivals in $[0, t]$ lands in $[0, s]$ independently with probability $\frac{s}{t}$, and then we have $E[N(s) \mid N(t) = n] = n \frac{s}{t}$.
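The standard fact that, conditional on $N(t) = n$, $N(s) \sim \operatorname{Bin}(n, \frac{s}{t})$ can be checked by simulation; a Python sketch with assumed values $\lambda = 2$, $s = 2$, $t = 5$, conditioning on $N(t) = 10$:

```python
import random

# Simulate the Poisson process via Exp(lam) interarrival times.
random.seed(6)
lam, s, t, n_target, runs = 2.0, 2.0, 5.0, 10, 40_000

cond = []                           # samples of N(s) given N(t) = n_target
for _ in range(runs):
    arrivals, clock = [], 0.0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            break
        arrivals.append(clock)
    if len(arrivals) == n_target:   # condition on N(t) = 10
        cond.append(sum(1 for arrival in arrivals if arrival <= s))

mean_cond = sum(cond) / len(cond)
assert abs(mean_cond - n_target * s / t) < 0.2   # Bin(10, 2/5) has mean 4
```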