πŸ—οΈ Ξ˜ΟΟ΅Ξ·Ξ Ξ±Ο„Ο€πŸš§ (under construction)

Probability Mass Function
Suppose that X is a discrete random variable; then the probability mass function of X is the function p_X : R β†’ [0, 1] defined by p_X(x) = P(X = x).
Bernoulli Random Variable
We say that a random variable X is a Bernoulli random variable iff there is some p ∈ [0, 1] such that P(X = 1) = p and P(X = 0) = 1 βˆ’ p. We denote this symbolically as X ~ bern(p).
Expected Value of a Bernoulli Random Variable
Suppose X ~ bern(p); then E(X) = p, since E(X) = 1Β·p + 0Β·(1 βˆ’ p) = p.
Binomial Distribution
The random variable X is said to have a binomial distribution on n trials, with probability of success p per trial, iff X =_d Z_1 + Z_2 + β‹― + Z_n (equality in distribution), where Z_1, …, Z_n are IID with Z_i ~ bern(p).
Expected Value of the Binomial Distribution
Suppose that X ~ binom(n, p); then E(X) = np. This follows by linearity of expectation applied to X =_d Z_1 + β‹― + Z_n with E(Z_i) = p.
Probability Mass Function of the Binomial Distribution
Suppose X ~ binom(n, p); then p_X(k) = C(n, k) p^k (1 βˆ’ p)^(n βˆ’ k), where C(n, k) = n!/(k!(n βˆ’ k)!) is the binomial coefficient.

Note that p_X(k) represents the probability of getting exactly k successes in n independent Bernoulli trials, each with the same success probability p.
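A quick numerical sketch of the pmf above, using only Python's standard library (math.comb computes C(n, k)); the probabilities over the full support {0, …, n} must sum to 1 (n = 12 and p = 0.3 are arbitrary example values):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ binom(n, p): C(n, k) * p**k * (1 - p)**(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Sanity check: the pmf sums to 1 over its support {0, 1, ..., n}.
n, p = 12, 0.3
total = sum(binom_pmf(k, n, p) for k in range(n + 1))
print(total)  # 1.0 up to floating-point error
```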

Coin Counting
A coin is tossed 12 times and 7 heads occur.
  β€’ Determine the probability that the 7th head occurs on the 12th toss, given that there are exactly 7 heads in these 12 tosses.
  β€’ Now determine the probability that the 6th head occurs on the 9th toss, given that there are exactly 7 heads in the first 12 tosses.
  β€’ Finally, determine the probability that the 2nd head occurs on the 4th toss and the 6th head occurs on the 9th toss, given that there are exactly 7 heads in the first 12 tosses.

Consider flipping a coin that has probability ΞΈ of coming up heads and 1 βˆ’ ΞΈ of coming up tails; then the random variable which evaluates to 1 if the coin is heads and 0 if the coin is tails has the Bernoulli distribution bern(ΞΈ).

Poisson Distribution
The random variable N is said to have a Poisson distribution with average number of successes Ξ» ∈ R_{>0} iff P(N = k) = e^(βˆ’Ξ») Ξ»^k / k!, where k ∈ N_0. We write N ~ pois(Ξ»).
Poisson as a Limit of Binomial
For any n ∈ N suppose we have some p_n ∈ [0, 1], and define Ξ»_n = nΒ·p_n. Moreover, suppose we have X_n ~ binom(n, p_n). If lim_{nβ†’βˆž} Ξ»_n exists, denote it by Ξ»; then we have P(N = k) = lim_{nβ†’βˆž} P(X_n = k), where N ~ pois(Ξ»).

This also allows us to approximate a binomial with large n and small p by a Poisson with Ξ» = np.
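As an illustration (a sketch with assumed example values n = 1000, p = 0.003, so Ξ» = np = 3), the two pmfs agree closely:

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def pois_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

n, p = 1000, 0.003   # large n, small p; lambda = n * p = 3
lam = n * p
# Largest pointwise gap between the two pmfs over k = 0..20.
max_gap = max(abs(binom_pmf(k, n, p) - pois_pmf(k, lam)) for k in range(21))
print(max_gap)  # well under 1e-3
```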

Number of Trials until Success as a Random Variable
Suppose we have (Z_n), n ∈ N_1, where the Z_i are IID with Z_i ~ bern(p); then we may recursively define T_1 = min({n ∈ N_1 : Z_n = 1}) and, for any k β‰₯ 2, T_k = min({n ∈ N_1 : Z_n = 1} β§΅ {T_1, …, T_{kβˆ’1}}), where T_k counts the number of trials required up to and including the k-th success.

To best understand the above definition, consider a sequence (Z_n), n ∈ N_1, which may be of the form (0, 0, 1, 0, 1, 0, 0, 1, …), and note that T_1 = min({3, 5, 8, …}) = 3; to determine T_2 we have min({3, 5, 8, …} β§΅ {3}) = 5, and so on. In this way the formula removes the previous k βˆ’ 1 success positions to find the k-th one.

Tk=n Equivalence
For any n ∈ N_1 we have: T_k = n ⟺ βˆ‘_{i=1}^{nβˆ’1} Z_i = k βˆ’ 1 ∧ Z_n = 1
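This equivalence just says that the k-th success lands on trial n exactly when trial n is a success and the first n βˆ’ 1 trials contain k βˆ’ 1 successes. A mechanical check on a concrete (arbitrarily chosen) 0/1 sequence:

```python
# One concrete realization of (Z_1, Z_2, ...); 1 marks a success.
Z = [0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1]

# T_k = position (1-indexed) of the k-th success.
T = [i + 1 for i, z in enumerate(Z) if z == 1]

# Verify: T_k = n  <=>  (Z_1 + ... + Z_{n-1} = k - 1) and Z_n = 1.
for k in range(1, len(T) + 1):
    for n in range(1, len(Z) + 1):
        lhs = (T[k - 1] == n)
        rhs = sum(Z[:n - 1]) == k - 1 and Z[n - 1] == 1
        assert lhs == rhs
print(T)  # [3, 5, 8, 9, 11]
```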
Negative Binomial
The random variable Y is said to have a negative binomial distribution on k successes, each with probability p, iff Y =_d T_k (equality in distribution), and we denote this as Y ~ negbin(k, p).
Probability Mass Function of the Negative Binomial Distribution
Suppose that Y ~ negbin(k, p) for k ∈ N_1, p ∈ (0, 1]; then for any n ∈ {k, k+1, k+2, …}, P(Y = n) = C(n βˆ’ 1, k βˆ’ 1) p^k (1 βˆ’ p)^(n βˆ’ k).
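A numerical sketch of the pmf above (with assumed example values k = 3, p = 0.4, truncating the infinite support at N = 200): the probabilities sum to essentially 1, and the mean comes out near k/p, consistent with Y counting trials to the k-th success:

```python
from math import comb

def negbin_pmf(n, k, p):
    """P(Y = n) = C(n - 1, k - 1) * p**k * (1 - p)**(n - k), for n >= k."""
    return comb(n - 1, k - 1) * p**k * (1 - p)**(n - k)

k, p = 3, 0.4
N = 200  # truncation point; the tail beyond it is negligible here
total = sum(negbin_pmf(n, k, p) for n in range(k, N + 1))
mean = sum(n * negbin_pmf(n, k, p) for n in range(k, N + 1))
print(total, mean)  # ~1.0 and ~k/p = 7.5
```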
Geometric Distribution
We say that the random variable W has a geometric distribution, denoted by W ~ geo(p) where p ∈ (0, 1], when W =_d T_1.
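Since W =_d T_1, the geometric distribution is precisely the k = 1 case of the negative binomial, giving P(W = n) = p(1 βˆ’ p)^(n βˆ’ 1) for n ∈ N_1, with E(W) = 1/p. A minimal numerical check (p = 0.3 is an assumed example value; the infinite sum for the mean is truncated where the tail is negligible):

```python
p = 0.3

def geo_pmf(n, p):
    """P(W = n) = p * (1 - p)**(n - 1): first success on trial n."""
    return p * (1 - p)**(n - 1)

# Truncated mean; the tail beyond n = 500 is negligible for p = 0.3.
mean = sum(n * geo_pmf(n, p) for n in range(1, 501))
print(mean)  # ~1/p = 3.333...
```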
Large Number of Tossed Coins
A coin is tossed 1000 statistically independent times and exactly 3 heads occur.
  β€’ If we toss it 1000 times more, and let N denote the number of heads we might get this time, estimate the probability that N is anywhere between 1 and 5. [hint: e^3 β‰ˆ 20.1]
  β€’ Suppose now we toss the coin 1000x times, let N_x denote the number of heads in these 1000x tosses, and let T_1 be the random number of trials until we obtain the first head. How large does x have to be so that P(T_1 > x) = 1/2?
  • Now determine the probability that the 2nd head occurs between 750 and 1250 tosses.
Poisson Process
Suppose that (T_n, n ∈ N) is a Poisson process with T_n ~ gamma(n, Ξ»^(βˆ’1)), and let U = T_3/T_8 and V = T_3/(T_8 βˆ’ T_3). Determine the following.
  • EU and Οƒ(U).
  • EV and Οƒ(V).
  • P(U>1/2).
Fundamental Properties of Covariance
Let X, Y, Z be random variables and c ∈ R; then we have:
  • Var(X)=Cov(X,X)
  • Cov(cX,Y)=cΒ·Cov(X,Y)=Cov(X,cY)
  • Cov(X+Y,Z)=Cov(X,Z)+Cov(Y,Z)
  • Cov(X,Y+Z)=Cov(X,Y)+Cov(X,Z)
  • Cov(X,Y)=Cov(Y,X)
  • Cov(X,c)=0
Covariance Equations
For any ℝ-valued X and Y and any scalars s and t, consider the simple difference W = Y βˆ’ (s + tX). If EW = 0 = cov(X, W), this determines s and t.
  • Assuming that EX=EY=varX=varX=1 and cov(X,Y)=1/2 determine two simultaneous equations thus to obtain the unique Ξ± and Ξ² such that Y=Ξ±+Ξ²X+WΒ andΒ E(W)=0=cov(X,W).
  • Now supposing that X~gamma(p=3,ΞΈ=2) and Y=Xβˆ’1, determine Ξ± and Ξ²
  • For the assumptions in b), determine the correlation coefficient ρ(Ξ±+Ξ²X,Y).
Sum of Bernoulli
Suppose that the Z_i are IID bern(p = 1/4) for i ∈ N_1; determine the following.
  β€’ P(Z_1 + Z_2 = 0 ∩ Z_3 + Z_4 = 1 ∩ Z_5 + Z_6 = 2)
  β€’ P(Z_1 = 0 ∩ Z_2 + Z_4 = 1 | βˆ‘_{i=1}^{6} Z_i = 3)
Geometric Expectation
Suppose that X ∣ N ~ bin(N, 3/5) and that N ~ geo(2/3).
  β€’ For the variable N, obtain E(N) and E(NΒ·(N βˆ’ 1)).
Poisson Quotient
Suppose that (T_n), n ∈ N_1, is a Poisson process where we know that T_n ~ gamma(n, 1/5), and let U = T_2/T_5 and V = T_2/(T_5 βˆ’ T_2).