Chapters 1,2,3 Flashcards

1
Q
  • sample space
  • event
  • probability associated
A
  • set of possible outcomes
  • subset of sample space
  • the probability P[A] ∈ [0,1] is a numerical measure of how likely the event A is. A function X: S → R assigning a real number to each element of S is a random variable X
2
Q

Definition: the distribution function

A

The distribution function of the random variable X is the function F_X : R → [0,1] given by F_X(x) = P[X ≤ x]

This is not the PDF; F_X is also known as the cumulative distribution function.

Doesn’t have to be continuous! E.g. for discrete X it is a step function.

3
Q

Discrete random variable vs continuous random variable

A

Discrete: takes values in a finite or countable set, e.g. integer valued with R_X ⊆ Z.
The probability function of X is p(x) = P[X = x], with p(x) = 0 if x is not in R_X.
F_X jumps upwards at each x in R_X, and for integer-valued X the size of the jump is p(x) = F(x) − F(x − 1).

Continuous: takes values in R, not countable. Has a PDF (probability density function).
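The jump property above can be sketched with a small assumed example (a fair six-sided die, not from the notes): the step function F jumps by p(x) at each point of R_X.

```python
from fractions import Fraction

# Assumed discrete example: fair six-sided die, p(x) = 1/6 on R_X = {1,...,6}.
R_X = range(1, 7)
p = {x: Fraction(1, 6) for x in R_X}

def F(x):
    """Distribution function F(x) = P[X <= x]: a step function."""
    return sum(p[k] for k in R_X if k <= x)

# The jump size at each integer x recovers the probability function:
for x in R_X:
    assert p[x] == F(x) - F(x - 1)

print(F(3))  # P[X <= 3] = 1/2
```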

4
Q

PDF?

Probability density function

A

A function f: R → R is a probability density function if:
1) f(x) ≥ 0 for all x ∈ R

2) integral over (-∞, ∞) of f(x) dx = 1
Ie the PDF INTEGRATES TO 1

• if f(x) is a PDF then there is a random variable X whose distribution function satisfies F_X(x) = integral over (-∞, x) of f(u) du

Remember: for continuous random variables.
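A quick numerical sketch of the two defining properties, using an assumed example density (the Exp(1) density e^(−x) on (0, ∞), not a function from the notes):

```python
import math

def f(x):
    """Candidate density: exponential with rate 1 (an assumed example)."""
    return math.exp(-x) if x >= 0 else 0.0

def trapezoid(func, a, b, n=200_000):
    """Numerically integrate func over [a, b] with the trapezoid rule."""
    h = (b - a) / n
    total = 0.5 * (func(a) + func(b))
    for i in range(1, n):
        total += func(a + i * h)
    return total * h

# Property 1: non-negative everywhere (spot check)
assert all(f(x) >= 0 for x in [-5.0, -0.1, 0.0, 0.5, 10.0])

# Property 2: integrates to 1 (tail beyond 50 is negligible: e^-50 ~ 2e-22)
area = trapezoid(f, -10.0, 50.0)
print(area)  # ~ 1.0
```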

5
Q

Relation between PDF and distribution function (first)

A
  • if f(x) is a PDF then there is a random variable X whose distribution function satisfies F_X(x) = integral over (-∞, x) of f(u) du
  • the integral of the PDF over (-∞, x] gives the cumulative distribution function F_X(x) = P[X ≤ x]

Remember: PDFs are for continuous X!!!

d/dx F(x) = f(x)
PDF integrates to the distribution function
Distribution function differentiates to the PDF
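The relation d/dx F(x) = f(x) can be checked numerically with an assumed example pair (the Exp(1) distribution F(x) = 1 − e^(−x) and density f(x) = e^(−x), not from the notes):

```python
import math

# Assumed example: Exp(1), with F(x) = 1 - e^{-x} and f(x) = e^{-x} for x > 0.
def F(x):
    return 1.0 - math.exp(-x)

def f(x):
    return math.exp(-x)

# d/dx F(x) = f(x): check with a symmetric finite difference.
h = 1e-6
for x in [0.5, 1.0, 2.0, 4.0]:
    numeric = (F(x + h) - F(x - h)) / (2 * h)
    assert abs(numeric - f(x)) < 1e-6
print("F' matches f")
```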

6
Q

Relation between PDF and distribution function (second)

A
If we can write the distribution function of a random variable X in the form F_X(x) = integral over (-∞, x) of f_X(u) du
where f_X is a PDF,
then X is a continuous random variable.

• only continuous X have a PDF: the distribution function of a discrete X has jumps, so it is not differentiable

7
Q

Finding P[ x≤ X ≤y]
P[X=x]
(Continuous case)

A

P[x ≤ X ≤ y] = F(y) − F(x) (distribution function)
= integral over [x, y] of the PDF f(t) dt

• in the continuous case P[X = x] = integral over [x, x] of f(u) du = 0 for all x: the chance of any particular value is 0.
For the discrete case this isn’t true.
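Both routes to P[x ≤ X ≤ y] agree, sketched here with an assumed Exp(1) example (F(t) = 1 − e^(−t), f(t) = e^(−t); not from the notes):

```python
import math

# Assumed example: Exp(1), F(t) = 1 - e^{-t}, f(t) = e^{-t} for t > 0.
def F(t):
    return 1.0 - math.exp(-t)

def f(t):
    return math.exp(-t)

x, y = 1.0, 2.0

# Via the distribution function:
p_cdf = F(y) - F(x)

# Via integrating the PDF over [x, y] (trapezoid rule):
n = 100_000
h = (y - x) / n
p_int = (0.5 * (f(x) + f(y)) + sum(f(x + i * h) for i in range(1, n))) * h

assert abs(p_cdf - p_int) < 1e-8
print(p_cdf)  # e^{-1} - e^{-2} ~ 0.2325
```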

8
Q

For a distribution function:

A

For a general function F: R → [0,1] we say that F is a distribution function if:
1) 0 ≤ F(x) ≤ 1, with limit as x tends to −∞ of F(x) = 0 and limit as x tends to ∞ of F(x) = 1.

2) F is non-decreasing: if x ≤ y then F(x) ≤ F(y).

3) F is right continuous with left limits:
at every point x₀ in R both one-sided limits exist and F(x₀) = limit as x → x₀⁺ of F(x).
Continuous on the right.

Conversely, if we have a function F satisfying these properties, there exists a random variable X with distribution function F.

9
Q

Possible values of PDFs vs distribution functions

A

The PDF f must be non-negative and F cannot decrease, but a PDF is not a distribution function, so it doesn’t need to satisfy properties 1, 2 or 3.

f can be greater than 1 for some values of x.
P[X = x] is not equal to f(x).

If the density has a large value over a small region, then the probability of landing in that region is comparable to value × size of the region.
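A density can exceed 1 while all probabilities stay at most 1; a minimal sketch using an assumed Uniform(0, 1/2) example (density 2 on its range; not from the notes):

```python
# Assumed example: Uniform(0, 1/2), whose density is 2 on (0, 1/2).
def f(x):
    return 2.0 if 0.0 < x < 0.5 else 0.0

def F(x):
    """Distribution function of Uniform(0, 1/2)."""
    return min(max(2.0 * x, 0.0), 1.0)

# The density exceeds 1 ...
assert f(0.25) == 2.0
# ... yet every probability is still at most 1; over a small region the
# probability is density x length, e.g. P[0.1 <= X <= 0.2] = 2 * 0.1:
assert abs((F(0.2) - F(0.1)) - 2.0 * 0.1) < 1e-12
print(F(0.2) - F(0.1))  # 0.2
```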

10
Q

Discrete and continuous expectation.

Expectation of a function of X.

A

Discrete:
E[X] = sum over x in R_X of x p(x)

Continuous:
E[X] = integral over (-∞, ∞) of x f(x) dx

In general, for g(X) a function of X:
E[g(X)] = sum of g(x) p(x) (discrete) or integral over (-∞, ∞) of g(x) f(x) dx (continuous)

µ = µ_X = E[X]
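The formulas above, sketched with a small assumed discrete example (a fair six-sided die, not from the notes), including E[g(X)] with g(x) = x²:

```python
from fractions import Fraction

# Assumed discrete example: fair six-sided die, p(x) = 1/6 on R_X = {1,...,6}.
p = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] = sum over R_X of x p(x)
mean = sum(x * p[x] for x in p)

# E[g(X)] = sum over R_X of g(x) p(x), here with g(x) = x^2
second_moment = sum(x**2 * p[x] for x in p)

assert mean == Fraction(7, 2)            # 3.5
assert second_moment == Fraction(91, 6)  # ~ 15.17
print(mean, second_moment)
```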

11
Q

rth moment E[X^r]

A

rth moment: take g(x) = x^r for r ∈ N in the expectation formula, giving the rth moment E[X^r]

12
Q

Variance

A

Var(X) = E [(X - µ) ^2] = E[X^2] - µ^2

σ² denotes the variance of X
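Both forms of the variance agree; a minimal check with an assumed Bernoulli(1/4) example (not from the notes), where Var(X) = p(1 − p):

```python
from fractions import Fraction

# Assumed example: Bernoulli(p) with p = 1/4, so P[X=1] = 1/4, P[X=0] = 3/4.
p = Fraction(1, 4)
dist = {0: 1 - p, 1: p}

mu = sum(x * q for x, q in dist.items())       # E[X] = p
e_x2 = sum(x**2 * q for x, q in dist.items())  # E[X^2] = p

# Both forms of the variance agree:
var_def = sum((x - mu)**2 * q for x, q in dist.items())  # E[(X - mu)^2]
var_short = e_x2 - mu**2                                  # E[X^2] - mu^2

assert var_def == var_short == p * (1 - p)  # 3/16
print(var_def)
```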

13
Q

Random variables without a mean:

Example is Cauchy distribution

A

If the sum or integral in the definition of the mean doesn’t converge, the mean doesn’t exist.

Let X be a random variable with probability density function:

f(x) = 1 / (π (1 + x²)); a random variable with this PDF is said to have a Cauchy distribution.

Calculating the mean of X: integral over (-∞, ∞) of x / (π (1 + x²)) dx = limit as s and t tend to infinity of
integral over (-t, s) of x / (π (1 + x²)) dx = limit as s and t tend to infinity of [ln(1 + x²) / (2π)] evaluated from −t to s
= limit as s and t tend to infinity of (1/(2π)) (log(1 + s²) − log(1 + t²)).
This doesn’t have a well-defined limit, hence the mean is undefined.

The Cauchy distribution is not the only example of a distribution without a defined finite mean.
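The computation above can be sketched numerically: the truncated integral has the closed form (1/(2π))(ln(1 + s²) − ln(1 + t²)), and its value depends on how s and t grow, so no single limit exists.

```python
import math

# Truncated mean integral of x / (pi (1 + x^2)) over (-t, s):
# closed form (1 / (2 pi)) (ln(1 + s^2) - ln(1 + t^2)).
def truncated_mean(t, s):
    return (math.log(1 + s**2) - math.log(1 + t**2)) / (2 * math.pi)

# Symmetric truncation (t = s) always gives 0 ...
assert abs(truncated_mean(1e6, 1e6)) < 1e-12
# ... but letting s grow twice as fast as t gives a different limit,
# ln(4) / (2 pi):
val = truncated_mean(1e6, 2e6)
assert abs(val - math.log(4) / (2 * math.pi)) < 1e-3
print(val)
```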

14
Q

The weak law of large numbers

A

The weak law of large numbers states that if we have a sequence of independent random variables X_1, X_2, … with the same distribution and with mean μ, then for any ε > 0, as n tends to infinity,
P[|X̄_n − μ| > ε] tends to 0, where X̄_n = (1/n) sum from i=1 to n of X_i.

Ie when it exists, the mean is the long-run average of samples from the distribution.

When X_1, …, X_n are independent Cauchy random variables, X̄_n also has a Cauchy distribution regardless of n. So the sample mean does not tend to a limit for large n (no defined finite mean).
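A small seeded simulation contrasting the two cases: a Normal(0, 1) sample mean settles near its mean, while a Cauchy sample mean (simulated via the assumed inverse-CDF trick tan(π(U − 1/2)), not a method from the notes) is itself Cauchy and stays erratic.

```python
import math
import random

random.seed(42)  # fixed seed so the run is reproducible

n = 100_000

# Normal(0, 1) has mean 0: the sample mean settles near 0, as the weak law says.
normal_mean = sum(random.gauss(0.0, 1.0) for _ in range(n)) / n
assert abs(normal_mean) < 0.05

# Cauchy has no mean: simulate via tan(pi (U - 1/2)) with U ~ Uniform(0, 1).
cauchy_mean = sum(math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)) / n
# X-bar_n is itself Cauchy for every n, so this value is erratic run to run
# (the seed pins it down here, but it need not be near any fixed number).
print(normal_mean, cauchy_mean)
```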

15
Q

Properties of the normal distribution

For X normal distribution

A

If X ~ N(μ, σ²):

aX + b ~ N(aμ + b, a²σ²) (given in exam)

Standardised normal: Z = (X − μ)/σ ~ N(0, 1) (given in exam)

• FOR INDEPENDENT VARIABLES
Sum of n independent X_i ~ N(μ_i, σ_i²):
sum of X_i ~ N(sum of μ_i’s, sum of σ²_i’s)
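The aX + b property can be checked on its first two moments by a seeded simulation; the particular parameter values below (μ = 2, σ = 3, a = −1.5, b = 4) are arbitrary assumptions for illustration.

```python
import random

random.seed(1)  # reproducible

mu, sigma, a, b, n = 2.0, 3.0, -1.5, 4.0, 200_000

# If X ~ N(mu, sigma^2) then aX + b ~ N(a mu + b, a^2 sigma^2):
# check mean and variance by simulation.
ys = [a * random.gauss(mu, sigma) + b for _ in range(n)]
m = sum(ys) / n
v = sum((y - m) ** 2 for y in ys) / n

assert abs(m - (a * mu + b)) < 0.1       # expect ~ 1.0
assert abs(v - a**2 * sigma**2) < 0.5    # expect ~ 20.25
print(m, v)
```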

16
Q

THE GAMMA FUNCTION

A

Γ: (0, ∞) → R

Γ(α) = integral over (0, ∞) of u^(α−1) · e^(−u) du

17
Q

Lemma 2: relating to the gamma function

A

• Γ(1) = 1

• For α > 1:
Γ(α) = (α − 1) Γ(α − 1)

• For n = 1, 2, 3, …:
Γ(n) = (n − 1)!
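All three facts can be spot-checked with the standard library's `math.gamma`:

```python
import math

# math.gamma implements the gamma function Gamma(alpha).
# Gamma(1) = 1:
assert abs(math.gamma(1.0) - 1.0) < 1e-12

# Recurrence Gamma(alpha) = (alpha - 1) Gamma(alpha - 1) for alpha > 1:
for alpha in [1.5, 2.7, 6.0]:
    assert abs(math.gamma(alpha) - (alpha - 1) * math.gamma(alpha - 1)) < 1e-9

# Gamma(n) = (n - 1)! for n = 1, 2, 3, ...
for n in range(1, 10):
    assert math.isclose(math.gamma(n), math.factorial(n - 1))
print("gamma facts verified")
```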

18
Q

THE BETA FUNCTION

A

For α, β > 0:

B(α, β) = (Γ(α) Γ(β)) / Γ(α + β)

Product over sum
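The definition translates directly into code via `math.gamma`; for integer arguments it reduces to factorials, e.g. B(2, 3) = 1!·2!/4! = 1/12.

```python
import math

def beta(a, b):
    """B(alpha, beta) = Gamma(alpha) Gamma(beta) / Gamma(alpha + beta)."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

assert math.isclose(beta(1, 1), 1.0)         # Gamma(1)^2 / Gamma(2) = 1
assert math.isclose(beta(2, 3), 1.0 / 12.0)  # 1! * 2! / 4! = 2/24
assert math.isclose(beta(2, 3), beta(3, 2))  # symmetric in its arguments
print(beta(2, 3))
```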

19
Q

Lemma 2.3 relating to gamma function and beta

A
integral over (0, ∞) of u^(α−1) · e^(−βu) du = Γ(α) / β^α

βu instead of u in the gamma function
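A numerical sketch of Lemma 2.3, with arbitrary assumed values α = 2.5 and β = 3 (not from the notes): the quadrature matches Γ(α)/β^α.

```python
import math

alpha, beta_ = 2.5, 3.0

def integrand(u):
    return u ** (alpha - 1) * math.exp(-beta_ * u)

# Trapezoid rule on (0, 20]: the tail beyond 20 is negligible (~ e^{-60}).
n = 200_000
h = 20.0 / n
total = 0.5 * (integrand(0.0) + integrand(20.0))
total += sum(integrand(i * h) for i in range(1, n))
total *= h

expected = math.gamma(alpha) / beta_ ** alpha  # Gamma(alpha) / beta^alpha
assert abs(total - expected) < 1e-4
print(total, expected)
```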

20
Q

Properties of variables with the gamma distribution

A

• For two independent variables with gamma distributions sharing the same β:

X₁ ~ Ga(α₁, β), X₂ ~ Ga(α₂, β) independent, then
X₁ + X₂ ~ Ga(α₁ + α₂, β)

• the sum of n independent Exp(λ) variables is Ga(n, λ)

(since Ga(1, λ) = Exp(λ))
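A seeded Monte Carlo check of the second bullet: the sum of n Exp(λ) samples should behave like Ga(n, λ), which has mean n/λ and variance n/λ². The values n = 5 and λ = 2 are arbitrary assumptions for illustration.

```python
import random

random.seed(7)  # reproducible

lam, n_exp, n_samples = 2.0, 5, 200_000

# Sum of n independent Exp(lambda) variables should be Ga(n, lambda),
# which has mean n / lambda and variance n / lambda^2.
sums = [sum(random.expovariate(lam) for _ in range(n_exp))
        for _ in range(n_samples)]

mean = sum(sums) / n_samples
var = sum((s - mean) ** 2 for s in sums) / n_samples

assert abs(mean - n_exp / lam) < 0.02      # expect ~ 2.5
assert abs(var - n_exp / lam ** 2) < 0.05  # expect ~ 1.25
print(mean, var)
```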

21
Q

UNIVARIATE TRANSFORMS LEMMA 3.1

A

Suppose that g: R_X → R is

STRICTLY MONOTONE on R_X.
Then the PDF!!!! of Y = g(X) is

f_Y(y) =

{ f_X(g⁻¹(y)) · |d g⁻¹/dy| for y in g(R_X)
{ 0 otherwise

Steps:
• check the PDF of X and identify R_X
• find the function g such that Y = g(X)
• check g is STRICTLY MONOTONE; find g⁻¹, d g⁻¹/dy, g(R_X)

Use a sketch!!!
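The steps above, sketched with an assumed example (not from the notes): X ~ Exp(1) on R_X = (0, ∞) and the strictly increasing map g(x) = 2x + 1, so g⁻¹(y) = (y − 1)/2, |d g⁻¹/dy| = 1/2, and g(R_X) = (1, ∞). Integrating the lemma's f_Y recovers the directly computed distribution function.

```python
import math

# Assumed example: X ~ Exp(1), Y = g(X) with g(x) = 2x + 1 (strictly monotone).
def f_X(x):
    return math.exp(-x) if x > 0 else 0.0

# Lemma 3.1: f_Y(y) = f_X(g^{-1}(y)) * |d g^{-1}/dy| on g(R_X) = (1, inf).
def f_Y(y):
    return f_X((y - 1) / 2) * 0.5 if y > 1 else 0.0

# Direct computation: F_Y(y) = P[2X + 1 <= y] = 1 - e^{-(y - 1)/2} for y > 1.
def F_Y(y):
    return 1.0 - math.exp(-(y - 1) / 2) if y > 1 else 0.0

# Integrate f_Y from 1 to y (trapezoid) and compare with F_Y(y):
y_end, n = 6.0, 100_000
h = (y_end - 1.0) / n
area = (0.5 * (f_Y(1.0 + 1e-12) + f_Y(y_end))
        + sum(f_Y(1.0 + i * h) for i in range(1, n))) * h
assert abs(area - F_Y(y_end)) < 1e-6
print(area, F_Y(y_end))
```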

22
Q

Transforms: lemma 3.1 when g is not strictly monotone on the range

A

Work region by region and use probabilities to find F_Y(y) directly.

Differentiate to get the PDF.

E.g. Y = X² implies {Y ≤ y} = {−√y ≤ X ≤ √y}; use integrals over this region to find the probability.

Check the range!!! Be careful!!!!
Check the range!!! Be careful!!!!