Maths Flashcards

(10 cards)

1
Q: What is the cost function for Linear Regression (Mean Squared Error)?
A: J(θ) = (1/2n) Σ (yᵢ - (θ₀ + θ₁xᵢ))²

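A quick sketch of this cost in Python (not part of the card; the data points and θ values below are made up for illustration):

```python
def mse_cost(theta0, theta1, xs, ys):
    """J(θ) = (1/2n) Σ (yᵢ - (θ₀ + θ₁xᵢ))²"""
    n = len(xs)
    return sum((y - (theta0 + theta1 * x)) ** 2 for x, y in zip(xs, ys)) / (2 * n)

# Toy data lying exactly on y = 2x, so θ₀ = 0, θ₁ = 2 gives zero cost.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]
print(mse_cost(0.0, 2.0, xs, ys))  # 0.0
```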
2
Q: What is the hypothesis function for Linear Regression?
A: h(x) = θ₀ + θ₁x

3
Q: What is the logistic (sigmoid) function formula?
A: σ(z) = 1 / (1 + exp(-z))

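The sigmoid is a one-liner in Python (an illustrative sketch, not part of the card):

```python
import math

def sigmoid(z):
    # σ(z) = 1 / (1 + exp(-z)): maps any real z into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0.0))  # 0.5
```

Note that σ(0) = 0.5, and the function saturates toward 0 and 1 for large negative and positive z.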
4
Q: What is the cost function for Logistic Regression?
A: J(θ) = -(1/n) Σ [yᵢ log(h(xᵢ)) + (1 - yᵢ) log(1 - h(xᵢ))]

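The same cross-entropy cost as a Python sketch (not part of the card; the labels and predicted probabilities are made up):

```python
import math

def logistic_cost(ys, preds):
    # J(θ) = -(1/n) Σ [yᵢ log(h(xᵢ)) + (1 - yᵢ) log(1 - h(xᵢ))]
    # ys are 0/1 labels, preds are the model outputs h(xᵢ) in (0, 1)
    n = len(ys)
    return -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                for y, p in zip(ys, preds)) / n

# Both examples predicted confidently and correctly, so the cost is small.
print(logistic_cost([1, 0], [0.9, 0.1]))
```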
5
Q: What is the Ridge Regression cost function?
A: J(θ) = (1/2n) Σ (yᵢ - h(xᵢ))² + λ Σ θⱼ²

6
Q: What is the Lasso Regression cost function?
A: J(θ) = (1/2n) Σ (yᵢ - h(xᵢ))² + λ Σ |θⱼ|

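The only difference between the Ridge and Lasso cards is the penalty term, which a short Python sketch makes concrete (the θ values and λ below are made up; in practice the intercept θ₀ is usually excluded from the sum):

```python
def ridge_penalty(thetas, lam):
    # λ Σ θⱼ² : the L2 penalty from the Ridge cost
    return lam * sum(t ** 2 for t in thetas)

def lasso_penalty(thetas, lam):
    # λ Σ |θⱼ| : the L1 penalty from the Lasso cost
    return lam * sum(abs(t) for t in thetas)

thetas = [3.0, -0.5]
print(ridge_penalty(thetas, 0.1))  # ≈ 0.925
print(lasso_penalty(thetas, 0.1))  # ≈ 0.35
```

The L2 penalty grows quadratically with each coefficient, while the L1 penalty grows linearly, which is why Lasso tends to drive small coefficients exactly to zero.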
7
Q: What is the formula for Bayes' theorem?
A: P(A|B) = [P(B|A) × P(A)] / P(B)

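A worked numeric example of Bayes' theorem (all probabilities below are made up: a diagnostic test with 99% sensitivity, a 5% false-positive rate, and 1% prevalence):

```python
p_a = 0.01              # P(A): prior probability of the condition
p_b_given_a = 0.99      # P(B|A): probability of a positive test given A
p_b_given_not_a = 0.05  # P(B|¬A): false-positive rate

# P(B) via the law of total probability
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# P(A|B) = [P(B|A) × P(A)] / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))  # 0.167
```

Despite the accurate test, the posterior is only about 17%, because the prior P(A) is so small.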
8
Q: What is the equation for a simple decision boundary in LDA?
A: δₖ(x) = xᵀΣ⁻¹μₖ - (1/2)μₖᵀΣ⁻¹μₖ + log(πₖ) (the linear discriminant score for class k; the boundary between classes k and l lies where δₖ(x) = δₗ(x))

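A small sketch of the discriminant score in Python for 2-D inputs (not part of the card; the means, shared covariance, and priors below are invented, with Σ⁻¹ taken as the identity for simplicity):

```python
import math

def lda_score(x, mu, sigma_inv, pi_k):
    # δₖ(x) = xᵀΣ⁻¹μₖ - (1/2)μₖᵀΣ⁻¹μₖ + log(πₖ), written out for 2-D vectors
    sigma_inv_mu = [sum(sigma_inv[i][j] * mu[j] for j in range(2)) for i in range(2)]
    term1 = sum(x[i] * sigma_inv_mu[i] for i in range(2))       # xᵀΣ⁻¹μₖ
    term2 = 0.5 * sum(mu[i] * sigma_inv_mu[i] for i in range(2))  # (1/2)μₖᵀΣ⁻¹μₖ
    return term1 - term2 + math.log(pi_k)

sigma_inv = [[1.0, 0.0], [0.0, 1.0]]   # identity inverse covariance (assumed)
mu0, mu1 = [0.0, 0.0], [2.0, 2.0]      # class means (made up)
x = [1.8, 1.9]                         # point to classify
s0 = lda_score(x, mu0, sigma_inv, 0.5)
s1 = lda_score(x, mu1, sigma_inv, 0.5)
print(1 if s1 > s0 else 0)  # 1: x is nearer the class-1 mean
```

Classification picks the class with the highest score, so the decision boundary is exactly the set of x where the scores tie.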
9
Q: What is the PCA objective function?
A: Maximize the variance of the projections: maximize (1/n) Σ zᵢ², where the zᵢ are the projections of the (centered) data onto the principal components.

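The objective can be sketched in Python by comparing the projected variance along two candidate unit directions (the data points below are made up and already centered):

```python
def projected_variance(points, w):
    # (1/n) Σ zᵢ² with zᵢ = xᵢ·w; valid as a variance because the data is centered
    zs = [x * w[0] + y * w[1] for x, y in points]
    return sum(z ** 2 for z in zs) / len(zs)

points = [(-2.0, -1.0), (0.0, 0.0), (2.0, 1.0)]  # mean is (0, 0)
print(projected_variance(points, (1.0, 0.0)))  # variance along the x-axis
print(projected_variance(points, (0.0, 1.0)))  # variance along the y-axis
```

PCA chooses the unit direction w that maximizes this quantity; here the x-axis captures more variance than the y-axis, so the first component leans toward it.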
10
Q: What is the update rule for Gradient Descent?
A: θ := θ - α ∇J(θ)

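The update rule in action, as a minimal Python sketch on the one-parameter cost J(θ) = (θ - 3)², whose minimum is at θ = 3 (the learning rate α and iteration count are arbitrary choices):

```python
def grad_J(theta):
    # ∇J(θ) for J(θ) = (θ - 3)²
    return 2 * (theta - 3)

theta, alpha = 0.0, 0.1
for _ in range(100):
    theta = theta - alpha * grad_J(theta)  # θ := θ - α ∇J(θ)
print(round(theta, 4))  # 3.0
```

Each step moves θ against the gradient; with a suitable α the iterates converge to the minimizer.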