Core Probability & Random Variables Flashcards

(44 cards)

1
Q

What is a sample space in probability theory?

A

The set of all possible basic outcomes of a random experiment.

2
Q

What is an event in probability?

A

Any subset of the sample space, representing a collection of outcomes we care about.

3
Q

How is the probability of an event informally interpreted?

A

As a number between 0 and 1 representing how likely the event is to occur, with 0 impossible and 1 certain.

4
Q

What are the three basic properties any probability measure must satisfy?

A

Non-negativity, P(sample space)=1, and additivity for mutually exclusive events (P(A ∪ B)=P(A)+P(B) when A and B cannot both occur).

5
Q

What is the complement of an event A?

A

The event consisting of all outcomes in the sample space that are not in A.

6
Q

How is the probability of the complement of A related to P(A)?

A

P(Aᶜ) = 1 − P(A).

7
Q

What does it mean for two events A and B to be mutually exclusive (disjoint)?

A

They cannot happen at the same time, so their intersection is empty and P(A ∩ B)=0.

8
Q

What is the general addition rule for probabilities of two events A and B?

A

P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
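
The addition rule can be checked by brute-force enumeration on a finite sample space. A minimal Python sketch with a fair six-sided die (the events A and B below are illustrative choices, not from the card):

```python
from fractions import Fraction

# Sample space: faces of a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Probability of an event under the uniform measure on omega."""
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}   # roll is even
B = {4, 5, 6}   # roll is greater than 3

# General addition rule: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
lhs = prob(A | B)
rhs = prob(A) + prob(B) - prob(A & B)
assert lhs == rhs == Fraction(2, 3)
```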

9
Q

What is conditional probability P(A|B)?

A

The probability that event A occurs given that event B has occurred, defined as P(A ∩ B) / P(B) when P(B) > 0.
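
The definition can be verified on a small finite example; a sketch assuming a fair die, with illustrative events:

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}          # fair six-sided die

def prob(event):
    return Fraction(len(event), len(omega))

A = {6}            # roll is a six
B = {2, 4, 6}      # roll is even

# Definition: P(A|B) = P(A ∩ B) / P(B)
p_a_given_b = prob(A & B) / prob(B)
assert p_a_given_b == Fraction(1, 3)

# Same answer by restricting the sample space to B:
assert Fraction(len(A & B), len(B)) == Fraction(1, 3)
```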

10
Q

How can you interpret P(A|B) intuitively?

A

It is the probability of A when we restrict our attention to the subset of outcomes where B has happened.

11
Q

What is the multiplication rule relating joint and conditional probability?

A

P(A ∩ B) = P(A|B) · P(B) = P(B|A) · P(A).

12
Q

What does it mean for two events A and B to be independent?

A

Knowing that one occurs gives no information about the other, so P(A ∩ B) = P(A)P(B) and P(A|B) = P(A).

13
Q

Can events be mutually exclusive and independent at the same time (with nonzero probabilities)?

A

No; if they are mutually exclusive and both have nonzero probability, then P(A ∩ B)=0≠P(A)P(B).

14
Q

What is Bayes’ theorem for two events A and B?

A

P(A|B) = P(B|A)P(A) / P(B), assuming P(B) > 0.

15
Q

Why is Bayes’ theorem important in ML contexts?

A

It provides a way to invert conditional probabilities and update beliefs about hypotheses given observed evidence.

16
Q

What is the law of total probability?

A

If {B₁,…,Bₙ} is a partition of the sample space, then P(A) = Σ P(A|Bᵢ)P(Bᵢ).
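
Bayes’ theorem and the law of total probability combine in the classic diagnostic-test calculation. A sketch with assumed illustrative numbers (the prevalence, sensitivity, and false-positive rate are made up for the example):

```python
from fractions import Fraction

# Assumed illustrative numbers (not from the cards):
p_disease = Fraction(1, 100)                 # P(D): prevalence
p_pos_given_disease = Fraction(99, 100)      # P(+|D): sensitivity
p_pos_given_healthy = Fraction(5, 100)       # P(+|Dᶜ): false-positive rate

# Law of total probability: P(+) = P(+|D)P(D) + P(+|Dᶜ)P(Dᶜ)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(D|+) = P(+|D)P(D) / P(+)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
assert p_disease_given_pos == Fraction(1, 6)
```

Even with a 99%-sensitive test, a positive result here gives only a 1/6 chance of disease, because the healthy population is so much larger.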

17
Q

What is a random variable?

A

A function that maps outcomes in the sample space to numeric values, allowing us to analyze numerical aspects of randomness.

18
Q

What is the difference between a discrete and a continuous random variable?

A

A discrete random variable takes values in a countable set; a continuous random variable takes values in an interval or continuum of real numbers.

19
Q

What is a probability mass function (pmf)?

A

A function p(x) that gives P(X = x) for a discrete random variable X.

20
Q

What two key properties must a pmf satisfy?

A

p(x) ≥ 0 for all x, and the sum over all possible x of p(x) equals 1.
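
Both properties are easy to verify for a concrete pmf built by enumeration; a sketch for the sum of two fair dice:

```python
from fractions import Fraction
from itertools import product

# pmf of the sum of two fair dice, built by enumerating all 36 outcomes
pmf = {}
for a, b in product(range(1, 7), repeat=2):
    s = a + b
    pmf[s] = pmf.get(s, Fraction(0)) + Fraction(1, 36)

# The two defining properties of a pmf:
assert all(p >= 0 for p in pmf.values())      # non-negativity
assert sum(pmf.values()) == 1                 # total mass 1

# e.g. the most likely sum is 7, with probability 6/36
assert pmf[7] == Fraction(1, 6)
```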

21
Q

What is a probability density function (pdf)?

A

A nonnegative function f(x) such that the probability that a continuous random variable X lies in an interval is given by the integral of f over that interval.

22
Q

Why does it not make sense to ask for P(X = x) for a continuous random variable with a pdf?

A

For a continuous random variable with a density, P(X = x) = 0 for every individual point x; positive probability is assigned only to intervals, via integrals of the pdf.

23
Q

What is a cumulative distribution function (cdf)?

A

A function F(x) = P(X ≤ x) that gives the probability that the random variable X is less than or equal to x.

24
Q

How is the cdf of a discrete random variable related to its pmf?

A

F(x) is the sum of p(t) over all t ≤ x.
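
This relationship is directly computable; a minimal sketch assuming a fair six-sided die:

```python
from fractions import Fraction

# pmf of a fair six-sided die
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def cdf(x):
    """F(x) = P(X ≤ x): sum of p(t) over all t ≤ x."""
    return sum(p for t, p in pmf.items() if t <= x)

assert cdf(0) == 0          # below the support
assert cdf(3) == Fraction(1, 2)
assert cdf(6) == 1          # entire support covered
```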

25
Q

How is the cdf of a continuous random variable related to its pdf?

A

F(x) is the integral of f(t) from −∞ up to x.

26
Q

What does it mean for two random variables X and Y to be identically distributed?

A

They share the same probability distribution (same pmf or pdf), even though they may be entirely different variables and need not be related or observed together.

27
Q

What does it mean for random variables X₁,…,Xₙ to be independent?

A

For any selection of values, the joint probability factorizes into the product of the individual probabilities, e.g., P(X₁=x₁,…,Xₙ=xₙ) = ∏ P(Xᵢ=xᵢ).

28
Q

What is a joint distribution of two random variables X and Y?

A

A function (pmf or pdf) that gives probabilities (or densities) for all pairs (x,y) together, not just individually.

29
Q

What is a marginal distribution?

A

The distribution of one variable obtained from a joint distribution by summing or integrating over the other variable(s).

30
Q

How do you obtain the marginal pmf of X from a joint pmf of X and Y?

A

Sum the joint pmf over all possible values of Y: p_X(x) = Σ_y p_{X,Y}(x,y).
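
Marginalization is a one-loop computation for a finite joint pmf; a sketch with a hypothetical joint distribution on {0,1} × {0,1}:

```python
from fractions import Fraction

# Hypothetical joint pmf of (X, Y) on {0,1} × {0,1}
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8),
}

# Marginal of X: sum the joint pmf over all values of Y
p_x = {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, Fraction(0)) + p

assert p_x == {0: Fraction(1, 2), 1: Fraction(1, 2)}
```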
31
Q

How do you obtain the marginal pdf of X from a joint pdf of X and Y?

A

Integrate the joint pdf over all values of Y: f_X(x) = ∫ f_{X,Y}(x,y) dy.

32
Q

What is the conditional distribution of Y given X=x?

A

A distribution with pmf or pdf proportional to the joint distribution at X=x, normalized so probabilities sum or integrate to 1.

33
Q

How is the conditional pmf p(Y=y | X=x) related to the joint and marginal pmfs?

A

p(Y=y | X=x) = p_{X,Y}(x,y) / p_X(x), assuming p_X(x) > 0.

34
Q

How is the concept of independence expressed in terms of joint and marginal distributions?

A

X and Y are independent if and only if their joint distribution factorizes: p_{X,Y}(x,y) = p_X(x)p_Y(y) or f_{X,Y}(x,y) = f_X(x)f_Y(y).
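
The factorization criterion can be checked exhaustively for a small joint pmf; a sketch assuming two independent fair coins:

```python
from fractions import Fraction
from itertools import product

# Joint pmf of two independent fair coins (0 = tails, 1 = heads)
joint = {(x, y): Fraction(1, 4) for x, y in product((0, 1), repeat=2)}

# Marginals, obtained by summing out the other variable
p_x = {x: sum(joint[(x, y)] for y in (0, 1)) for x in (0, 1)}
p_y = {y: sum(joint[(x, y)] for x in (0, 1)) for y in (0, 1)}

# Independence ⇔ the joint factorizes into the product of marginals
assert all(joint[(x, y)] == p_x[x] * p_y[y] for x, y in joint)
```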
35
Q

What is the expectation (mean) of a random variable at a high level?

A

The long-run average value of the variable if we could repeat the random experiment many times.

36
Q

How is the expectation of a discrete random variable X with pmf p(x) defined?

A

E[X] = Σ x · p(x), summing over all possible values x.
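
A worked instance, assuming a fair six-sided die:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die

# E[X] = Σ x · p(x)
mean = sum(x * p for x, p in pmf.items())
assert mean == Fraction(7, 2)   # 3.5
```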
37
Q

How is the expectation of a continuous random variable X with pdf f(x) defined?

A

E[X] = ∫ x · f(x) dx, integrating over the support of X.

38
Q

What does linearity of expectation mean?

A

For any random variables X and Y and constants a, b, E[aX + bY] = aE[X] + bE[Y], regardless of whether X and Y are independent.
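
Linearity holds even under strong dependence; a sketch where Y = 7 − X is completely determined by a die roll X (an illustrative construction):

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)           # fair die

# X is a fair die roll; Y = 7 − X is completely determined by X
e_x = sum(x * p for x in faces)
e_y = sum((7 - x) * p for x in faces)

# E[2X + 3Y] computed directly over outcomes...
e_combo = sum((2 * x + 3 * (7 - x)) * p for x in faces)

# ...equals 2·E[X] + 3·E[Y] even though X and Y are dependent
assert e_combo == 2 * e_x + 3 * e_y == Fraction(35, 2)
```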
39
Q

What is the variance of a random variable at a high level?

A

A measure of how much the variable’s values spread around its mean.

40
Q

How is variance Var(X) defined in terms of expectation?

A

Var(X) = E[(X − E[X])²].
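
A worked instance for a fair die, also checking the common shortcut Var(X) = E[X²] − (E[X])²:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die
mean = sum(x * p for x, p in pmf.items())        # 7/2

# Var(X) = E[(X − E[X])²]
var = sum((x - mean) ** 2 * p for x, p in pmf.items())
assert var == Fraction(35, 12)

# Equivalent shortcut: Var(X) = E[X²] − (E[X])²
assert sum(x * x * p for x, p in pmf.items()) - mean ** 2 == var
```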
41
Q

What is the standard deviation of a random variable?

A

The square root of the variance, giving a measure of spread in the same units as the original variable.

42
Q

What is covariance between two random variables X and Y (intuitively)?

A

A measure of how X and Y vary together: positive if they tend to move up and down together, negative if they tend to move in opposite directions.

43
Q

Why is correlation often used instead of raw covariance?

A

Correlation normalizes covariance to lie between −1 and 1, making it easier to interpret the strength and direction of a linear relationship.

44
Q

Why does zero covariance (or correlation) not necessarily imply independence?

A

Because two variables can have a nonlinear relationship that makes their covariance zero while still being statistically dependent.
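
The standard counterexample: X uniform on {−1, 0, 1} with Y = X² has zero covariance even though Y is a function of X. A sketch:

```python
from fractions import Fraction

# X uniform on {−1, 0, 1}, and Y = X² is a deterministic function of X
xs = (-1, 0, 1)
p = Fraction(1, 3)

e_x = sum(x * p for x in xs)                 # E[X]  = 0
e_y = sum(x * x * p for x in xs)             # E[Y]  = 2/3
e_xy = sum(x * (x * x) * p for x in xs)      # E[XY] = E[X³] = 0

# Cov(X, Y) = E[XY] − E[X]E[Y] = 0, yet Y is fully determined by X:
cov = e_xy - e_x * e_y
assert cov == 0
# Dependence: P(Y=0 | X=0) = 1, but P(Y=0) = 1/3.
```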