Probability Class 12 Notes

Probability is the final chapter in Class 12 Maths and one of the most important for CBSE boards (6–8 marks). Building on Class 11 basics, this chapter introduces conditional probability, Bayes’ theorem, and probability distributions. These concepts are widely used in data science, machine learning, and everyday decision-making.

Key Concepts

1. Conditional Probability

The probability of event E given that event F has already occurred:

P(E|F) = P(E ∩ F) / P(F), provided P(F) ≠ 0

Properties:
• P(S|F) = 1 (where S is sample space)
• P(E’|F) = 1 − P(E|F)
• P((A ∪ B)|F) = P(A|F) + P(B|F) − P(A ∩ B|F)
💡 Think of it this way: Conditional probability “shrinks” the sample space to F. We only look at outcomes within F and check how many of those also belong to E.
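The "shrunken sample space" idea can be checked directly in Python on any finite, equally likely sample space. The die-roll events below (E = even, F = greater than 3) are hypothetical, chosen just for illustration:

```python
from fractions import Fraction

# Conditional probability on an equally likely sample space:
# restrict attention to F, then count how much of F also lies in E.
S = {1, 2, 3, 4, 5, 6}   # one roll of a fair die
E = {2, 4, 6}            # E: even number
F = {4, 5, 6}            # F: number greater than 3

# P(E|F) = |E ∩ F| / |F| when all outcomes are equally likely
p_E_given_F = Fraction(len(E & F), len(F))
print(p_E_given_F)  # 2/3
```

Note that the full sample space S never enters the final ratio: conditioning on F really does replace S with F.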

2. Multiplication Theorem

P(A ∩ B) = P(A) × P(B|A) = P(B) × P(A|B)

For three events:
P(A ∩ B ∩ C) = P(A) × P(B|A) × P(C|A ∩ B)
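As a quick numerical sketch of the multiplication theorem (the card-deck numbers here are hypothetical, not from the notes' examples):

```python
from fractions import Fraction

# Multiplication theorem: P(A ∩ B) = P(A) × P(B|A).
# Drawing two aces in a row from a 52-card deck without replacement.
p_first_ace = Fraction(4, 52)               # P(A): 4 aces among 52 cards
p_second_ace_given_first = Fraction(3, 51)  # P(B|A): 3 aces left among 51
p_both_aces = p_first_ace * p_second_ace_given_first
print(p_both_aces)  # 1/221
```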

3. Independent Events

Events A and B are independent if:
P(A ∩ B) = P(A) × P(B)

Equivalently: P(A|B) = P(A) and P(B|A) = P(B)
(Knowing one event doesn’t change the probability of the other)
💡 Independent ≠ Mutually Exclusive!
• Mutually exclusive: A ∩ B = ∅ (can’t happen together)
• Independent: knowing one doesn’t affect the other
• If A and B are mutually exclusive with P(A) > 0, P(B) > 0, they CANNOT be independent!
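The definition P(A ∩ B) = P(A) × P(B) can be verified by brute-force enumeration. A minimal sketch with two coin tosses (events chosen here for illustration):

```python
from fractions import Fraction
from itertools import product

# Check independence by enumeration: two tosses of a fair coin.
# A = "first toss is heads", B = "second toss is heads".
outcomes = list(product("HT", repeat=2))       # 4 equally likely outcomes
A = {o for o in outcomes if o[0] == "H"}
B = {o for o in outcomes if o[1] == "H"}

def p(event):
    return Fraction(len(event), len(outcomes))

print(p(A & B) == p(A) * p(B))  # True -> A and B are independent
```

Note A ∩ B = {("H", "H")} is non-empty, so A and B are independent but not mutually exclusive, in line with the warning above.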

4. Total Probability Theorem

If E₁, E₂, …, Eₙ form a partition of sample space S (mutually exclusive and exhaustive), then for any event A:

P(A) = P(E₁)P(A|E₁) + P(E₂)P(A|E₂) + … + P(Eₙ)P(A|Eₙ)

= ∑ P(Eᵢ) × P(A|Eᵢ)
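The weighted sum is easy to compute once the partition's priors and likelihoods are tabulated. This sketch uses a hypothetical two-bag setup (one of two bags chosen at random, then a red ball drawn):

```python
from fractions import Fraction

# Total probability: P(A) = Σ P(Eᵢ) × P(A|Eᵢ) over a partition E₁, E₂.
priors = [Fraction(1, 2), Fraction(1, 2)]        # P(E₁), P(E₂)
likelihoods = [Fraction(3, 7), Fraction(5, 11)]  # P(A|E₁), P(A|E₂)

p_A = sum(pr * lk for pr, lk in zip(priors, likelihoods))
print(p_A)  # 34/77
```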

5. Bayes’ Theorem

P(Eᵢ|A) = P(Eᵢ) × P(A|Eᵢ) / ∑ P(Eⱼ) × P(A|Eⱼ)

In words: Posterior = (Prior × Likelihood) / Evidence
💡 When to use Bayes’ Theorem: When you know the “forward” probabilities (cause → effect) and need to find the “reverse” probability (effect → cause). Example: Given that a product is defective, what’s the probability it came from factory A?
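The factory example in the tip can be sketched numerically. All figures below (market shares, defect rates) are hypothetical, invented only to show the posterior = prior × likelihood / evidence structure:

```python
from fractions import Fraction

# Bayes' theorem: P(Eᵢ|A) = P(Eᵢ) × P(A|Eᵢ) / Σⱼ P(Eⱼ) × P(A|Eⱼ).
# Hypothetical setup: factory A makes 40% of items with a 2% defect
# rate; factory B makes 60% with a 5% defect rate.
priors = {"A": Fraction(2, 5), "B": Fraction(3, 5)}        # P(Eᵢ)
likelihoods = {"A": Fraction(1, 50), "B": Fraction(1, 20)}  # P(A|Eᵢ)

evidence = sum(priors[f] * likelihoods[f] for f in priors)  # P(defective)
posterior_A = priors["A"] * likelihoods["A"] / evidence     # P(A | defective)
print(posterior_A)  # 4/19
```

The denominator is exactly the total probability theorem from the previous section, which is why the two are always taught together.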

6. Random Variable and Probability Distribution

A random variable X is a real-valued function on the sample space. It assigns a number to each outcome.

Probability Distribution: A table listing all possible values of X with their probabilities.

Requirements:
• 0 ≤ P(X = xᵢ) ≤ 1 for each value
• ∑ P(X = xᵢ) = 1 (probabilities must sum to 1)
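Both requirements can be checked mechanically. A minimal sketch, using a hypothetical distribution table:

```python
from fractions import Fraction

# Validity check for a probability distribution (hypothetical table).
dist = {0: Fraction(1, 8), 1: Fraction(3, 8),
        2: Fraction(3, 8), 3: Fraction(1, 8)}

valid = all(0 <= p <= 1 for p in dist.values()) and sum(dist.values()) == 1
print(valid)  # True
```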

7. Mean and Variance of a Random Variable

Mean (Expected Value): E(X) = μ = ∑ xᵢ × P(xᵢ)

Variance: Var(X) = E(X²) − [E(X)]² = ∑ xᵢ² × P(xᵢ) − μ²

Standard Deviation: σ = √Var(X)
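These formulas translate line by line into code. The distribution below (number of heads in 3 fair-coin tosses) is used as a worked input:

```python
from fractions import Fraction

# Mean and variance of a discrete random variable X.
# Table: number of heads in 3 tosses of a fair coin.
dist = {0: Fraction(1, 8), 1: Fraction(3, 8),
        2: Fraction(3, 8), 3: Fraction(1, 8)}

mean = sum(x * p for x, p in dist.items())         # E(X) = Σ xᵢ P(xᵢ)
mean_sq = sum(x * x * p for x, p in dist.items())  # E(X²)
variance = mean_sq - mean ** 2                     # Var(X) = E(X²) − [E(X)]²
print(mean, variance)  # 3/2 3/4
```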

8. Bernoulli Trials and Binomial Distribution

Bernoulli Trial: An experiment with exactly two outcomes — Success (p) and Failure (q = 1−p)

Binomial Distribution: For n independent Bernoulli trials:
P(X = r) = ⁿCᵣ × pʳ × qⁿ⁻ʳ

where r = 0, 1, 2, …, n

Mean: E(X) = np
Variance: Var(X) = npq
💡 Identifying Binomial Distribution: Check these conditions:
1. Fixed number of trials (n)
2. Each trial has exactly 2 outcomes
3. Probability of success (p) is constant across trials
4. Trials are independent
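A minimal sketch of the binomial formula, applied to n = 3 fair-coin tosses so the output can be compared against the solved example later in these notes:

```python
from fractions import Fraction
from math import comb

# Binomial distribution: P(X = r) = C(n, r) × p^r × q^(n−r).
def binomial_pmf(n, r, p):
    q = 1 - p
    return comb(n, r) * p**r * q**(n - r)

n, p = 3, Fraction(1, 2)
print([binomial_pmf(n, r, p) for r in range(n + 1)])  # [1/8, 3/8, 3/8, 1/8]
print(n * p, n * p * (1 - p))  # mean np = 3/2, variance npq = 3/4
```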

Important Definitions

• Conditional Probability: P(A|B) — probability of A given B has occurred
• Independent Events: events where P(A ∩ B) = P(A) × P(B)
• Partition of S: events E₁, E₂, …, Eₙ that are mutually exclusive and exhaustive
• Bayes’ Theorem: formula to find reverse conditional probability
• Random Variable: real-valued function defined on sample space
• Binomial Distribution: distribution for n independent Bernoulli trials

Solved Examples — NCERT-Based

Example 1: A fair die is rolled. If E = {1,3,5} and F = {2,3}, find P(E|F).

Solution:

E ∩ F = {3}

P(E ∩ F) = 1/6, P(F) = 2/6 = 1/3

P(E|F) = P(E ∩ F)/P(F) = (1/6)/(1/3) = 1/2

Example 2 (Bayes’ Theorem): Bag I has 3 red, 4 black balls. Bag II has 5 red, 6 black balls. One bag is chosen at random and a ball is drawn. If the ball is red, find the probability it came from Bag I.

Solution:

Let E₁ = Bag I chosen, E₂ = Bag II chosen, A = red ball drawn

P(E₁) = 1/2, P(E₂) = 1/2

P(A|E₁) = 3/7, P(A|E₂) = 5/11

By Bayes’ theorem:

P(E₁|A) = P(E₁)×P(A|E₁) / [P(E₁)×P(A|E₁) + P(E₂)×P(A|E₂)]

= (1/2 × 3/7) / (1/2 × 3/7 + 1/2 × 5/11)

= (3/14) / (3/14 + 5/22) = (3/14) / ((33 + 35)/154) = (3/14) / (68/154)

= (3/14) × (154/68) = (3 × 11)/68 = 33/68
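The fraction arithmetic in Example 2 can be double-checked with exact fractions:

```python
from fractions import Fraction

# Re-deriving Example 2's posterior with exact fractions.
p_E1, p_E2 = Fraction(1, 2), Fraction(1, 2)         # each bag equally likely
p_A_E1, p_A_E2 = Fraction(3, 7), Fraction(5, 11)    # P(red | bag)

posterior = (p_E1 * p_A_E1) / (p_E1 * p_A_E1 + p_E2 * p_A_E2)
print(posterior)  # 33/68
```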

Example 3: Find the probability distribution of number of heads in 3 tosses of a fair coin.

Solution:

n = 3, p = 1/2, q = 1/2. X = number of heads (0, 1, 2, 3)

X = 0: P(X) = ³C₀(1/2)³ = 1/8
X = 1: P(X) = ³C₁(1/2)³ = 3/8
X = 2: P(X) = ³C₂(1/2)³ = 3/8
X = 3: P(X) = ³C₃(1/2)³ = 1/8

Mean = np = 3 × 1/2 = 3/2

Variance = npq = 3 × 1/2 × 1/2 = 3/4

Example 4: The probability that a student passes maths is 2/3 and physics is 4/9. Assuming independence, find the probability of passing at least one subject.

Solution:

P(M) = 2/3, P(P) = 4/9

P(at least one) = 1 − P(none) = 1 − P(M’) × P(P’)

= 1 − (1/3)(5/9) = 1 − 5/27 = 22/27
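Example 4's complement-rule calculation, verified with exact fractions:

```python
from fractions import Fraction

# Example 4: P(at least one) = 1 − P(M′) × P(P′), assuming independence.
p_M, p_P = Fraction(2, 3), Fraction(4, 9)
p_at_least_one = 1 - (1 - p_M) * (1 - p_P)
print(p_at_least_one)  # 22/27
```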

Important Questions for Board Exams

1 Mark Questions

  1. If P(A) = 0.6, P(B) = 0.3 and P(A ∩ B) = 0.2, find P(A|B).
  2. If A and B are independent events with P(A) = 0.3 and P(B) = 0.4, find P(A ∩ B).

2 Mark Questions

  1. A couple has 2 children. Find the probability that both are boys given that at least one is a boy.
  2. A die is thrown twice. Events A = “sum is 8” and B = “first throw is 4”. Are A and B independent?

3 Mark Questions

  1. Find the mean and variance of the number of heads in 4 tosses of a coin.
  2. A and B independently try to solve a problem. P(A solves) = 1/3, P(B solves) = 1/4. Find the probability that the problem is solved.

5 Mark Questions

  1. Three machines A, B, C produce 25%, 35%, 40% of a factory’s output. Defect rates are 5%, 4%, 2% respectively. A randomly chosen item is defective. Find the probability it was produced by machine C. (Bayes’ theorem)
  2. A random variable X has the distribution: P(X=0) = 3k³, P(X=1) = 4k−10k², P(X=2) = 5k−1. Find k, mean, and variance.

Quick Revision Points

  • P(A|B) = P(A∩B)/P(B) — “reduce” the sample space to B
  • Independent ⟹ P(A∩B) = P(A)×P(B); do NOT confuse with mutually exclusive
  • Bayes’ theorem reverses conditional probability: effect → cause
  • Total probability = weighted sum over partition
  • Probability distribution: all P(X) must be between 0 and 1, and sum to 1
  • Mean E(X) = ∑xᵢP(xᵢ), Variance = E(X²) − [E(X)]²
  • Binomial: P(X=r) = ⁿCᵣ pʳ qⁿ⁻ʳ, Mean = np, Variance = npq
  • Bayes’ theorem problems are very common (5 marks) — practice the table method!
