Complete Summary and Solutions for Probability – NCERT Class XII Mathematics Part II, Chapter 13 – Probability Theory, Random Experiments, Events, Axioms, and Theorems
Comprehensive summary and detailed explanation of Chapter 13 'Probability' from the NCERT Class XII Mathematics Part II textbook, covering concepts of probability theory, random experiments, sample space, events, classical and axiomatic definitions of probability, properties and laws of probability, conditional probability, independent events, Bayes' theorem, with solved examples and all NCERT exercises and solutions.

Probability
Chapter 13: Mathematics - Ultimate Study Guide | NCERT Class 12 Notes, Solved Examples, Exercises & Quiz 2025
Full Chapter Summary & Detailed Notes - Probability Class 12 NCERT
The theory of probabilities is simply the science of logic quantitatively treated. – C. S. PEIRCE
13.1 Introduction
In earlier classes, we studied probability as a measure of uncertainty in the outcomes of random experiments. We discussed Kolmogorov's axiomatic approach, treated probability as a function on events, and established its equivalence with the classical definition when the outcomes are equally likely. We obtained probabilities of events in discrete sample spaces and established the addition rule of probability. This chapter introduces conditional probability P(E|F), the multiplication rule, independence of events, Bayes' theorem, random variables, probability distributions, mean and variance, and the binomial distribution. Outcomes are assumed equally likely unless stated otherwise.
Conceptual Diagram: Sample Space for Coin Tosses (Tree Diagram Style)
Experiment: Toss three fair coins. Sample space S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}, each with P=1/8.
$$ S = \{ \text{HHH, HHT, HTH, THH, HTT, THT, TTH, TTT} \} $$
Event E: At least two heads = {HHH, HHT, HTH, THH}, P(E) = 4/8 = 1/2.
Visualize as a tree: First coin branches to H/T, second to H/T, third to H/T, counting paths for events.
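The tree enumeration above can be sketched in a few lines of Python; this is an illustrative check (not part of the NCERT text), using `itertools.product` to generate the eight equally likely paths:

```python
from itertools import product

# Enumerate the sample space for three tosses of a fair coin:
# each path through the tree is one outcome string like "HHT".
sample_space = ["".join(w) for w in product("HT", repeat=3)]  # 8 outcomes

# Event E: at least two heads.
E = [w for w in sample_space if w.count("H") >= 2]
p_E = len(E) / len(sample_space)  # 4/8 = 0.5
```

Counting paths in the tree is exactly counting elements of the enumerated list.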
Why This Guide Stands Out (Expanded for 2025 Exams)
Comprehensive coverage mirroring NCERT pages 406-450+: every subtopic treated point-wise with evidence (e.g., the coin-toss sample space), full worked examples (e.g., conditioning on a first tail), and derivations (e.g., P(E|F) = P(E∩F)/P(F)). Added 2025 relevance: probability in AI (Bayesian networks for prediction). Step-by-step processes for Bayes' theorem and the multiplication rule. Proforma: sample space → events → P(E|F) verification. Includes random variables and the binomial PMF.
13.2 Conditional Probability
Does the occurrence of one event affect the probability of another? The conditional probability P(E|F) is the probability of E given that F has occurred; conditioning on F effectively reduces the sample space to F.
Definition: If E, F events, P(F)≠0, then $$ P(E|F) = \frac{P(E \cap F)}{P(F)} $$
For equally likely outcomes, this reduces to counting: $$ P(E|F) = \frac{n(E \cap F)}{n(F)} $$
13.2.1 Properties of Conditional Probability
- Property 1: P(S|F)=P(F|F)=1 (certainty given F).
- Property 2: P((A∪B)|F)=P(A|F)+P(B|F)-P((A∩B)|F); if disjoint, sum.
- Property 3: P(E'|F)=1-P(E|F) (complement).
Derivation: Property 2 (Addition Rule Conditional)
Step 1: By distributivity, (A∪B)∩F = (A∩F)∪(B∩F). Step 2: Hence P((A∪B)|F) = P((A∩F)∪(B∩F))/P(F) = [P(A∩F) + P(B∩F) − P(A∩B∩F)]/P(F). Step 3: Dividing each term by P(F) gives P(A|F) + P(B|F) − P((A∩B)|F); if A and B are disjoint, the last term is 0. Verification: with F = S this reduces to the unconditional addition rule.
Example 1 (Integrated: Coin Toss Conditional)
Three coins: S as above. E: at least two heads, F: first toss shows a tail. E={HHH,HHT,HTH,THH}, F={THH,THT,TTH,TTT}, E∩F={THH}.
P(E)=4/8=1/2, P(F)=4/8=1/2, P(E∩F)=1/8. Thus P(E|F)=(1/8)/(1/2)=1/4.
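As a sanity check on Example 1 (an illustrative sketch, not part of the NCERT text), the definition P(E|F) = P(E∩F)/P(F) and the counting shortcut n(E∩F)/n(F) can be compared directly:

```python
from fractions import Fraction
from itertools import product

sample_space = ["".join(w) for w in product("HT", repeat=3)]
E = {w for w in sample_space if w.count("H") >= 2}  # at least two heads
F = {w for w in sample_space if w[0] == "T"}        # first toss shows a tail

n = len(sample_space)
# Definition: P(E|F) = P(E ∩ F) / P(F)
p_E_given_F = Fraction(len(E & F), n) / Fraction(len(F), n)
# Classical shortcut for equally likely outcomes: n(E ∩ F) / n(F)
assert p_E_given_F == Fraction(len(E & F), len(F))  # both give 1/4
```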
Example 2 (Integrated: Family Children)
Two children: S={(b,b),(g,b),(b,g),(g,g)}. E: Both boys={(b,b)}, F: ≥1 boy={(b,b),(g,b),(b,g)}.
P(F)=3/4, P(E∩F)=1/4, P(E|F)=1/3.
Example 3 (Integrated: Cards Even >3)
S={1..10}. A: Even={2,4,6,8,10}, B: >3={4..10}, A∩B={4,6,8,10}.
P(B)=7/10, P(A∩B)=4/10, P(A|B)=4/7.
Example 4 (Integrated: School Girls Class XII)
A school has 1000 students, of whom 430 are girls; 10% of the girls study in Class XII. E: the chosen student studies in Class XII, F: the chosen student is a girl. P(F)=0.43, P(E∩F)=0.043, so P(E|F)=0.043/0.43=0.1.
Example 5 (Integrated: Die Three Times A Given B)
A die is thrown three times. A: 4 appears on the third throw (36 of the 216 triples), B: 6 on the first throw and 5 on the second (6 triples). A∩B={(6,5,4)}, so n(A∩B)=1 and P(A|B)=(1/216)/(6/216)=1/6.
Example 6 (Integrated: Die Twice Sum 6, ≥1 Four)
A die is thrown twice. F: sum of the numbers is 6 (5 outcomes), E: 4 appears at least once (11 outcomes in all; E∩F has 2). P(E|F)=2/5.
Example 7 (Integrated: Coin then Die)
A coin is tossed; if it shows heads it is tossed again, and if tails a die is thrown. S={(H,H),(H,T),(T,1),...,(T,6)}; note the outcomes are not equally likely: P(H,H)=P(H,T)=1/4 and P(T,i)=1/12 for each die face. F: at least one tail (7 outcomes, P(F)=3/4), E: die shows a number >4 = {(T,5),(T,6)}, P(E∩F)=2/12=1/6, so P(E|F)=(1/6)/(3/4)=2/9.
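Because the outcomes here are not equally likely, counting alone does not work; a short sketch (illustrative, not part of the NCERT text) weights each outcome by its probability:

```python
from fractions import Fraction

# Outcomes with their (unequal) probabilities: a head is followed by a
# second coin toss, a tail by a die throw.
prob = {("H", "H"): Fraction(1, 4), ("H", "T"): Fraction(1, 4)}
for face in range(1, 7):
    prob[("T", face)] = Fraction(1, 12)

F = [w for w in prob if w == ("H", "T") or w[0] == "T"]  # at least one tail
E = [("T", 5), ("T", 6)]                                 # die shows > 4

p_F = sum(prob[w] for w in F)    # 3/4
p_EF = sum(prob[w] for w in E)   # E ⊂ F, so P(E ∩ F) = 1/6
p_E_given_F = p_EF / p_F         # (1/6) / (3/4) = 2/9
```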
Quick Table: Conditional Probability Examples (Expanded)
| Event E | Event F | P(E) | P(F) | P(E∩F) | P(E|F) |
|---|---|---|---|---|---|
| Coins ≥2H | First T | 1/2 | 1/2 | 1/8 | 1/4 |
| Children Both B | ≥1 B | 1/4 | 3/4 | 1/4 | 1/3 |
| Cards Even | >3 | 1/2 | 7/10 | 4/10 | 4/7 |
| Die 4 on 3rd | 6&5 first two | 1/6 | 1/36 | 1/216 | 1/6 |
| Die Twice ≥1 4 | Sum=6 | 11/36 | 5/36 | 2/36 | 2/5 |
| Coin-Die >4 | ≥1 T | 1/6 | 3/4 | 1/6 | 2/9 |
13.3 Multiplication Theorem on Probability
From the definition of conditional probability: P(E∩F)=P(E)P(F|E)=P(F)P(E|F), provided the conditioning event has nonzero probability. For independent events this reduces to P(E∩F)=P(E)P(F).
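A standard illustration of the multiplication rule (a hedged sketch with a classic deck-of-cards setup, not drawn from this chapter's examples): the probability of drawing two kings without replacement.

```python
from fractions import Fraction

# P(both kings) when drawing two cards without replacement:
# P(K1 ∩ K2) = P(K1) * P(K2 | K1)
p_K1 = Fraction(4, 52)           # 4 kings among 52 cards
p_K2_given_K1 = Fraction(3, 51)  # 3 kings left among 51 cards
p_both = p_K1 * p_K2_given_K1    # 1/221
```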
13.4 Independent Events
Events E and F are independent if P(E∩F)=P(E)P(F); equivalently, P(E|F)=P(E) when P(F)≠0.
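The definition can be checked mechanically; a minimal sketch (illustrative, with a single fair-die throw chosen for the example) verifies that "even number" and "multiple of 3" are independent:

```python
from fractions import Fraction

# One throw of a fair die: E = "even number", F = "multiple of 3".
die = set(range(1, 7))
E = {x for x in die if x % 2 == 0}   # {2, 4, 6}
F = {x for x in die if x % 3 == 0}   # {3, 6}

def p(A):
    return Fraction(len(A), 6)

# P(E ∩ F) = 1/6 = (1/2)(1/3) = P(E) P(F), so E and F are independent.
independent = p(E & F) == p(E) * p(F)
```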
13.5 Bayes' Theorem
$$ P(E_i \mid A) = \frac{P(E_i)\, P(A \mid E_i)}{\sum_{k=1}^{n} P(E_k)\, P(A \mid E_k)} $$ where \( \{E_1, \ldots, E_n\} \) is a partition of the sample space with each \( P(E_k) > 0 \).
Derivation: Bayes' (From Total Probability)
Step 1 (total probability): P(A)=∑_k P(E_k) P(A|E_k). Step 2: P(E_i|A)=P(E_i ∩ A)/P(A)=P(E_i) P(A|E_i)/P(A). Step 3: Substitute the total-probability expression for P(A). Verification: the posteriors P(E_i|A) sum to 1.
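The two derivation steps map directly onto code. A sketch with illustrative numbers (Bag I: 3 red, 4 black; Bag II: 5 red, 6 black; one bag chosen at random, one ball drawn, which turns out red):

```python
from fractions import Fraction

prior = {"I": Fraction(1, 2), "II": Fraction(1, 2)}      # P(E_i)
red_given = {"I": Fraction(3, 7), "II": Fraction(5, 11)}  # P(A | E_i)

# Step 1: total probability P(A) = sum of P(E_i) P(A|E_i).
p_red = sum(prior[b] * red_given[b] for b in prior)
# Step 2: Bayes' theorem for the posterior P(Bag I | red).
posterior_I = prior["I"] * red_given["I"] / p_red  # 33/68
```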
13.6 Random Variables and Probability Distributions
A random variable X is a real-valued function on the sample space S. Its probability distribution (PMF) assigns P(X=x_i)=p_i with each p_i ≥ 0 and ∑p_i=1.
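"Function on S" is concrete in code: each outcome is mapped to a number, and the PMF collects the probabilities. A sketch (illustrative, reusing the three-coin experiment) with X = number of heads:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# X = number of heads in three fair coin tosses: a real-valued
# function on the sample space S.
S = ["".join(w) for w in product("HT", repeat=3)]
counts = Counter(w.count("H") for w in S)
pmf = {x: Fraction(counts[x], len(S)) for x in sorted(counts)}
# pmf: {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}; the probabilities sum to 1.
```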
13.7 Mean and Variance
Mean μ=E(X)=∑ x_i p_i. Variance σ²= E(X²)-[E(X)]²=∑(x_i-μ)² p_i.
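Both variance formulas above agree, which a short sketch can confirm (illustrative, using X = the face value of one fair die throw):

```python
from fractions import Fraction

# PMF of a fair die: X = face value, each with probability 1/6.
xs = range(1, 7)
p = Fraction(1, 6)

mean = sum(x * p for x in xs)                       # E(X) = 7/2
var_shortcut = sum(x * x * p for x in xs) - mean**2          # E(X²) − [E(X)]²
var_direct = sum((x - mean) ** 2 * p for x in xs)            # Σ(x−μ)² p
assert var_shortcut == var_direct == Fraction(35, 12)
```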
13.8 Bernoulli Trials and Binomial Distribution
n independent trials, success p, X~Bin(n,p): P(X=k)= C(n,k) p^k (1-p)^{n-k}.
Mean np, Variance np(1-p).
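The PMF, mean, and variance formulas can all be verified numerically; a sketch (illustrative, assuming six tosses of a fair coin so n = 6, p = 1/2):

```python
from fractions import Fraction
from math import comb

n, p = 6, Fraction(1, 2)
# Binomial PMF: P(X = k) = C(n, k) p^k (1-p)^(n-k)
pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

assert sum(pmf) == 1
mean = sum(k * pmf[k] for k in range(n + 1))
var = sum(k * k * pmf[k] for k in range(n + 1)) - mean**2
assert mean == n * p            # np = 3
assert var == n * p * (1 - p)   # np(1-p) = 3/2
```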
Summary & Exercises Tease
Key takeaways: conditioning refines probabilities; Bayes' theorem updates beliefs; the binomial distribution models success counts. Exercises: conditional probability (13.1), independence and Bayes' theorem (13.2), random variables and the binomial distribution (13.3-13.8).


