⚠ Switch to EXCALIDRAW VIEW in the MORE OPTIONS menu of this document. ⚠ You can decompress Drawing data with the command palette: ‘Decompress current Excalidraw file’. For more info check in plugin settings under ‘Saving’
Excalidraw Data
Text Elements
Overfit Decision Tree
Pruned Decision Tree
Age > 30?
Yes
No
Income > 50K?
Clicks > 5?
Visits > 10?
Buy: 82%
Time > 2m?
No Buy: 91%
Buy: 51%
Buy: 53%
No Buy: 48%
No Buy: 52%
Age > 30?
Yes
No
Income > 50K?
Clicks > 5?
Buy: 52%
Buy: 82%
No Buy: 65%
No Buy: 91%
Pruning
❌ Problems with Overfit Tree:
• Deep branches with ~50% confidence
• Memorizes training noise
• Poor generalization to new data
• More leaf nodes (high complexity)
✓ Benefits of Pruned Tree:
• Removes low-confidence branches
• Better generalization
• More interpretable (fewer rules)
• Fewer leaf nodes (lower complexity)
Cost-Complexity Pruning Criterion: Rα(T) = R(T) + α|T|  (R(T): total error of the tree's leaves, |T|: number of leaves)
Higher α → More aggressive pruning → Simpler tree
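The criterion above can be sketched as a toy bottom-up pruner: a split is collapsed into a leaf whenever the single leaf is no worse than the subtree under Rα(T) = R(T) + α|T|. This is a minimal illustration, not scikit-learn's algorithm; the node names and error rates below are hypothetical, loosely echoing the diagram rather than quoting its numbers.

```python
# Minimal sketch of cost-complexity pruning: R_alpha(T) = R(T) + alpha * |T|.
# Node names and error rates are illustrative assumptions, not measured data.

class Node:
    def __init__(self, name, error, left=None, right=None):
        self.name = name
        self.error = error            # R(node): error if this node were a leaf
        self.left, self.right = left, right

def leaves(t):
    """|T|: number of leaves in the subtree rooted at t."""
    return 1 if t.left is None else leaves(t.left) + leaves(t.right)

def subtree_error(t):
    """R(T): total error summed over the subtree's leaves."""
    return t.error if t.left is None else subtree_error(t.left) + subtree_error(t.right)

def prune(t, alpha):
    """Bottom-up: collapse a split whenever a single leaf is no worse
    than keeping the subtree under the penalized criterion."""
    if t.left is None:
        return t
    t.left, t.right = prune(t.left, alpha), prune(t.right, alpha)
    keep_cost = subtree_error(t) + alpha * leaves(t)   # R(T) + alpha*|T|
    leaf_cost = t.error + alpha                        # one leaf: alpha * 1
    if leaf_cost <= keep_cost:
        t.left = t.right = None                        # replace subtree with a leaf
    return t

def make_tree():
    # Hypothetical overfit tree: the deep splits reduce error only slightly.
    return Node("Age > 30?", 0.80,
                Node("Income > 50K?", 0.30, Node("Buy", 0.09), Node("No Buy", 0.18)),
                Node("Clicks > 5?", 0.45, Node("Buy", 0.20), Node("No Buy", 0.22)))

print(leaves(prune(make_tree(), 0.0)))   # alpha = 0: nothing pruned -> 4 leaves
print(leaves(prune(make_tree(), 1.0)))   # large alpha: collapses to 1 leaf
```

Raising α makes the leaf-count penalty dominate, so more splits fail the keep/collapse comparison and the tree simplifies, which is exactly the "Higher α → More aggressive pruning" arrow above. (scikit-learn exposes the same idea via the `ccp_alpha` parameter of `DecisionTreeClassifier`.)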
Pruned branches
Pruned branches