Adit-Mugdha-das/Decision-Trees-and-Tree-Ensembles


Decision Trees

This assignment demonstrates how to use decision trees for classification and understand the foundational concepts behind ensemble methods such as random forests and boosted trees.
It is part of Week 4 (Course 2: Advanced Learning Algorithms) from the Machine Learning Specialization by Andrew Ng on Coursera.

Description

In this lab, I implemented and trained a decision tree classifier on real-world data, explored how decision boundaries are formed, and visualized decision paths. I also learned how ensemble techniques such as bagging and boosting build on simple tree models to improve prediction accuracy and robustness.
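A minimal sketch of that workflow using scikit-learn (one of the tools listed below). This is not the lab's actual code: the synthetic dataset, feature names, and hyperparameters here are stand-ins for the real data and exercises in the notebook.

```python
# Sketch: train a small decision tree classifier and print its decision rules.
# Synthetic data stands in for the lab's real-world dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=200, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Capping max_depth keeps the tree interpretable and guards against overfitting.
tree = DecisionTreeClassifier(max_depth=3, criterion="entropy", random_state=0)
tree.fit(X_train, y_train)

print(f"test accuracy: {tree.score(X_test, y_test):.2f}")
# export_text shows the learned decision path, feature by feature.
print(export_text(tree, feature_names=["x0", "x1"]))
```

Plotting the classifier's predictions over a grid of the two features would reproduce the kind of decision-boundary visualization the lab explores.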

Key Concepts Covered:

  • Decision tree construction and splitting criteria
  • Tree depth, overfitting, and pruning
  • Visualizing decision boundaries
  • Introduction to tree ensemble methods (Random Forest, Gradient Boosting)
  • Trade-offs between bias and variance in tree models
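The splitting criterion in the first bullet can be illustrated with entropy and information gain, the measures this course uses for choosing splits. A minimal sketch (the example labels and split indices are made up for illustration):

```python
# Entropy-based splitting criterion: information gain is the reduction in
# entropy obtained by splitting a set of binary labels into two children.
import numpy as np

def entropy(y):
    """Entropy (in bits) of a binary 0/1 label array."""
    if len(y) == 0:
        return 0.0
    p = np.mean(y)
    if p == 0 or p == 1:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def information_gain(y, left_idx, right_idx):
    """Parent entropy minus the weighted entropy of the two children."""
    n = len(y)
    w_left, w_right = len(left_idx) / n, len(right_idx) / n
    return entropy(y) - (w_left * entropy(y[left_idx]) +
                         w_right * entropy(y[right_idx]))

y = np.array([1, 1, 0, 0, 1, 0])
# A perfect split sends all 1s left and all 0s right, so the gain equals
# the parent entropy of 1.0 bit.
gain = information_gain(y, np.array([0, 1, 4]), np.array([2, 3, 5]))
print(gain)  # 1.0
```

A tree builder evaluates this gain for each candidate feature and threshold and picks the split with the highest value; stopping when the gain is small is one way to control depth and overfitting.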

Files Included

  • decision_tree_lab.ipynb: Main notebook with all exercises and visualizations
  • lab_utils_tree.py: Helper functions for plotting, metrics, and decision boundaries
  • data/: Dataset files used for training and evaluation

⚠️ This repository includes only my own implementation and adheres to Coursera’s Honor Code.

Tools Used

  • Python 3
  • NumPy
  • scikit-learn
  • Matplotlib
  • Jupyter Notebook

Course Info

This assignment is part of:

Machine Learning Specialization
Instructor: Andrew Ng
Course 2: Advanced Learning Algorithms
Week 4: Tree Ensembles – Decision Trees, Random Forests, Boosting

License

This repository is intended for educational and portfolio use only. Not for direct submission on Coursera.


Star the repo if you're building tree-based models or exploring ensemble learning!
