Regularization and Ensembles

Week 05, Fall 2023

Summary

This week we will continue discussing the supervised learning regression task, but we will introduce extensions of methods that we have already seen. First we will look at using regularization to improve linear models. Specifically, we'll look at lasso, ridge, and elastic net regression. Next, we'll introduce the notion of an ensemble method. In particular, we'll use the decision trees we already know as the base learners for random forests and boosted models.
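The three regularized linear models named above differ mainly in their penalty: ridge shrinks coefficients toward zero, lasso can set some exactly to zero, and elastic net blends the two. A minimal sketch of fitting all three, assuming scikit-learn (the course's actual tooling may differ):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso, ElasticNet

# Synthetic regression data: only 5 of 20 features are informative
X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                       noise=10.0, random_state=42)

# Ridge shrinks all coefficients; lasso can zero some out entirely;
# elastic net combines the L1 and L2 penalties.
models = {
    "ridge": Ridge(alpha=1.0),
    "lasso": Lasso(alpha=1.0),
    "elastic net": ElasticNet(alpha=1.0, l1_ratio=0.5),
}

for name, model in models.items():
    model.fit(X, y)
    n_zero = int(np.sum(model.coef_ == 0))
    print(f"{name}: {n_zero} of {len(model.coef_)} coefficients are exactly zero")
```

Comparing the zero counts illustrates the "shrunken vs. sparse" distinction: lasso tends to zero out the uninformative features, while ridge keeps them all with small magnitudes.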

Learning Objectives

After completing this week, you are expected to be able to:

  • Understand how the ridge and lasso constraints lead to shrunken and sparse estimates, respectively.
  • Use ridge regression to perform regression.
  • Use lasso to perform regression.
  • Understand how averaging the predictions from many trees (for example, in a random forest) can improve model performance.
  • Use a random forest to perform regression.
  • Use boosting to perform regression.
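The last three objectives concern tree ensembles. A minimal sketch comparing a single decision tree against a random forest (averaging many trees) and a boosted model (fitting shallow trees sequentially), again assuming scikit-learn:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline: a single fully grown decision tree
tree = DecisionTreeRegressor(random_state=0).fit(X_train, y_train)

# Random forest: average the predictions of many decorrelated trees
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Boosting: fit shallow trees sequentially, each correcting the last
boost = GradientBoostingRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

for name, model in [("tree", tree), ("forest", forest), ("boosting", boost)]:
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name}: test MSE = {mse:.1f}")
```

On held-out data the ensembles typically show a lower test error than the single tree, which is the point of the "averaging many trees" objective above.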

Reading

Link                                     Source
Week 05 Concept Scribbles                Course Website
Week 05 Notebook [ Rendered Notebook ]   Course Website

Video

Head to ClassTranscribe to watch lecture recordings. They are arranged by date in the Lecture Capture Recordings playlist.

Assignments

Assignment            Deadline                 Credit
Lab 04 [ Template ]   Thursday, September 28   100%
Homework 04           Thursday, September 28   105%

Office Hours

Staff    Day        Time                  Location
David    Monday     11:00 AM - 12:00 PM   2328 Siebel Center
Lahari   Wednesday  4:00 PM - 5:00 PM     Siebel Center, Second Floor [ Queue ]
David    Wednesday  5:00 PM - 6:00 PM     Zoom
Eunice   Thursday   3:00 PM - 4:00 PM     Siebel Center, Second Floor [ Queue ]