14. **October 28**: Overview of Machine Learning
    - **SLIDES**: [Intro to ML](https://ubcecon.github.io/ECON622/lectures/lectures/intro_to_ml.html)
    - Finalize discussion of iterative methods and preconditioning
    - Introduce key concepts about supervised, unsupervised, semi-supervised, and reinforcement learning, kernel methods, deep learning, etc.
    - Basic introduction to JAX and Python frameworks
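The preconditioning idea above can be sketched in a few lines: a Richardson iteration for $Ax = b$ where each step applies the inverse of a Jacobi (diagonal) preconditioner $M = \mathrm{diag}(A)$ to the residual. This is a minimal pure-Python illustration, not course code; the 2x2 system and iteration count are made-up toy values.

```python
# Jacobi-preconditioned Richardson iteration for A x = b (illustrative sketch).
# The 2x2 system and the iteration count are made-up toy values.

def matvec(A, x):
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def jacobi_richardson(A, b, iters=50):
    # Preconditioner M = diag(A); each step cheaply applies M^{-1} to the residual.
    x = [0.0] * len(b)
    for _ in range(iters):
        r = [b_i - ax_i for b_i, ax_i in zip(b, matvec(A, x))]  # residual b - A x
        z = [r_i / A[i][i] for i, r_i in enumerate(r)]          # z = M^{-1} r
        x = [x_i + z_i for x_i, z_i in zip(x, z)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]  # diagonally dominant, so the iteration converges
b = [1.0, 2.0]
x = jacobi_richardson(A, b)
print(x)  # converges to the solution (1/11, 7/11)
```

A good preconditioner makes $M^{-1}A$ better conditioned than $A$ itself, so fewer iterations are needed; the diagonal choice here is the simplest example of that trade-off.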
15. **October 30**: Differentiable everything! JAX and Auto-Differentiation/JVP/etc.
    - **SLIDES**: Finish [Intro to ML](https://ubcecon.github.io/ECON622/lectures/lectures/intro_to_ml.html) and start [Differentiation](https://ubcecon.github.io/ECON622/lectures/lectures/differentiation.html)
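The JVP (Jacobian-vector product) machinery behind JAX's forward mode can be illustrated without JAX at all, using dual numbers: carry a `(value, tangent)` pair through every arithmetic operation and the tangent output is the directional derivative in one pass. A hedged pure-Python sketch; the polynomial `f` and evaluation point are arbitrary made-up examples.

```python
# Forward-mode autodiff via dual numbers: propagating (value, derivative)
# pairs through arithmetic computes a Jacobian-vector product in one pass.

class Dual:
    def __init__(self, val, tan=0.0):
        self.val, self.tan = val, tan  # primal value and tangent

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.tan + other.tan)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.tan * other.val + self.val * other.tan)

    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1  # illustrative polynomial

out = f(Dual(2.0, 1.0))  # seed tangent 1.0 gives the derivative at x = 2
print(out.val, out.tan)  # f(2) = 17.0, f'(2) = 14.0
```

Seeding the tangent with a vector `v` instead of `1.0` is exactly what `jax.jvp` generalizes to multivariate functions.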
17. **November 6**: Stochastic Optimization Methods and Machine Learning Pipelines
    - **SLIDES**: SGD variations in [Optimization](https://ubcecon.github.io/ECON622/lectures/lectures/optimization.html), W&B sweeps, and code in `lectures/lectures/examples`
    - SGD and methods for variance reduction in gradient estimates
    - Using SGD variants in practice within ML pipelines in JAX and PyTorch
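The core SGD loop is small enough to sketch directly: estimate the gradient of the empirical loss from a random minibatch, then take a step against it. This pure-Python toy (made-up synthetic data, learning rate, and batch size) fits a 1-D linear regression; in practice the same loop structure appears inside JAX and PyTorch pipelines.

```python
# Minibatch SGD on least squares: gradients are estimated from random
# minibatches rather than the full dataset. All hyperparameters are toy values.
import random

random.seed(0)
# synthetic data from y = 2x + 1 with a little noise
data = [(x / 50, 2 * (x / 50) + 1 + random.gauss(0, 0.01)) for x in range(50)]

w, b = 0.0, 0.0
lr, batch_size = 0.1, 8
for step in range(2000):
    batch = random.sample(data, batch_size)  # stochastic gradient estimate
    gw = sum(2 * (w * x + b - y) * x for x, y in batch) / batch_size
    gb = sum(2 * (w * x + b - y) for x, y in batch) / batch_size
    w -= lr * gw
    b -= lr * gb

print(w, b)  # close to the true parameters w = 2, b = 1
```

Increasing the batch size reduces the variance of the gradient estimate at higher per-step cost, which is the trade-off the variance-reduction methods in this lecture attack more cleverly.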
18. **November 18**: Machine Learning Pipelines, HPO, and ERM
    - **SLIDES**: Finished example code of pipelines in [Optimization](https://ubcecon.github.io/ECON622/lectures/lectures/optimization.html), W&B sweeps, and code in `lectures/lectures/examples`
    - **Readings**: [Probabilistic Machine Learning: An Introduction](https://probml.github.io/pml-book/book1.html) Section 5.4 on ERM
    - **SLIDES**: [Deep Learning and Representation Learning](https://ubcecon.github.io/ECON622/lectures/lectures/deep_learning.html) and started [Double-Descent and Regularization](https://ubcecon.github.io/ECON622/lectures/lectures/overparameterization.html)
    - **Readings**
      - [Probabilistic Machine Learning: An Introduction](https://probml.github.io/pml-book/book1.html) Section 13.2.1 to 13.2.6 on MLPs and the importance of depth
      - [Mark Schmidt's CPSC440 Notes on Double-Descent Curves](https://www.cs.ubc.ca/~schmidtm/Courses/440-W22/L7.pdf) (see [CPSC340](https://www.cs.ubc.ca/~schmidtm/Courses/340-F22/L32.pdf) lectures for a more basic treatment of these topics)
    - **Optional Extra Material**
      - [Probabilistic Machine Learning: Advanced Topics](https://probml.github.io/pml-book/book2.html) Section 32 on representation learning
19. **November 20**: Finish Double-Descent and Intro to Kernel Methods and Gaussian Processes
    - **SLIDES**: [Kernel Methods and Gaussian Processes](https://ubcecon.github.io/ECON622/lectures/lectures/kernel_methods.html) and finish [Double-Descent and Regularization](https://ubcecon.github.io/ECON622/lectures/lectures/overparameterization.html)
    - **Readings**
      - If you didn't do it already, read [Mark Schmidt's CPSC440 Notes on Double-Descent Curves and Overparameterization](https://www.cs.ubc.ca/~schmidtm/Courses/440-W22/L7.pdf) (see [CPSC340](https://www.cs.ubc.ca/~schmidtm/Courses/340-F22/L32.pdf) lectures for a more basic treatment of these topics)
      - [Probabilistic Machine Learning: Advanced Topics](https://probml.github.io/pml-book/book2.html) Section 18.1 to 18.3 on GPs and kernels
      - Researchers working in GPs love the online textbook [Gaussian Processes for Machine Learning](https://gaussianprocess.org/gpml/chapters/), so you may want to read the intro section on [GP Regression](https://gaussianprocess.org/gpml/chapters/RW2.pdf)
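The GP regression covered in those readings boils down to one formula: the posterior mean at a test point is $k_*^\top (K + \sigma^2 I)^{-1} y$, where $K$ is the kernel matrix on the training inputs. A minimal pure-Python sketch under made-up assumptions (RBF kernel with unit lengthscale, toy data from $\sin$, near-zero noise); real code would use a Cholesky factorization rather than this small Gaussian-elimination helper.

```python
# GP regression posterior mean: k_*^T (K + sigma^2 I)^{-1} y.
# Toy data, kernel hyperparameters, and noise level are made up.
import math

def rbf(a, b, lengthscale=1.0):
    return math.exp(-(a - b) ** 2 / (2 * lengthscale ** 2))

def solve(A, y):
    # Gaussian elimination with partial pivoting (small systems only)
    n = len(A)
    M = [row[:] + [y_i] for row, y_i in zip(A, y)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [m_r - f * m_c for m_r, m_c in zip(M[r], M[c])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

X = [0.0, 1.0, 2.0, 3.0]        # training inputs (toy)
y = [math.sin(x) for x in X]    # training targets
noise = 1e-6
K = [[rbf(a, b) + (noise if i == j else 0.0) for j, b in enumerate(X)]
     for i, a in enumerate(X)]
alpha = solve(K, y)             # alpha = (K + sigma^2 I)^{-1} y
x_star = 1.5
mean = sum(rbf(x_star, xi) * a for xi, a in zip(X, alpha))
print(mean)  # posterior mean at x_star, close to sin(1.5)
```

The same `alpha` vector gives predictions at any test point, which is why kernel methods pay a one-time $O(n^3)$ solve and then cheap per-prediction sums.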