doc/LectureNotes/_build/html/week48.html
Lines changed: 39 additions & 39 deletions
@@ -527,7 +527,7 @@ <h2> Contents </h2>
 <!-- dom:TITLE: Week 48: Gradient boosting and summary of course --><section class="tex2jax_ignore mathjax_ignore" id="week-48-gradient-boosting-and-summary-of-course">
 <h1>Week 48: Gradient boosting and summary of course<a class="headerlink" href="#week-48-gradient-boosting-and-summary-of-course" title="Link to this heading">#</a></h1>
 <p><strong>Morten Hjorth-Jensen</strong>, Department of Physics and Center for Computing in Science Education, University of Oslo, Norway</p>
-<p>Date: <strong>Nov 24, 2024</strong></p>
+<p>Date: <strong>Nov 25, 2024</strong></p>
 <p>Copyright 1999-2024, Morten Hjorth-Jensen. Released under CC Attribution-NonCommercial 4.0 license</p>
 <section id="overview-of-week-48">
 <h2>Overview of week 48<a class="headerlink" href="#overview-of-week-48" title="Link to this heading">#</a></h2>
@@ -542,13 +542,13 @@ <h2>Lecture Monday, November 25<a class="headerlink" href="#lecture-monday-novem
 </ol>
 <p>a. These lecture notes at <a class="github reference external" href="https://github.com/CompPhysics/MachineLearning/blob/master/doc/pub/week48/ipynb/week48.ipynb">CompPhysics/MachineLearning</a></p>
 <p>b. See also lecture notes from week 47 at <a class="github reference external" href="https://github.com/CompPhysics/MachineLearning/blob/master/doc/pub/week47/ipynb/week47.ipynb">CompPhysics/MachineLearning</a>. The lecture on Monday starts with a repetition on AdaBoost before we move over to gradient boosting with examples</p>
-<!-- o Video of lecture at <https://youtu.be/RIHzmLv05DA> -->
-<!-- o Whiteboard notes at <https://github.com/CompPhysics/MachineLearning/blob/master/doc/HandWrittenNotes/2024/NotesNovember25.pdf> -->
-<p>c. Video on Decision trees <a class="reference external" href="https://www.youtube.com/watch?v=RmajweUFKvM&amp;ab_channel=Simplilearn">https://www.youtube.com/watch?v=RmajweUFKvM&ab_channel=Simplilearn</a></p>
-<p>d. Video on boosting methods <a class="reference external" href="https://www.youtube.com/watch?v=wPqtzj5VZus&amp;ab_channel=H2O.ai">https://www.youtube.com/watch?v=wPqtzj5VZus&ab_channel=H2O.ai</a></p>
-<p>e. Video on AdaBoost <a class="reference external" href="https://www.youtube.com/watch?v=LsK-xG1cLYA">https://www.youtube.com/watch?v=LsK-xG1cLYA</a></p>
-<p>f. Video on Gradient boost, part 1, parts 2-4 follow thereafter <a class="reference external" href="https://www.youtube.com/watch?v=3CC4N4z3GJc">https://www.youtube.com/watch?v=3CC4N4z3GJc</a></p>
-<p>g. Decision Trees: Rashcka et al chapter 3 pages 86-98, and chapter 7 on Ensemble methods, Voting and Bagging and Gradient Boosting. See also lecture from STK-IN4300, lecture 7 at <a class="reference external" href="https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf">https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf</a>.</p>
+<p>c. Video of lecture at <a class="reference external" href="https://youtu.be/iTaRdAPQnDA">https://youtu.be/iTaRdAPQnDA</a></p>
+<p>d. Whiteboard notes at <a class="github reference external" href="https://github.com/CompPhysics/MachineLearning/blob/master/doc/HandWrittenNotes/2024/NotesNovember25.pdf">CompPhysics/MachineLearning</a></p>
+<p>e. Video on Decision trees <a class="reference external" href="https://www.youtube.com/watch?v=RmajweUFKvM&amp;ab_channel=Simplilearn">https://www.youtube.com/watch?v=RmajweUFKvM&ab_channel=Simplilearn</a></p>
+<p>f. Video on boosting methods <a class="reference external" href="https://www.youtube.com/watch?v=wPqtzj5VZus&amp;ab_channel=H2O.ai">https://www.youtube.com/watch?v=wPqtzj5VZus&ab_channel=H2O.ai</a></p>
+<p>g. Video on AdaBoost <a class="reference external" href="https://www.youtube.com/watch?v=LsK-xG1cLYA">https://www.youtube.com/watch?v=LsK-xG1cLYA</a></p>
+<p>h. Video on Gradient boost, part 1, parts 2-4 follow thereafter <a class="reference external" href="https://www.youtube.com/watch?v=3CC4N4z3GJc">https://www.youtube.com/watch?v=3CC4N4z3GJc</a></p>
+<p>i. Decision Trees: Rashcka et al chapter 3 pages 86-98, and chapter 7 on Ensemble methods, Voting and Bagging and Gradient Boosting. See also lecture from STK-IN4300, lecture 7 at <a class="reference external" href="https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf">https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf</a>.</p>
 </section>
 <section id="lab-sessions">
 <h2>Lab sessions<a class="headerlink" href="#lab-sessions" title="Link to this heading">#</a></h2>
@@ -651,17 +651,17 @@ <h2>Random Forests Compared with other Methods on the Cancer Data<a class="heade
 (143, 30)
 Test set accuracy Logistic Regression with scaled data: 0.96
 Test set accuracy SVM with scaled data: 0.96
-Test set accuracy with Decision Trees and scaled data: 0.91
+Test set accuracy with Decision Trees and scaled data: 0.92
 <div class="output stderr highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>/Users/mhjensen/miniforge3/envs/myenv/lib/python3.9/site-packages/sklearn/ensemble/_gb.py:424: DataConversionWarning: A column-vector y was passed when a 1d array was expected. Please change the shape of y to (n_samples, ), for example using ravel().