
Commit df2f852

update
1 parent 04ddef2 commit df2f852

File tree

16 files changed: +220 −227 lines changed


doc/pub/week47/html/._week47-bs001.html

Lines changed: 0 additions & 1 deletion

@@ -367,7 +367,6 @@ <h2 id="plan-for-week-47" class="anchor">Plan for week 47 </h2>
 <li> Video on Gradient boost, part 1, parts 2-4 follow thereafter <a href="https://www.youtube.com/watch?v=3CC4N4z3GJc" target="_self"><tt>https://www.youtube.com/watch?v=3CC4N4z3GJc</tt></a></li>
 <li> Decision Trees: Rashcka et al chapter 3 pages 86-98, and chapter 7 on Ensemble methods, Voting and Bagging and Gradient Boosting. See also lecture from STK-IN4300, lecture 7 at <a href="https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf" target="_self"><tt>https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf</tt></a>.</li>
 </ol>
-<li> Decision Trees: Geron's chapter 6 covers decision trees while ensemble models, voting and bagging are discussed in chapter 7. See also lecture from <a href="https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf" target="_self">STK-IN4300, lecture 7</a>. Chapter 9.2 of Hastie et al contains also a good discussion.</li>
 </ol>
 </div>
 </div>
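
The reading material edited above concerns decision trees and the ensemble methods voting, bagging and gradient boosting. As a quick illustration of those three methods, here is a minimal scikit-learn sketch on its built-in breast-cancer data; none of this code is taken from the repository, and all parameter choices are illustrative:

# Minimal sketch of the ensemble methods named in the reading plan:
# bagging, voting and gradient boosting (scikit-learn, breast-cancer data).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier, VotingClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: many trees fitted to bootstrap samples of the training data
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=0)

# Voting: combine heterogeneous classifiers by majority (hard) vote
voting = VotingClassifier([("tree", DecisionTreeClassifier(max_depth=4)),
                           ("logreg", LogisticRegression(max_iter=5000))])

# Gradient boosting: trees added sequentially to correct the current errors
boosting = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, random_state=0)

for name, clf in [("bagging", bagging), ("voting", voting), ("boosting", boosting)]:
    clf.fit(X_train, y_train)
    print(f"{name}: test accuracy = {clf.score(X_test, y_test):.3f}")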

doc/pub/week47/html/week47-reveal.html

Lines changed: 0 additions & 1 deletion

@@ -227,7 +227,6 @@ <h2 id="plan-for-week-47">Plan for week 47 </h2>
 <p><li> Decision Trees: Rashcka et al chapter 3 pages 86-98, and chapter 7 on Ensemble methods, Voting and Bagging and Gradient Boosting. See also lecture from STK-IN4300, lecture 7 at <a href="https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf" target="_blank"><tt>https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf</tt></a>.</li>
 </ol>
 <p>
-<p><li> Decision Trees: Geron's chapter 6 covers decision trees while ensemble models, voting and bagging are discussed in chapter 7. See also lecture from <a href="https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf" target="_blank">STK-IN4300, lecture 7</a>. Chapter 9.2 of Hastie et al contains also a good discussion.</li>
 </ol>
 </div>
 </section>

doc/pub/week47/html/week47-solarized.html

Lines changed: 0 additions & 1 deletion

@@ -325,7 +325,6 @@ <h2 id="plan-for-week-47">Plan for week 47 </h2>
 <li> Video on Gradient boost, part 1, parts 2-4 follow thereafter <a href="https://www.youtube.com/watch?v=3CC4N4z3GJc" target="_blank"><tt>https://www.youtube.com/watch?v=3CC4N4z3GJc</tt></a></li>
 <li> Decision Trees: Rashcka et al chapter 3 pages 86-98, and chapter 7 on Ensemble methods, Voting and Bagging and Gradient Boosting. See also lecture from STK-IN4300, lecture 7 at <a href="https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf" target="_blank"><tt>https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf</tt></a>.</li>
 </ol>
-<li> Decision Trees: Geron's chapter 6 covers decision trees while ensemble models, voting and bagging are discussed in chapter 7. See also lecture from <a href="https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf" target="_blank">STK-IN4300, lecture 7</a>. Chapter 9.2 of Hastie et al contains also a good discussion.</li>
 </ol>
 </div>

doc/pub/week47/html/week47.html

Lines changed: 0 additions & 1 deletion

@@ -402,7 +402,6 @@ <h2 id="plan-for-week-47">Plan for week 47 </h2>
 <li> Video on Gradient boost, part 1, parts 2-4 follow thereafter <a href="https://www.youtube.com/watch?v=3CC4N4z3GJc" target="_blank"><tt>https://www.youtube.com/watch?v=3CC4N4z3GJc</tt></a></li>
 <li> Decision Trees: Rashcka et al chapter 3 pages 86-98, and chapter 7 on Ensemble methods, Voting and Bagging and Gradient Boosting. See also lecture from STK-IN4300, lecture 7 at <a href="https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf" target="_blank"><tt>https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf</tt></a>.</li>
 </ol>
-<li> Decision Trees: Geron's chapter 6 covers decision trees while ensemble models, voting and bagging are discussed in chapter 7. See also lecture from <a href="https://www.uio.no/studier/emner/matnat/math/STK-IN4300/h20/slides/lecture_7.pdf" target="_blank">STK-IN4300, lecture 7</a>. Chapter 9.2 of Hastie et al contains also a good discussion.</li>
 </ol>
 </div>

Lines changed: 29 additions & 29 deletions

@@ -1,57 +1,57 @@
 digraph Tree {
-node [shape=box, style="filled, rounded", color="black", fontname=helvetica] ;
-edge [fontname=helvetica] ;
-0 [label="worst perimeter <= 106.05\ngini = 0.465\nsamples = 426\nvalue = [[269, 157]\n[157, 269]]", fillcolor="#e5813908"] ;
-1 [label="worst concave points <= 0.159\ngini = 0.067\nsamples = 259\nvalue = [[250, 9]\n[9, 250]]", fillcolor="#e58139db"] ;
+node [shape=box, style="filled, rounded", color="black", fontname="helvetica"] ;
+edge [fontname="helvetica"] ;
+0 [label="worst perimeter <= 106.05\ngini = 0.465\nsamples = 426\nvalue = [[269, 157]\n[157, 269]]", fillcolor="#fefbf9"] ;
+1 [label="worst concave points <= 0.159\ngini = 0.067\nsamples = 259\nvalue = [[250, 9]\n[9, 250]]", fillcolor="#e99355"] ;
 0 -> 1 [labeldistance=2.5, labelangle=45, headlabel="True"] ;
-2 [label="worst concave points <= 0.135\ngini = 0.031\nsamples = 253\nvalue = [[249, 4]\n[4, 249]]", fillcolor="#e58139ee"] ;
+2 [label="worst concave points <= 0.135\ngini = 0.031\nsamples = 253\nvalue = [[249, 4]\n[4, 249]]", fillcolor="#e78946"] ;
 1 -> 2 ;
-3 [label="radius error <= 0.643\ngini = 0.008\nsamples = 242\nvalue = [[241, 1]\n[1, 241]]", fillcolor="#e58139fb"] ;
+3 [label="area error <= 48.975\ngini = 0.008\nsamples = 242\nvalue = [[241, 1]\n[1, 241]]", fillcolor="#e5833c"] ;
 2 -> 3 ;
-4 [label="gini = 0.0\nsamples = 239\nvalue = [[239, 0]\n[0, 239]]", fillcolor="#e58139ff"] ;
+4 [label="gini = 0.0\nsamples = 239\nvalue = [[239, 0]\n[0, 239]]", fillcolor="#e58139"] ;
 3 -> 4 ;
-5 [label="worst symmetry <= 0.208\ngini = 0.444\nsamples = 3\nvalue = [[2, 1]\n[1, 2]]", fillcolor="#e5813913"] ;
+5 [label="mean symmetry <= 0.166\ngini = 0.444\nsamples = 3\nvalue = [[2, 1]\n[1, 2]]", fillcolor="#fdf6f0"] ;
 3 -> 5 ;
-6 [label="gini = 0.0\nsamples = 1\nvalue = [[0, 1]\n[1, 0]]", fillcolor="#e58139ff"] ;
+6 [label="gini = 0.0\nsamples = 1\nvalue = [[0, 1]\n[1, 0]]", fillcolor="#e58139"] ;
 5 -> 6 ;
-7 [label="gini = 0.0\nsamples = 2\nvalue = [[2, 0]\n[0, 2]]", fillcolor="#e58139ff"] ;
+7 [label="gini = 0.0\nsamples = 2\nvalue = [[2, 0]\n[0, 2]]", fillcolor="#e58139"] ;
 5 -> 7 ;
-8 [label="worst texture <= 29.455\ngini = 0.397\nsamples = 11\nvalue = [[8, 3]\n[3, 8]]", fillcolor="#e581392c"] ;
+8 [label="mean texture <= 20.84\ngini = 0.397\nsamples = 11\nvalue = [[8, 3]\n[3, 8]]", fillcolor="#fae9dd"] ;
 2 -> 8 ;
-9 [label="gini = 0.0\nsamples = 8\nvalue = [[8, 0]\n[0, 8]]", fillcolor="#e58139ff"] ;
+9 [label="gini = 0.0\nsamples = 8\nvalue = [[8, 0]\n[0, 8]]", fillcolor="#e58139"] ;
 8 -> 9 ;
-10 [label="gini = 0.0\nsamples = 3\nvalue = [[0, 3]\n[3, 0]]", fillcolor="#e58139ff"] ;
+10 [label="gini = 0.0\nsamples = 3\nvalue = [[0, 3]\n[3, 0]]", fillcolor="#e58139"] ;
 8 -> 10 ;
-11 [label="mean texture <= 16.22\ngini = 0.278\nsamples = 6\nvalue = [[1, 5]\n[5, 1]]", fillcolor="#e581396b"] ;
+11 [label="worst texture <= 24.785\ngini = 0.278\nsamples = 6\nvalue = [[1, 5]\n[5, 1]]", fillcolor="#f4caac"] ;
 1 -> 11 ;
-12 [label="gini = 0.0\nsamples = 1\nvalue = [[1, 0]\n[0, 1]]", fillcolor="#e58139ff"] ;
+12 [label="gini = 0.0\nsamples = 1\nvalue = [[1, 0]\n[0, 1]]", fillcolor="#e58139"] ;
 11 -> 12 ;
-13 [label="gini = 0.0\nsamples = 5\nvalue = [[0, 5]\n[5, 0]]", fillcolor="#e58139ff"] ;
+13 [label="gini = 0.0\nsamples = 5\nvalue = [[0, 5]\n[5, 0]]", fillcolor="#e58139"] ;
 11 -> 13 ;
-14 [label="worst texture <= 20.645\ngini = 0.202\nsamples = 167\nvalue = [[19, 148]\n[148, 19]]", fillcolor="#e5813994"] ;
+14 [label="worst texture <= 20.645\ngini = 0.202\nsamples = 167\nvalue = [[19, 148]\n[148, 19]]", fillcolor="#f0b68c"] ;
 0 -> 14 [labeldistance=2.5, labelangle=-45, headlabel="False"] ;
-15 [label="worst radius <= 17.74\ngini = 0.375\nsamples = 16\nvalue = [[12, 4]\n[4, 12]]", fillcolor="#e5813938"] ;
+15 [label="worst concavity <= 0.318\ngini = 0.375\nsamples = 16\nvalue = [[12, 4]\n[4, 12]]", fillcolor="#f9e3d4"] ;
 14 -> 15 ;
-16 [label="gini = 0.0\nsamples = 11\nvalue = [[11, 0]\n[0, 11]]", fillcolor="#e58139ff"] ;
+16 [label="gini = 0.0\nsamples = 11\nvalue = [[11, 0]\n[0, 11]]", fillcolor="#e58139"] ;
 15 -> 16 ;
-17 [label="mean texture <= 13.745\ngini = 0.32\nsamples = 5\nvalue = [[1, 4]\n[4, 1]]", fillcolor="#e5813955"] ;
+17 [label="mean radius <= 15.06\ngini = 0.32\nsamples = 5\nvalue = [[1, 4]\n[4, 1]]", fillcolor="#f6d5bd"] ;
 15 -> 17 ;
-18 [label="gini = 0.0\nsamples = 1\nvalue = [[1, 0]\n[0, 1]]", fillcolor="#e58139ff"] ;
+18 [label="gini = 0.0\nsamples = 1\nvalue = [[1, 0]\n[0, 1]]", fillcolor="#e58139"] ;
 17 -> 18 ;
-19 [label="gini = 0.0\nsamples = 4\nvalue = [[0, 4]\n[4, 0]]", fillcolor="#e58139ff"] ;
+19 [label="gini = 0.0\nsamples = 4\nvalue = [[0, 4]\n[4, 0]]", fillcolor="#e58139"] ;
 17 -> 19 ;
-20 [label="mean concave points <= 0.049\ngini = 0.088\nsamples = 151\nvalue = [[7, 144]\n[144, 7]]", fillcolor="#e58139d0"] ;
+20 [label="mean concave points <= 0.049\ngini = 0.088\nsamples = 151\nvalue = [[7, 144]\n[144, 7]]", fillcolor="#ea985d"] ;
 14 -> 20 ;
-21 [label="concave points error <= 0.01\ngini = 0.48\nsamples = 15\nvalue = [[6, 9]\n[9, 6]]", fillcolor="#e5813900"] ;
+21 [label="concave points error <= 0.01\ngini = 0.48\nsamples = 15\nvalue = [[6, 9]\n[9, 6]]", fillcolor="#ffffff"] ;
 20 -> 21 ;
-22 [label="gini = 0.0\nsamples = 9\nvalue = [[0, 9]\n[9, 0]]", fillcolor="#e58139ff"] ;
+22 [label="gini = 0.0\nsamples = 9\nvalue = [[0, 9]\n[9, 0]]", fillcolor="#e58139"] ;
 21 -> 22 ;
-23 [label="gini = 0.0\nsamples = 6\nvalue = [[6, 0]\n[0, 6]]", fillcolor="#e58139ff"] ;
+23 [label="gini = 0.0\nsamples = 6\nvalue = [[6, 0]\n[0, 6]]", fillcolor="#e58139"] ;
 21 -> 23 ;
-24 [label="worst smoothness <= 0.096\ngini = 0.015\nsamples = 136\nvalue = [[1, 135]\n[135, 1]]", fillcolor="#e58139f7"] ;
+24 [label="mean smoothness <= 0.079\ngini = 0.015\nsamples = 136\nvalue = [[1, 135]\n[135, 1]]", fillcolor="#e6853f"] ;
 20 -> 24 ;
-25 [label="gini = 0.0\nsamples = 1\nvalue = [[1, 0]\n[0, 1]]", fillcolor="#e58139ff"] ;
+25 [label="gini = 0.0\nsamples = 1\nvalue = [[1, 0]\n[0, 1]]", fillcolor="#e58139"] ;
 24 -> 25 ;
-26 [label="gini = 0.0\nsamples = 135\nvalue = [[0, 135]\n[135, 0]]", fillcolor="#e58139ff"] ;
+26 [label="gini = 0.0\nsamples = 135\nvalue = [[0, 135]\n[135, 0]]", fillcolor="#e58139"] ;
 24 -> 26 ;
 }
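
The node labels in the diff above (features such as "worst perimeter" and "worst concave points", gini impurities, 426 training samples) look like Graphviz source exported from a scikit-learn decision tree fitted to the Wisconsin breast-cancer data. The repository's actual script, train/test split and tree settings are not shown in this commit, so the following is only a sketch of how a DOT file of this kind can be produced:

# Hypothetical sketch: producing a DOT file similar to the one in this diff.
# Dataset and feature names match sklearn's breast-cancer data; the split and
# hyperparameters are assumptions, not the repository's actual values.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_graphviz

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, stratify=data.target, random_state=42)

tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Writes Graphviz source with gini, samples and value per node,
# coloured by class purity (filled=True) and with rounded boxes.
export_graphviz(tree, out_file="cancer_tree.dot",   # placeholder file name
                feature_names=data.feature_names,
                class_names=list(data.target_names),
                rounded=True, filled=True)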
(Changed binary file, 26.1 KB; preview not shown.)
Lines changed: 7 additions & 7 deletions

@@ -1,13 +1,13 @@
 digraph Tree {
-node [shape=box, style="filled, rounded", color="black", fontname=helvetica] ;
-edge [fontname=helvetica] ;
-0 [label="X[7] <= 0.5\ngini = 0.48\nsamples = 15\nvalue = [4, 10, 1]", fillcolor="#39e5818b"] ;
-1 [label="X[1] <= 0.5\ngini = 0.408\nsamples = 14\nvalue = [4, 10, 0]", fillcolor="#39e58199"] ;
+node [shape=box, style="filled, rounded", color="black", fontname="helvetica"] ;
+edge [fontname="helvetica"] ;
+0 [label="x[13] <= 0.5\ngini = 0.48\nsamples = 15\nvalue = [4, 10, 1]", fillcolor="#93f1ba"] ;
+1 [label="x[1] <= 0.5\ngini = 0.408\nsamples = 14\nvalue = [4, 10, 0]", fillcolor="#88efb3"] ;
 0 -> 1 [labeldistance=2.5, labelangle=45, headlabel="True"] ;
-2 [label="gini = 0.48\nsamples = 10\nvalue = [4, 6, 0]", fillcolor="#39e58155"] ;
+2 [label="gini = 0.48\nsamples = 10\nvalue = [4, 6, 0]", fillcolor="#bdf6d5"] ;
 1 -> 2 ;
-3 [label="gini = 0.0\nsamples = 4\nvalue = [0, 4, 0]", fillcolor="#39e581ff"] ;
+3 [label="gini = 0.0\nsamples = 4\nvalue = [0, 4, 0]", fillcolor="#39e581"] ;
 1 -> 3 ;
-4 [label="gini = 0.0\nsamples = 1\nvalue = [0, 0, 1]", fillcolor="#8139e5ff"] ;
+4 [label="gini = 0.0\nsamples = 1\nvalue = [0, 0, 1]", fillcolor="#8139e5"] ;
 0 -> 4 [labeldistance=2.5, labelangle=-45, headlabel="False"] ;
 }
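
DOT sources like the two in this commit are typically rendered to images with Graphviz before being embedded in the slides. A small sketch of that rendering step, assuming the graphviz Python package is installed and using the placeholder file name tree.dot, could be:

# Hypothetical rendering step: convert a DOT file such as the ones above to PNG.
# Requires the graphviz Python package and the Graphviz binaries on the PATH.
import graphviz

src = graphviz.Source.from_file("tree.dot")      # "tree.dot" is a placeholder name
src.render("tree", format="png", cleanup=True)   # writes tree.png

# Equivalent command-line call:
#   dot -Tpng tree.dot -o tree.png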
(Three additional changed binary files: 350 Bytes, 1.39 KB, 152 Bytes; previews not shown.)
