doc/pub/week35/html/._week35-bs002.html (3 additions & 4 deletions)
@@ -323,13 +323,12 @@
 <!-- !split -->
 <h2 id="reminder-from-last-week" class="anchor">Reminder from last week </h2>

-<p>We need first a reminder from last week about linear regression. </p>
+<p>We first need a reminder from last week about linear regression. We are going to fit a continuous function with a linear parameterization in terms of the parameters \( \boldsymbol{\theta} \), and our first encounter is ordinary least squares.</p>

-<p>Fitting a continuous function with linear parameterization in terms of the parameters \( \boldsymbol{\beta} \).</p>
 <ul>
-<li>Method of choice for fitting a continuous function!</li>
+<li>It is the method of choice for fitting a continuous function</li>
 <li> Gives an excellent introduction to central Machine Learning features with <b>understandable pedagogical</b> links to other methods like <b>Neural Networks</b>, <b>Support Vector Machines</b> etc</li>
-<li> Analytical expression for the fitting parameters \( \boldsymbol{\beta} \)</li>
+<li> Analytical expression for the fitting parameters \( \boldsymbol{\theta} \)</li>
 <li> Analytical expressions for statistical properties like mean values, variances, confidence intervals and more</li>
 <li> Analytical relation with probabilistic interpretations</li>
 <li> Easy to introduce basic concepts like bias-variance tradeoff, cross-validation, resampling and regularization techniques and many other ML topics</li>
 will treat \( y_i \) as our exact value for the response variable.
 </p>

-<p>In order to find the parameters \( \beta_i \) we will then minimize the spread of \( C(\boldsymbol{\beta}) \), that is we are going to solve the problem</p>
+<p>In order to find the parameters \( \theta_i \) we will then minimize the spread of \( C(\boldsymbol{\theta}) \), that is we are going to solve the problem</p>
-<p>we have five predictors/features. The first is the intercept \( \beta_0 \). The other terms are \( \beta_i \) with \( i=1,2,3,4 \). Furthermore we have \( n \) entries for each predictor. It means that our design matrix is an
+<p>we have five predictors/features. The first is the intercept \( \theta_0 \). The other terms are \( \theta_i \) with \( i=1,2,3,4 \). Furthermore we have \( n \) entries for each predictor. It means that our design matrix is an
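As an aside, the design matrix described in the changed paragraph above is easy to build concretely. The sketch below is illustrative and not part of the diff: the grid of \( x \) values, the number of points \( n \), and the polynomial features are assumptions chosen to match the five-predictor example (intercept \( \theta_0 \) plus \( \theta_1,\dots,\theta_4 \)).

```python
# Hypothetical sketch of the n x 5 design matrix from the text:
# five predictors per data point, the first column being the intercept.
import numpy as np

n = 100                          # number of data points (arbitrary choice)
x = np.linspace(0.0, 1.0, n)
# Columns: 1, x, x^2, x^3, x^4 -> intercept theta_0 plus theta_1..theta_4
X = np.column_stack([x**p for p in range(5)])
print(X.shape)  # (100, 5): n entries for each of the five predictors
```

The first column is all ones, so \( \theta_0 \) multiplies a constant feature; the remaining columns carry the monomial features.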
doc/pub/week35/html/._week35-bs018.html (1 addition & 1 deletion)
@@ -323,7 +323,7 @@
 <!-- !split -->
 <h2 id="own-code-for-ordinary-least-squares" class="anchor">Own code for Ordinary Least Squares </h2>

-<p>It is rather straightforward to implement the matrix inversion and obtain the parameters \( \boldsymbol{\beta} \). After having defined the matrix \( \boldsymbol{X} \) and the outputs \( \boldsymbol{y} \) we have </p>
+<p>It is rather straightforward to implement the matrix inversion and obtain the parameters \( \boldsymbol{\theta} \). After having defined the matrix \( \boldsymbol{X} \) and the outputs \( \boldsymbol{y} \) we have </p>
<!-- code=python (!bc pycod) typeset with pygments style "default" -->
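The Python block referenced by this comment is not included in the diff. A minimal, self-contained sketch of the matrix-inversion step the surrounding text describes could look as follows; the data, sample size, and true parameters are made up for illustration and are not taken from the lecture notes.

```python
# Hedged sketch: solve the OLS normal equations theta = (X^T X)^{-1} X^T y
# for a simple straight-line fit with synthetic, noisy data.
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = np.linspace(0.0, 1.0, n)
y = 2.0 + 3.0 * x + 0.01 * rng.standard_normal(n)  # noisy line, true theta = (2, 3)
X = np.column_stack([np.ones(n), x])               # intercept and slope columns

# pinv is a numerically safer stand-in for an explicit matrix inverse
theta = np.linalg.pinv(X.T @ X) @ X.T @ y
print(theta)  # should be close to [2, 3]
```

Using `np.linalg.pinv` (or `np.linalg.solve`) instead of `np.linalg.inv` avoids trouble when \( \boldsymbol{X}^T\boldsymbol{X} \) is ill-conditioned, which is common for polynomial design matrices.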