
Commit af974be

begin unsupervised learning
1 parent 900e245 commit af974be

2 files changed: +7 -2 lines changed

tree-models.md (+2 -2)

@@ -8,11 +8,11 @@
 ```
 1. For each predictor variable Xj:
    a. For each value sj of Xj:
-      i. Split the records in A wiht Xj values < sj as one partitino, amd the remaining records where Xj >= sj as another partition.
+      i. Split the records in A with Xj values < sj as one partition, and the remaining records where Xj >= sj as another partition.
       ii. Measure the homogeneity of classes within each partition of A.
    b. Select the value of sj that produces maximum within-partition homogeneity of class.
 2. Select the variable Xj and split value sj that produces maximum within-partition homogeneity of class.
-***Now comes the recursive part***
+**Now comes the recursive part**
 1. Initialize A with the entire data set.
 2. Apply the partitioning algorithm to split A into two subpartitions A1 and A2.
 3. Repeat step 2 on subpartitions A1 and A2.
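
A minimal Python sketch of the recursive partitioning steps in the hunk above: Gini impurity is assumed as the homogeneity measure and a depth limit as the stopping rule; the data values, predictor names, and helper functions are illustrative, not taken from the notes.

```python
# Sketch of recursive partitioning; Gini impurity is an assumed homogeneity measure.
from collections import Counter

def gini(labels):
    """Gini impurity: lower means a more homogeneous partition."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(records, labels, predictors):
    """Steps 1-2: find the (Xj, sj) pair giving the most homogeneous partitions."""
    best = None  # (score, predictor, split_value)
    for j in predictors:                       # 1. for each predictor Xj
        for s in {r[j] for r in records}:      # 1a. for each value sj of Xj
            left = [y for r, y in zip(records, labels) if r[j] < s]
            right = [y for r, y in zip(records, labels) if r[j] >= s]
            if not left or not right:
                continue
            n = len(labels)
            # 1a-ii. weighted impurity of the two partitions of A
            score = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
            if best is None or score < best[0]:
                best = (score, j, s)
    return best

def grow_tree(records, labels, predictors, depth=0, max_depth=3):
    """Recursive part: split A into A1 and A2, then repeat on each subpartition."""
    split = best_split(records, labels, predictors)
    if split is None or depth >= max_depth or gini(labels) == 0.0:
        return {"leaf": Counter(labels).most_common(1)[0][0]}
    _, j, s = split
    left_idx = [i for i, r in enumerate(records) if r[j] < s]
    right_idx = [i for i, r in enumerate(records) if r[j] >= s]
    return {
        "predictor": j,
        "split": s,
        "left": grow_tree([records[i] for i in left_idx], [labels[i] for i in left_idx],
                          predictors, depth + 1, max_depth),
        "right": grow_tree([records[i] for i in right_idx], [labels[i] for i in right_idx],
                           predictors, depth + 1, max_depth),
    }

# Tiny illustrative data set (hypothetical values).
records = [{"x1": 1.0, "x2": 5.0}, {"x1": 2.0, "x2": 1.0},
           {"x1": 3.0, "x2": 4.0}, {"x1": 4.0, "x2": 0.5}]
labels = ["a", "b", "a", "b"]
print(grow_tree(records, labels, ["x1", "x2"]))
```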

unsupervised-learning.md (+5)

@@ -0,0 +1,5 @@
+# Unsupervised Learning
+
+- Statistical methods that extract meaning from data without training a model on labeled data.
+- Unsupervised learning can be used to create predictive rules in the absence of a labeled response. The goal may be to reduce the dimension of the data to a more manageable set of variables. This reduced set can then be used as an input to a predictive model, such as regression or classification. Unsupervised learning can also be viewed as an extension of exploratory data analysis: the aim is to gain insight into a set of data and how the different variables relate to each other.
+- Unsupervised techniques allow you to sift through and analyze these variables and discover relationships.
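
A minimal sketch of the dimension-reduction workflow described in the second bullet above, assuming scikit-learn is available: an unsupervised step (PCA) reduces the predictors to a smaller set of components, which then serve as inputs to a classifier. The data set and number of components are arbitrary illustrative choices.

```python
# Unsupervised dimension reduction feeding a supervised model (assumed scikit-learn workflow).
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

X, y = load_iris(return_X_y=True)  # 4 original predictors

# Unsupervised step: reduce the 4 predictors to 2 principal components.
# Supervised step: use the reduced variables as inputs to a classifier.
model = make_pipeline(PCA(n_components=2), LogisticRegression(max_iter=200))
model.fit(X, y)
print("in-sample accuracy:", round(model.score(X, y), 3))
```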

0 commit comments
