Commit 194e049

drop mtry parameter for bonus example, adjust naming of learner according to #161
1 parent f08b386 commit 194e049

File tree

1 file changed (+9 -8 lines)


vignettes/getstarted.Rmd

Lines changed: 9 additions & 8 deletions
@@ -79,7 +79,7 @@ dml_data_bonus = DoubleMLData$new(df_bonus,
 print(dml_data_bonus)
 
 # matrix interface to DoubleMLData
-dml_data_sim = double_ml_data_from_matrix(X=X, y=y, d=d)
+dml_data_sim = double_ml_data_from_matrix(X = X, y = y, d = d)
 dml_data_sim
 ```
 
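The hunk above only adjusts spacing around `=`. For context, the matrix interface it touches can be sketched as follows; the simulated data and its dimensions are illustrative assumptions, not part of the commit:

```r
library(DoubleML)

# simulate a small partially linear setup (shapes are illustrative)
set.seed(3141)
n_obs = 100
n_vars = 20
X = matrix(rnorm(n_obs * n_vars), nrow = n_obs)
d = X[, 1] + rnorm(n_obs)
y = 0.5 * d + X[, 1] + rnorm(n_obs)

# wrap the raw matrices in a DoubleMLData object
dml_data_sim = double_ml_data_from_matrix(X = X, y = y, d = d)
```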
@@ -94,12 +94,12 @@ library(mlr3learners)
 # surpress messages from mlr3 package during fitting
 lgr::get_logger("mlr3")$set_threshold("warn")
 
-learner = lrn("regr.ranger", num.trees=500, mtry=floor(sqrt(n_vars)), max.depth=5, min.node.size=2)
-ml_g_bonus = learner$clone()
+learner = lrn("regr.ranger", num.trees = 500, max.depth = 5, min.node.size = 2)
+ml_l_bonus = learner$clone()
 ml_m_bonus = learner$clone()
 
 learner = lrn("regr.glmnet", lambda = sqrt(log(n_vars)/(n_obs)))
-ml_g_sim = learner$clone()
+ml_l_sim = learner$clone()
 ml_m_sim = learner$clone()
 ```
 
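A side note on the `$clone()` calls kept by this hunk: mlr3 learners are R6 objects, so plain assignment copies a reference, not the object. A minimal sketch (using `regr.rpart`, which ships with mlr3 core, purely for illustration):

```r
library(mlr3)

learner = lrn("regr.rpart")
copy  = learner          # same R6 object, shared state
clone = learner$clone()  # independent deep-ish copy

# changing the original is visible through `copy` but not `clone`
learner$param_set$values$maxdepth = 3
```

This is why each nuisance part (`ml_l_*`, `ml_m_*`) gets its own clone rather than the same learner object.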
@@ -111,9 +111,10 @@ When initializing the object for PLR models `DoubleMLPLR`, we can further set pa
 * The number of folds used for cross-fitting `n_folds` (defaults to `n_folds = 5`) as well as
 * the number of repetitions when applying repeated cross-fitting `n_rep` (defaults to `n_rep = 1`).
 
-Additionally, one can choose between the algorithms `"dml1"` and `"dml2"` via `dml_procedure` (defaults to `"dml2"`). Depending on the causal model, one can further choose between different Neyman-orthogonal score / moment functions. For the PLR model the default score is `"partialling out"`.
+Additionally, one can choose between the algorithms `"dml1"` and `"dml2"` via `dml_procedure` (defaults to `"dml2"`). Depending on the causal model, one can further choose between different Neyman-orthogonal score / moment functions. For the PLR model the default score is `"partialling out"`, i.e.,
+\begin{align}\begin{aligned}\psi(W; \theta, \eta) &:= [Y - \ell(X) - \theta (D - m(X))] [D - m(X)].\end{aligned}\end{align}
 
-The user guide provides details about the Sample-splitting, cross-fitting and repeated cross-fitting, the Double machine learning algorithms and the Score functions
+Note that with this score, we do not estimate $g_0(X)$ directly, but the conditional expectation of $Y$ given $X$, $\ell_0(X) = E[Y|X]$. The user guide provides details about the Sample-splitting, cross-fitting and repeated cross-fitting, the Double machine learning algorithms and the Score functions
 
 
 ## Estimate double/debiased machine learning models
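The options the added prose discusses can be combined at construction time. A sketch using the argument names mentioned in the vignette text above (the defaults shown are those the text states, not independently verified here):

```r
library(DoubleML)

# illustrative only: spell out the defaults discussed in the vignette,
# assuming `dml_data_bonus`, `ml_l_bonus` and `ml_m_bonus` from earlier chunks
obj = DoubleMLPLR$new(dml_data_bonus,
                      ml_l = ml_l_bonus,
                      ml_m = ml_m_bonus,
                      n_folds = 5,
                      n_rep = 1,
                      score = "partialling out",
                      dml_procedure = "dml2")
```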
@@ -122,11 +123,11 @@ We now initialize `DoubleMLPLR` objects for our examples using default parameter
 
 ```{r}
 set.seed(3141)
-obj_dml_plr_bonus = DoubleMLPLR$new(dml_data_bonus, ml_g=ml_g_bonus, ml_m=ml_m_bonus)
+obj_dml_plr_bonus = DoubleMLPLR$new(dml_data_bonus, ml_l = ml_l_bonus, ml_m = ml_m_bonus)
 obj_dml_plr_bonus$fit()
 print(obj_dml_plr_bonus)
 
-obj_dml_plr_sim = DoubleMLPLR$new(dml_data_sim, ml_g=ml_g_sim, ml_m=ml_m_sim)
+obj_dml_plr_sim = DoubleMLPLR$new(dml_data_sim, ml_l = ml_l_sim, ml_m = ml_m_sim)
 obj_dml_plr_sim$fit()
 print(obj_dml_plr_sim)
 ```
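After `$fit()`, the estimation results can be inspected through the fitted object's fields and methods. A sketch of the usual accessors in the DoubleML R API (not part of this commit):

```r
# assuming `obj_dml_plr_bonus` has been fitted as in the chunk above
obj_dml_plr_bonus$coef       # point estimate(s) of theta
obj_dml_plr_bonus$se         # corresponding standard error(s)
obj_dml_plr_bonus$summary()  # coefficient table with t-stats and p-values
obj_dml_plr_bonus$confint()  # confidence intervals
```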
