add a seed argument to some adjustments #69

Open
topepo opened this issue Feb 19, 2025 · 0 comments

topepo commented Feb 19, 2025

While working with adjust_probability_calibration(), I get slightly different calibration results across tuning parameters:

# A tibble: 9 × 7
  threshold .metric     .estimator   mean     n std_err .config        
      <dbl> <chr>       <chr>       <dbl> <int>   <dbl> <chr>          
1       0   roc_auc     binary     0.712     10 0.0216  pre0_mod0_post1
2       0   sensitivity binary     1         10 0       pre0_mod0_post1
3       0   specificity binary     0         10 0       pre0_mod0_post1
4       0.5 roc_auc     binary     0.710     10 0.0212  pre0_mod0_post2
5       0.5 sensitivity binary     0.195     10 0.0213  pre0_mod0_post2
6       0.5 specificity binary     0.969     10 0.00612 pre0_mod0_post2
7       1   roc_auc     binary     0.710     10 0.0229  pre0_mod0_post3
8       1   sensitivity binary     0.0248    10 0.0184  pre0_mod0_post3
9       1   specificity binary     0.996     10 0.00297 pre0_mod0_post3

This isn't a big deal, but the inconsistent results could confuse users.

I suggest adding an argument seed = sample.int(10^4, 1) to adjustments that use random numbers. The default would be evaluated when the tailor is created, and, using withr, we can fix the random number stream when the adjustment is trained.
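A minimal sketch of the proposed pattern (the adjustment structure and the train_calibration() helper are hypothetical, not the actual tailor internals):

```r
# Hypothetical constructor: the seed default is evaluated eagerly, at the
# moment the adjustment is specified, so it is fixed for the tailor's lifetime.
adjust_probability_calibration <- function(tailor, method = "logistic",
                                           seed = sample.int(10^4, 1)) {
  adjustment <- list(method = method, seed = seed)
  tailor$adjustments <- c(tailor$adjustments, list(adjustment))
  tailor
}

# At training time, withr::with_seed() sets the RNG state for the duration of
# the fit (and restores it afterwards), so repeated fits of the same
# adjustment produce identical calibration results.
fit_adjustment <- function(adjustment, data) {
  withr::with_seed(
    adjustment$seed,
    train_calibration(data, method = adjustment$method)  # hypothetical helper
  )
}
```

Because the seed is captured once per adjustment rather than once per fit, every resample and every tuning candidate that shares a postprocessor would see the same calibration fit, removing the run-to-run jitter shown above.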
