Conversation
Rather than issues, I'd be looking for feedback on my implementation :)
Hey. Nice you got it wrapped up. I have two comments on this.
Regarding 1., I agree, but I am unsure about the way forward. Technically, I reused as much as I could from the Trainer class; most of the duplicate code comes directly from the train script. Maybe the Trainer class could have a setup(settings) method, though, which would further reduce duplication. As for 2., I already handle nested settings using a path ("lr/patience"), but I did not include an example. I'll look at your implementation for this.
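For context, the "lr/patience" style of addressing can be sketched as a small helper that walks a "/"-delimited path into a nested settings dict. This is an illustrative sketch, not the actual pie implementation; `set_nested` is a hypothetical name.

```python
def set_nested(settings: dict, path: str, value):
    """Walk the '/'-separated path and set the leaf key to `value`.

    Hypothetical helper illustrating 'lr/patience'-style addressing;
    not the actual pie API.
    """
    *parents, leaf = path.split("/")
    node = settings
    for key in parents:
        # create intermediate dicts as needed
        node = node.setdefault(key, {})
    node[leaf] = value
    return settings

settings = {"lr": {"patience": 2, "factor": 0.5}}
set_nested(settings, "lr/patience", 5)
```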
I think something along the following lines would be neat: trainer, trainset, devset, encoder, models = Trainer.setup(settings)
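A rough sketch of what that classmethod could look like; the helper names (`load_dataset`, `build_encoder`, `build_models`) are illustrative stand-ins for whatever the train script currently wires up by hand, not the actual pie API.

```python
# Illustrative stand-ins for pie's readers / encoder / model builders.
def load_dataset(settings, split):
    return {"split": split, "path": settings[f"{split}_path"]}

def build_encoder(settings, trainset):
    return {"fitted_on": trainset["split"]}

def build_models(settings, encoder):
    return [{"encoder": encoder}]

class Trainer:
    def __init__(self, settings, trainset, models):
        self.settings, self.trainset, self.models = settings, trainset, models

    @classmethod
    def setup(cls, settings):
        """One call that builds everything the train script assembles by hand."""
        trainset = load_dataset(settings, "train")
        devset = load_dataset(settings, "dev")
        encoder = build_encoder(settings, trainset)
        models = build_models(settings, encoder)
        trainer = cls(settings, trainset, models)
        return trainer, trainset, devset, encoder, models

trainer, trainset, devset, encoder, models = Trainer.setup(
    {"train_path": "train.tsv", "dev_path": "dev.tsv"})
```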
I reworked the API to go towards a more unified training API. If you agree with it, I'll apply it to other scripts. |
pie/default_optuna.json
// {
//   "path": "cemb_dim",
//   "type": "suggest_int",
//   "args": [200, 600]
Doesn't optuna allow defining sampling distributions for the parameters?
The way I designed it, you can basically use any optuna distribution function; it just makes things simpler since you don't have to "code" it.
I have not yet implemented the GridSampler and the like. That should probably be the next target: https://optuna.readthedocs.io/en/stable/reference/samplers.html?highlight=suggest#optuna.samplers.GridSampler
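The idea of "any optuna distribution function without coding it" can be sketched as a name-based dispatch: the JSON entry's "type" field is resolved to a method on the trial via getattr. `FakeTrial` and `sample_param` below are illustrative stand-ins (FakeTrial mimics the shape of optuna's `Trial.suggest_*` methods so the example is self-contained); this is not the actual tune.py code.

```python
import random

class FakeTrial:
    """Minimal stand-in for optuna.Trial, for illustration only."""
    def suggest_int(self, name, low, high):
        return random.randint(low, high)
    def suggest_categorical(self, name, choices):
        return random.choice(choices)

def sample_param(trial, entry):
    """Resolve entry['type'] to a method on the trial and call it
    with entry['args'], so any suggest_* function works unmodified."""
    fn = getattr(trial, entry["type"])
    return entry["path"], fn(entry["path"], *entry["args"])

entry = {"path": "cemb_dim", "type": "suggest_int", "args": [200, 600]}
path, value = sample_param(FakeTrial(), entry)
```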
Actually, better to use "devices", because it wouldn't have to be a CUDA device after all.
On Sat, Jun 27, 2020 at 5:44 PM Thibault Clérice wrote:
In pie/scripts/tune.py:
> +def create_tuna_optimization(trial: optuna.Trial, fn: str, name: str, value: List[Any]):
+ """ Generate tuna value generator
+
+ Might use self one day, so...
+
+ :param trial:
+ :param fn:
+ :param name:
+ :param value:
+ :return:
+ """
+ return getattr(trial, fn)(name, *value)
+
+
+class Optimizer(object):
+ def __init__(self, settings, optimization_settings: List[Dict[str, Any]], gpus: List[int] = []):
I can definitely use cudas (plural).
--
Enrique Manjavacas
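The rename discussed above could look like the following sketch: accept generic device strings ("cpu", "cuda:0", ...) instead of GPU indices, and avoid the mutable default argument from the quoted diff (a shared `[]` default is a common Python pitfall). The names and signature here are assumptions, not the actual pie API.

```python
from typing import Any, Dict, List, Optional

class Optimizer:
    """Illustrative sketch only; not the actual pie Optimizer."""
    def __init__(self, settings: Any,
                 optimization_settings: List[Dict[str, Any]],
                 devices: Optional[List[str]] = None):
        # None instead of [] so the default list is not shared
        # across instances; fall back to CPU when nothing is given.
        self.settings = settings
        self.optimization_settings = optimization_settings
        self.devices = list(devices) if devices else ["cpu"]

opt = Optimizer(settings={}, optimization_settings=[],
                devices=["cuda:0", "cuda:1"])
```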
Thanks for the review ;)
Ok, this should be ready for another look. I addressed all your concerns (I think). This should be neat :)
… reset settings: Trial would not reset the Model settings used for model training while still thinking it did, hence training indefinitely on the same first params
…ining when not using optuna
All in all, I think this one is ready
It's a work in progress for now; I'll still need to implement multi-GPU support, but I don't have access to multiple GPUs right now.