Most neural-network components, written as separate functions, require learnable parameters (tf.get_variable). These parameters need to be initialized inside the function. The choice of initializer (uniform/normal/..) is typically architecture specific, but the functions/modules themselves are common to many different architectures.
I propose defining a decorator function @init in model/model_name.py, so that the shared functions can dynamically use different initializers when called.
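A minimal sketch of what this could look like (assuming the TF 1.x tf.get_variable / tf.variable_scope API; the @init name and the dense function are hypothetical, just for illustration):

```python
import functools
import tensorflow as tf  # TF 1.x API assumed


def init(initializer):
    """Hypothetical @init decorator: wraps a shared layer function in a
    variable scope whose default initializer is chosen per model, so the
    function body stays architecture agnostic."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            # tf.get_variable calls inside fn pick up this default
            # initializer unless they explicitly specify their own.
            with tf.variable_scope(None, default_name=fn.__name__,
                                   initializer=initializer):
                return fn(*args, **kwargs)
        return wrapper
    return decorator


# shared layer function, no initializer hard-coded
def dense(x, units):
    w = tf.get_variable("w", shape=[int(x.shape[-1]), units])
    b = tf.get_variable("b", shape=[units],
                        initializer=tf.zeros_initializer())
    return tf.matmul(x, w) + b


# in model/model_name.py: bind the model-specific initializer
dense = init(tf.glorot_uniform_initializer())(dense)
```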
Is this too much work? We could instead just pass the initializer as a parameter to each function.
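The simpler alternative would look like this (again a sketch; the dense function is illustrative):

```python
import tensorflow as tf  # TF 1.x API assumed


def dense(x, units, initializer=None):
    # caller passes the architecture-specific initializer explicitly;
    # None falls back to the enclosing scope's default
    w = tf.get_variable("w", shape=[int(x.shape[-1]), units],
                        initializer=initializer)
    b = tf.get_variable("b", shape=[units],
                        initializer=tf.zeros_initializer())
    return tf.matmul(x, w) + b


# call site picks the initializer
h = dense(x, 128, initializer=tf.random_normal_initializer(stddev=0.02))
```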
Thoughts?