
Use decorator for initializer #1

@suriyadeepan

Description


Most of the components of neural networks, written as separate functions, require learnable parameters (tf.get_variable). These parameters need to be initialized inside the function. The initializers (uniform/normal/...) are typically architecture-specific, but the functions/modules themselves are common across different architectures.

I propose defining a decorator function @init in model/model_name.py, so that the functions can dynamically use different initializers when called.
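A minimal sketch of what such a decorator might look like — the names here (with_init, dense_weights) are hypothetical, and the TensorFlow calls are replaced with plain Python so the sketch is self-contained:

```python
import functools

def with_init(initializer):
    """Hypothetical @init-style decorator: injects an architecture-specific
    initializer into a shared component function as a keyword argument."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            # Only fill in the initializer if the caller didn't pass one.
            kwargs.setdefault("initializer", initializer)
            return fn(*args, **kwargs)
        return wrapper
    return decorate

def dense_weights(shape, initializer):
    # In TensorFlow this would be something like
    #   tf.get_variable("W", shape, initializer=initializer)
    # Here we call the initializer directly to keep the sketch runnable.
    return initializer(shape)

# In model/model_name.py, each architecture would pick its own initializer:
uniform_init = lambda shape: [[0.1] * shape[1] for _ in range(shape[0])]
uniform_dense = with_init(uniform_init)(dense_weights)

W = uniform_dense((2, 3))
```

The component function stays architecture-agnostic; only the decoration site in model/model_name.py decides which initializer is used.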

Is this too much work? We could instead just pass the initializer as a parameter to each function.
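For comparison, a sketch of the simpler alternative — an explicit (default) keyword argument, no decorator. The function name and default are illustrative, not from the repo:

```python
def embedding_table(vocab_size, dim, initializer=None):
    """Hypothetical component: takes the initializer as a plain parameter.
    Callers override it per architecture; a default covers the common case."""
    if initializer is None:
        # Placeholder default; in TF 1.x this might be
        # tf.random_uniform_initializer(), used via tf.get_variable(...).
        initializer = lambda rows, cols: [[0.0] * cols for _ in range(rows)]
    return initializer(vocab_size, dim)

table = embedding_table(3, 2)
```

This keeps the wiring explicit at every call site, at the cost of threading the initializer argument through each function signature.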

thoughts?
