The indices tensor for each sentence is repeated along axis=0, so the tensor ends up laid out as follows (a toy illustration follows the list):

- Sentence 1 (1-D) occurs #epochs times
- Sentence 2 (1-D) occurs #epochs times
- and so on
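A toy illustration of that repeated layout, assuming a dense `[num_sentences, max_len]` indices tensor (the repo may actually use a ragged or padded layout, so this is only a sketch of the idea):

```python
import tensorflow as tf

# With repeats=num_epochs and axis=0, tf.repeat places num_epochs
# consecutive copies of each sentence row along axis 0.
indices = tf.constant([[1, 2, 3],
                       [4, 5, 6]])
print(tf.repeat(indices, repeats=3, axis=0))
# [[1 2 3]
#  [1 2 3]
#  [1 2 3]
#  [4 5 6]
#  [4 5 6]
#  [4 5 6]]
```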
Then the subsample() function is called on each 1-D element of the indices tensor inside

dataset = dataset.map(lambda indices, progress: (subsample(indices, keep_probs), progress))

Subsampling draws random values and compares them against keep_probs (the per-word keep probabilities), roughly as sketched below.
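Based on that description, subsample() presumably behaves roughly like the following sketch (this is an assumption about its behavior, not the repo's actual code; keep_probs is assumed to be indexed by word id):

```python
import tensorflow as tf

def subsample(indices, keep_probs):
    """Keep token indices[i] with probability keep_probs[indices[i]] (sketch only)."""
    probs = tf.gather(keep_probs, indices)         # per-token keep probability
    rand = tf.random.uniform(tf.shape(probs))      # fresh random draw on EVERY call
    return tf.boolean_mask(indices, rand < probs)  # so the result differs each time
```

Because the random draw happens inside the mapped function, it is re-executed for every repeated copy of a sentence, which is why each epoch ends up with a different subsampled version.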
The way this is implemented, each epoch ends up with a different subsampled version of each sentence's indices tensor, because the random subsampling is re-executed for every repeated copy.
Instead, the tf.repeat(indices) should somehow be done after subsampling each sentence, so that one subsampled sentence is repeated #epochs times rather than each epoch containing a different version of the same sentence. A rough sketch of that ordering is given below.
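A rough sketch of the proposed ordering, reusing the hypothetical subsample() above (names like sentences, keep_probs, and num_epochs are placeholders, not the repo's identifiers): subsample each sentence exactly once, then repeat the already-subsampled result #epochs times.

```python
import tensorflow as tf

num_epochs = 3                                     # placeholder value
keep_probs = tf.constant([0.1, 0.9, 0.5, 1.0])     # toy per-word keep probabilities
sentences = [tf.constant([1, 2, 0, 3, 2]),         # toy sentences as 1-D word-id tensors
             tf.constant([3, 1, 1])]

# Randomness is resolved exactly once per sentence here...
subsampled = [subsample(s, keep_probs) for s in sentences]

# ...so repeating afterwards gives every epoch an identical copy of each
# subsampled sentence (sentence 1 x num_epochs, sentence 2 x num_epochs, ...).
dataset = tf.data.Dataset.from_generator(
    lambda: (s.numpy() for s in subsampled for _ in range(num_epochs)),
    output_signature=tf.TensorSpec(shape=[None], dtype=tf.int32))
```

This trades memory for determinism: the subsampled sentences are materialized up front instead of being re-drawn lazily inside dataset.map on every repeated copy.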