
Could you tell me why LSTM_MAX_LENGTH is 1 in HM-16.5_TEST_LDP? #11

Open
Liam-zzZ opened this issue May 16, 2019 · 1 comment

Comments

@Liam-zzZ

Hi,
In train mode, LSTM_MAX_LENGTH is 20, but in HM-16.5_TEST_LDP, LSTM_MAX_LENGTH is 1. Could you tell me why LSTM_MAX_LENGTH is different?

@tianyili2017
Owner

Hi,
Sorry for the late reply.
In train mode, the residual CTUs on all 20 time steps are fed into the LSTM structure together, at one time. So, LSTM_MAX_LENGTH = 20.

However, in test mode, the frames are encoded one by one; that is to say, the residual CTUs for the next frame only become available once the current frame has been encoded. The 20 residual CTUs from 20 frames cannot be obtained together at once, so the LSTM has to be run step by step. On each step, only one LSTM cell needs to be run forward: its input contains one residual CTU and the LSTM state from the last frame, and its output contains the splitting probabilities of the 21 CUs (the HCPM in the paper) and the updated LSTM state for the current frame. This procedure explains why LSTM_MAX_LENGTH = 1 in test mode.
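To illustrate the idea (this is a hedged sketch, not the repository's actual code): a minimal NumPy LSTM cell, showing that unrolling a length-20 sequence in one call (the train-mode view, LSTM_MAX_LENGTH = 20) and stepping one frame at a time while carrying the hidden/cell state forward externally (the test-mode view, LSTM_MAX_LENGTH = 1 per call) produce identical outputs. The dimensions and weight layout here are illustrative assumptions.

```python
# Hypothetical sketch: a single-cell NumPy LSTM demonstrating that a full
# 20-step unrolled pass equals 20 single-step passes with carried state.
import numpy as np

rng = np.random.default_rng(0)
INPUT_DIM, HIDDEN_DIM, SEQ_LEN = 8, 4, 20  # illustrative sizes, not the repo's

# One shared set of LSTM weights: [input, hidden] -> 4 gates (i, f, g, o).
W = rng.standard_normal((INPUT_DIM + HIDDEN_DIM, 4 * HIDDEN_DIM)) * 0.1
b = np.zeros(4 * HIDDEN_DIM)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev):
    """Run one LSTM cell forward for a single time step."""
    z = np.concatenate([x_t, h_prev]) @ W + b
    i, f, g, o = np.split(z, 4)
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

# Stand-in for 20 residual-CTU feature vectors from 20 consecutive frames.
seq = rng.standard_normal((SEQ_LEN, INPUT_DIM))

def run_full_sequence(seq):
    """Train-mode view: the whole sequence is consumed in one call."""
    h = np.zeros(HIDDEN_DIM)
    c = np.zeros(HIDDEN_DIM)
    outs = []
    for x_t in seq:  # unrolled over all 20 steps inside the call
        h, c = lstm_step(x_t, h, c)
        outs.append(h)
    return np.stack(outs)

# Test-mode view: one step per encoded frame, state carried by the caller.
h = np.zeros(HIDDEN_DIM)
c = np.zeros(HIDDEN_DIM)
step_outs = []
for x_t in seq:                    # each iteration = one newly encoded frame
    h, c = lstm_step(x_t, h, c)    # a single-step call (LSTM_MAX_LENGTH = 1)
    step_outs.append(h)
step_outs = np.stack(step_outs)

print(np.allclose(run_full_sequence(seq), step_outs))  # True
```

Because the weights and the state-update rule are identical, the only practical difference is who stores the recurrent state between steps: the unrolled graph (train) or the caller across frames (test).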
