
[BUG] TFT: Target Included in Historic Encoder #1820


Closed
yesteryearer opened this issue Apr 23, 2025 · 4 comments
Labels
bug Something isn't working

Comments

@yesteryearer

I'm busy implementing a TFT for a use case where the target variable will be unavailable at inference time. At first I assumed that if the target is excluded from time_varying_unknown_reals, the past encoder doesn't consider it at all, but upon further reading it seems that the target variable is handled differently.

Is there any clean way to configure the TFT or TimeSeriesDataSet to explicitly ignore any notion of the past targets? In my use case specifically, we only want to predict one step into the future.

I also found this thread, from which I can't deduce anything conclusive.

@yesteryearer yesteryearer added the bug Something isn't working label Apr 23, 2025
@github-project-automation github-project-automation bot moved this to Needs triage & validation in Bugfixing - pytorch-forecasting Apr 23, 2025
@fkiraly
Collaborator

fkiraly commented Apr 30, 2025

FYI @jdb78, any insights?

@yesteryearer
Author

yesteryearer commented Apr 30, 2025

Hi @fkiraly, the conclusion I've come to through extensive testing is this: even though you are forced to supply non-NaN values for the target variable when instantiating the dataloader for inference, as long as the target is not explicitly listed under time_varying_unknown_reals, the model won't use it for inference. Simply padding it with zeros works fine.

It would be grand if I could get absolute confirmation of this, though.
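As a minimal sketch of the padding described above (hypothetical column names, plain pandas rather than the pytorch-forecasting API):

```python
import pandas as pd

# Hypothetical inference dataframe: the dataset still requires a
# non-NaN target column, but as long as the target is not listed in
# time_varying_unknown_reals its values are never read by the model,
# so zero-padding is safe.
future = pd.DataFrame({
    "group_id": ["a", "a", "a"],
    "time_idx": [10, 11, 12],
    "known_covariate": [0.3, 0.5, 0.7],
})
future["target"] = 0.0  # placeholder values only, never used for prediction
```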

@jdb78
Collaborator

jdb78 commented May 1, 2025

The target variable is generally not used in inference, only for training and evaluation. If you add it separately to the unknown reals, the model will use its historic values but apply a cutoff for inference.

There is one exception, and that is normalization. So make sure to use a prefitted target normalizer or the EncoderNormalizer. That happens automatically if you construct your test dataset from the training dataset with .from_dataset.
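The "prefitted normalizer" point can be illustrated generically (plain NumPy, not the pytorch-forecasting internals): the scaling parameters are estimated once on the training target and then reused at inference, rather than being refit on the new data.

```python
import numpy as np

# Fit normalization parameters ONCE, on the training target...
train_target = np.array([10.0, 12.0, 14.0, 16.0])
mu, sigma = train_target.mean(), train_target.std()

def normalize(x):
    """Apply the prefitted parameters; never refit on new data."""
    return (x - mu) / sigma

# ...and reuse those same statistics for new data at inference time.
test_target = np.array([18.0, 20.0])
scaled = normalize(test_target)
```

In pytorch-forecasting, constructing the test set from the training set via `TimeSeriesDataSet.from_dataset` is what carries the fitted target normalizer over automatically.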

@yesteryearer
Author

yesteryearer commented May 1, 2025

Hi @jdb78, thanks for the confirmation.

On your comment: if I use target normalisation, will this effectively be 'baked in' to the model, or would I need to supply the normalisation weights using historic data? For instance, if I'm doing inference with a model trained using GroupNormalizer, is the normalisation then associated with the specific group and baked in, or will it be determined from the new input data, assuming all group ids are known?

As a side note, funnily enough I've noticed that for my use case not using any normalisation actually results in better performance. Using these deep learning packages can be quite opaque, as not all functionality is clearly defined in the docs.
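For illustration only (a generic sketch, not GroupNormalizer's actual internals): group-wise normalisation being "baked in" means the parameters are stored per group id at fit time and merely looked up, not re-estimated, when new data for a known group arrives.

```python
import pandas as pd

# Per-group scaling parameters computed once from training data.
train = pd.DataFrame({
    "group_id": ["a", "a", "b", "b"],
    "target":   [10.0, 14.0, 100.0, 140.0],
})
params = train.groupby("group_id")["target"].agg(["mean", "std"])

def normalize(value, group):
    # Look up the stored training statistics for this group id;
    # nothing is refit on the inference data.
    mu, sigma = params.loc[group, "mean"], params.loc[group, "std"]
    return (value - mu) / sigma

z = normalize(16.0, "a")  # scaled with group "a"'s training statistics
```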

@github-project-automation github-project-automation bot moved this from Needs triage & validation to Fixed/resolved in Bugfixing - pytorch-forecasting May 2, 2025
Projects
Status: Fixed/resolved

3 participants