
Is it not possible to load a transformer model on CPU only? #286

@Lianowar


What happened?

I need to run inference of BERT4Rec on a CPU-only instance, but I can't.
When I try to load the fitted model, I get a PyTorch error.

RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.

This occurs in __setstate__ at rectools/models/nn/transformers/base.py:596, and it's currently impossible to pass the map_location parameter when loading the model.
Is there a workaround for this?
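A possible workaround, until RecTools exposes a map_location option, is to wrap torch.load so that every checkpoint is mapped onto the CPU before the model is unpickled. This is a sketch, not an official RecTools API; the commented-out BERT4RecModel.load call at the end is only an illustration of where the patched loader would take effect.

```python
import functools
import io

import torch

# Keep a reference to the original loader so the patch can delegate to it.
_original_torch_load = torch.load


def _cpu_torch_load(*args, **kwargs):
    # Force map_location="cpu" unless the caller already set one,
    # so checkpoints saved on a CUDA device deserialize on a CPU-only host.
    kwargs.setdefault("map_location", torch.device("cpu"))
    return _original_torch_load(*args, **kwargs)


# Monkeypatch torch.load globally; any torch.load call made during
# unpickling (e.g. inside __setstate__) now maps storages to the CPU.
torch.load = _cpu_torch_load

# Hypothetical usage with a fitted RecTools model:
# model = BERT4RecModel.load("bert4rec_model")
```

The patch is global, so it is safest to apply it once at startup on the CPU-only instance (or restore `_original_torch_load` after loading).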

Expected behavior

No response

Relevant logs and/or screenshots

No response

Operating System

Ubuntu

Python Version

3.11

RecTools version

0.13.0

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working), good first issue (Good for newcomers)

    Milestone

    No milestone

    Development

    No branches or pull requests
