Model sizes #615
@christophemenager, indeed, downloading the model after app launch sounds like the best idea. In Private Mind, an app backed by React Native ExecuTorch, models are downloaded on user demand — please check it out: https://github.com/software-mansion-labs/private-mind (you can download the app on iOS and Android). I think this approach is currently the best one, as you can release resources by deleting a model once it is no longer used. Currently the biggest bottleneck is the RAM occupied by the model, not the size of the model file itself (though the two are strictly correlated). That's why there are many pre-exported models: to give you the ability to choose one that fits your targeted devices. If there are small models you have in mind, don't hesitate to file an issue with a request :) Sometimes it takes a while to export such models because, e.g., some operators are not implemented by ExecuTorch, but at least we are aware of them.
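The selection logic described above can be sketched roughly like this. This is a minimal illustration, not code from Private Mind or the React Native ExecuTorch API; the model catalogue, size/RAM figures, and helper names are all made up for the example:

```typescript
// Hypothetical catalogue of pre-exported model variants.
// Download sizes and peak-RAM figures are illustrative, not real measurements.
interface ModelVariant {
  name: string;
  downloadSizeMB: number; // size of the file the user downloads
  peakRamMB: number;      // approximate RAM occupied once the model is loaded
}

const catalogue: ModelVariant[] = [
  { name: "tiny-int8",  downloadSizeMB: 120,  peakRamMB: 300 },
  { name: "small-int8", downloadSizeMB: 600,  peakRamMB: 1200 },
  { name: "base-fp16",  downloadSizeMB: 2000, peakRamMB: 4000 },
];

// Pick the largest variant whose peak RAM fits the device's budget:
// RAM, not download size, is the binding constraint on mobile.
function pickModel(
  ramBudgetMB: number,
  models: ModelVariant[],
): ModelVariant | undefined {
  return models
    .filter((m) => m.peakRamMB <= ramBudgetMB)
    .sort((a, b) => b.peakRamMB - a.peakRamMB)[0];
}

console.log(pickModel(1500, catalogue)?.name); // "small-int8"
```

In a real app the chosen variant would then be downloaded on demand and deleted when no longer needed, as described above, rather than bundled into the app binary.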
This library sounds absolutely amazing, but there is something I don't understand: the size of the models.
Almost all compatible models listed here are larger than 100 MB, and many are above 1 GB.
How do you embed such models in your React Native app? Do you download the model afterwards, at app launch?
Is it realistic to make the user download a 1 GB or 2 GB model onto their phone?
Note: big up to the creators of this lib, I am just surprised not to see more small models that could easily be embedded in a mobile app :)