
Testing Android and iOS apps on OSS CI using Nova reusable mobile workflow


With the advent of new tools like https://github.com/pytorch/executorch, it's now possible to run LLM inference locally on mobile devices using different models such as llama2. While it isn't hard to experiment with this new capability, try it out on your own devices, and see some results, it takes more effort to automate this process and make it part of the CI on various PyTorch-family repositories. To solve this challenge, the PyTorch Dev Infra team is launching a new Nova reusable mobile workflow that does the heavy lifting for you when it comes to testing your mobile apps.

With this new reusable workflow, developers can now:

  1. Utilize our mobile infrastructure built on top of AWS Device Farm. It offers a wide variety of popular Android and iOS devices from phones to tablets.
  2. Write and run tests remotely on those devices, just as you would run them locally on a connected phone (see the workflow sketch after this list).
  3. Go beyond the emulator to stress test and benchmark your local LLM inference solutions on actual devices. This helps accurately answer questions such as how many tokens per second the solution can process and how much memory and power it needs.
  4. Debug hard-to-reproduce issues on devices that you don't have.
  5. Gather the results and share them with others via the familiar GitHub CI UX.
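Below is a minimal sketch of what calling the reusable workflow from your repository's CI could look like. The reusable workflow path (`pytorch/test-infra/.github/workflows/mobile_job.yml`) and the input names (`device-type`, `project-arn`, `device-pool-arn`, `app-archive`, `test-spec`) are illustrative assumptions here, not the confirmed interface; check the workflow definition in pytorch/test-infra for the actual inputs.

```yaml
# Hypothetical caller workflow in your repository, e.g. .github/workflows/android-device-tests.yml.
# The reusable workflow path and all input names below are assumptions for illustration only;
# consult the Nova mobile workflow in pytorch/test-infra for the real interface.
name: Run Android tests on AWS Device Farm

on:
  pull_request:
  push:
    branches:
      - main

jobs:
  build-app:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Build your app APK and instrumentation test APK here, then make them
      # available (artifacts or URLs) for the device job to consume.

  run-on-devices:
    needs: build-app
    # Call the Nova reusable mobile workflow (path is an assumption)
    uses: pytorch/test-infra/.github/workflows/mobile_job.yml@main
    with:
      device-type: android                                   # or ios (assumed input name)
      project-arn: ${{ vars.AWS_DEVICE_FARM_PROJECT_ARN }}   # assumed input name
      device-pool-arn: ${{ vars.AWS_DEVICE_FARM_POOL_ARN }}  # assumed input name
      app-archive: my-app.apk                                # app built in build-app (assumed input name)
      test-spec: android-test-spec.yml                       # Device Farm test spec (assumed input name)
```

Because this is a standard GitHub Actions reusable workflow call, the results show up in your repository's checks tab like any other CI job, which is how the familiar GitHub CI UX mentioned above comes for free.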

Quick Start

TODO