
Train AI for Hand-to-Mouth Detection

This project detects hand-to-mouth eating motions from accelerometer and gyroscope data, following these general steps:

  1. Collect Data: Record accelerometer and gyroscope data while performing hand-to-mouth eating actions as well as other non-eating activities.
  2. Preprocess Data: Annotate the data, segmenting it into eating and non-eating intervals. The program normalizes this data automatically.
  3. Train Model: An LSTM network, a variant of the Recurrent Neural Network (RNN), is trained on the labeled data.
  4. Model Evaluation: The model's performance is measured on a held-out test set.
  5. Deploy Model: Use the trained model to detect hand-to-mouth eating in real-time.
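Steps 1–2 above can be sketched roughly as follows. This is an illustrative example, not code from the notebook: the function names, window size, and step size are assumptions, and it slices a stream of 6-axis sensor samples (3-axis accelerometer + 3-axis gyroscope) into overlapping fixed-length windows and z-score normalizes each channel.

```python
import numpy as np

def segment_windows(samples, labels, window_size=50, step=25):
    """Slice a stream of (N, 6) accelerometer+gyroscope samples into
    overlapping fixed-length windows, labeling each window by majority vote.
    Window size and step are illustrative assumptions."""
    windows, window_labels = [], []
    for start in range(0, len(samples) - window_size + 1, step):
        windows.append(samples[start:start + window_size])
        # Majority label for the window: 1 = eating (hand-to-mouth), 0 = non-eating
        window_labels.append(int(labels[start:start + window_size].mean() >= 0.5))
    return np.array(windows), np.array(window_labels)

def normalize(windows):
    """Z-score normalize each sensor channel across the whole dataset."""
    mean = windows.mean(axis=(0, 1), keepdims=True)
    std = windows.std(axis=(0, 1), keepdims=True) + 1e-8
    return (windows - mean) / std

# Example: 200 samples of 6-axis motion data, first half labeled as eating
samples = np.random.randn(200, 6)
labels = np.concatenate([np.ones(100), np.zeros(100)])
X, y = segment_windows(samples, labels)
X = normalize(X)
print(X.shape)  # (7, 50, 6): 7 overlapping windows of 50 samples x 6 channels
```

The resulting `(windows, timesteps, channels)` array is the shape an LSTM layer expects as input.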

To see how it is implemented, run motion_sensor_data_lstm.ipynb in a GitHub Codespace or Google Colab.

To set up a GitHub Codespace, install the Python extension first, then run the following command in the terminal:

    python3 -m pip install tensorflow[and-cuda]
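With TensorFlow installed, an LSTM classifier for windowed sensor data might look like the sketch below. The layer sizes, dropout rate, and window shape are illustrative assumptions, not the notebook's actual architecture:

```python
import numpy as np
import tensorflow as tf

# Assumed window shape: 50 timesteps x 6 channels (3-axis accel + 3-axis gyro)
WINDOW_SIZE, N_CHANNELS = 50, 6

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW_SIZE, N_CHANNELS)),
    # LSTM summarizes each window's motion sequence into a single vector
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(32, activation="relu"),
    # Binary output: probability that the window is a hand-to-mouth motion
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Smoke test on random data standing in for real labeled sensor windows
X = np.random.randn(32, WINDOW_SIZE, N_CHANNELS).astype("float32")
y = np.random.randint(0, 2, size=(32,))
model.fit(X, y, epochs=1, verbose=0)
probs = model.predict(X, verbose=0)
print(probs.shape)  # (32, 1)
```

For real-time detection (step 5), the same `model.predict` call would run on each freshly captured window of sensor samples.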

Video demonstrating the training of the model

Video demonstrating the trained model detecting the hand-to-mouth motion in new data