Mindbot is an experimental prototype that integrates brain-computer interface (BCI) technology with an RC car, enabling control through EEG signals. The project combines an EEG headband, an ESP32-controlled RC car, a camera stream, and a VR headset for immersive control and feedback.
- Brain-Controlled RC Car: Use EEG signals from a Muse 2 headband to control the movement of an RC car.
- Real-Time Camera Stream: View the car's perspective through a live video feed streamed via WebSocket.
- VR Integration: Enhance the experience with a VR headset for immersive control.
- Modular Design: The project is split into multiple components, allowing for flexibility and future expansion.
- Muse 2 EEG headband
- RC car with an ESP32 microcontroller
- ESP32-CAM module for video streaming
- VR headset (optional)
- Python 3.8+
- Arduino IDE for ESP32 programming
- Required Python libraries (see below)
- Open the ESPcar.ino file in the Arduino IDE.
- Set your WiFi credentials in the `ssid` and `password` variables.
- Upload the code to your ESP32.
- Connect the ESP32 to the RC car's motor driver.
- Open the ESPstream.ino file in the Arduino IDE.
- Set your WiFi credentials in the `ssid` and `password` variables.
- Upload the code to your ESP32-CAM.
- Ensure the ESP32-CAM is powered and connected to the same network.
Run the following command to install the required Python libraries:
pip install -r requirements.txt
- Connect your Muse 2 EEG headband.
- In a terminal, run:
muselsl stream
- (Optional) To visualize the EEG data, run in another terminal:
muselsl view --version 2
- Open the BCI_predict.py file and set the `ESP32_IP` variable to the IP address of your ESP32 controlling the RC car.
- Run the script:
python BCI_predict.py
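As a rough illustration of the preprocessing a script like BCI_predict.py has to perform, the sketch below buffers streamed samples into fixed-size windows shaped for an EEGNet-style classifier. The 4-channel, 256 Hz layout matches the Muse 2 headband, but `EEGWindowBuffer`, the window length, and the z-scoring step are hypothetical choices, not code from this repository:

```python
from collections import deque

import numpy as np

# The Muse 2 exposes 4 EEG channels (TP9, AF7, AF8, TP10) at 256 Hz.
N_CHANNELS = 4
SAMPLE_RATE = 256
WINDOW_SAMPLES = SAMPLE_RATE  # one-second windows (assumed, not from the repo)

class EEGWindowBuffer:
    """Collect per-sample EEG readings into (channels, samples) windows."""

    def __init__(self, n_channels=N_CHANNELS, window=WINDOW_SAMPLES):
        self.n_channels = n_channels
        self.window = window
        self.samples = deque(maxlen=window)

    def push(self, sample):
        """Append one reading of shape (n_channels,)."""
        self.samples.append(np.asarray(sample, dtype=np.float32))

    def ready(self):
        """True once a full window of samples has accumulated."""
        return len(self.samples) == self.window

    def window_array(self):
        """Return a per-channel z-scored (n_channels, window) array."""
        x = np.stack(self.samples, axis=1)  # (channels, samples)
        mean = x.mean(axis=1, keepdims=True)
        std = x.std(axis=1, keepdims=True) + 1e-8  # avoid divide-by-zero
        return (x - mean) / std

# Feed synthetic samples to show the shapes involved.
buf = EEGWindowBuffer()
for _ in range(WINDOW_SAMPLES):
    buf.push(np.random.randn(N_CHANNELS))
print(buf.window_array().shape)  # (4, 256)
```

In a live setup the samples would come from the `muselsl` LSL stream instead of `np.random.randn`, and each completed window would be handed to the trained EEGNet model.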
- ESP32 RC Car Control: The ESP32 serves as a WebSocket server, receiving commands (`forward`, `reverse`, `left`, `right`, `stop`) to control the car's motors.
- ESP32-CAM Video Stream: The ESP32-CAM streams live video to a web interface, providing a first-person view of the car's surroundings.
- EEG Signal Processing: The Muse 2 headband streams EEG data, which is processed in real time by a trained neural network (EEGNet) to predict user intentions.
- Command Transmission: Predicted commands are sent to the ESP32 controlling the RC car.
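The command-transmission step above can be sketched as: map each classifier output index to one of the five command strings the ESP32 expects, then push it over the WebSocket connection. The class ordering, the endpoint path, and the port are assumptions (check them against ESPcar.ino), and the sender assumes the third-party `websocket-client` package:

```python
# Hypothetical mapping from EEGNet class indices to the five commands
# the ESP32 WebSocket server understands; the ordering is an assumption.
COMMANDS = ("forward", "reverse", "left", "right", "stop")

def class_to_command(class_index):
    """Translate a classifier output index into an ESP32 command string."""
    if not 0 <= class_index < len(COMMANDS):
        return "stop"  # fail safe: unknown predictions halt the car
    return COMMANDS[class_index]

def send_command(esp32_ip, command, port=80):
    """Send one command to the car. Assumes the websocket-client package
    and a WebSocket server at ws://<ip>:<port>/ (adjust to ESPcar.ino)."""
    from websocket import create_connection  # lazy import: needs hardware
    ws = create_connection(f"ws://{esp32_ip}:{port}/")
    try:
        ws.send(command)
    finally:
        ws.close()

print(class_to_command(0))  # forward
print(class_to_command(9))  # stop
```

Falling back to `stop` on out-of-range predictions keeps the car stationary when the classifier output is malformed rather than repeating the last motion.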
- Power on the RC car and ESP32-CAM.
- Start the EEG stream using `muselsl`.
- Launch the BCI_predict.py script to process EEG data and control the car.
- Open the ESP32-CAM's web interface in a browser to view the live video feed.
- (Optional) Use a VR headset for an immersive experience.
- Prototype State: The project is in an unfinished prototype stage and may require manual adjustments.
- Multiple Terminals: Requires 2-3 terminals to run different components.
- Limited Accuracy: The EEG-based control relies on a trained neural network and may require further tuning for better accuracy.
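One low-effort mitigation for the accuracy limitation is to smooth the classifier output before sending commands, for example a majority vote over the last few predictions so that a single misclassification does not jerk the car. This `PredictionSmoother` helper is a sketch of the idea, not part of BCI_predict.py:

```python
from collections import Counter, deque

class PredictionSmoother:
    """Majority vote over recent predictions to suppress classifier jitter."""

    def __init__(self, history=5, default="stop"):
        self.history = deque(maxlen=history)
        self.default = default

    def update(self, command):
        """Record a new prediction and return the smoothed command."""
        self.history.append(command)
        if len(self.history) < self.history.maxlen:
            return self.default  # not enough evidence yet: stay stopped
        (winner, _count), = Counter(self.history).most_common(1)
        return winner

smoother = PredictionSmoother(history=3)
print(smoother.update("forward"))  # stop (still warming up)
print(smoother.update("forward"))  # stop (still warming up)
print(smoother.update("left"))     # forward (2 of the last 3 votes)
```

The history length trades responsiveness for stability; with a one-second prediction window, a history of 3-5 adds a few seconds of lag but filters most isolated misclassifications.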
- Integrate all components into a single streamlined interface.
- Improve the accuracy of EEG signal classification.
- Add support for additional EEG devices.
- Enhance the VR experience with real-time feedback.
Mindbot/
├── script/
│ ├── main.py # Main script for recording EEG data
│ ├── ESPcar/ESPcar.ino # ESP32 code for RC car control
│ ├── ESPstream/ESPstream.ino # ESP32-CAM code for video streaming
│ ├── EEGNet_Training.ipynb # Notebook for training the EEGNet model
├── BCI_predict.py # Script for real-time EEG prediction and car control
├── .gitignore # Git ignore file
└── README.md # Project documentation
This project is open-source and available under the MIT License. Feel free to contribute and improve the project!
