# Facial Emotion Recognition using face-api.js

## Overview

This project implements Facial Emotion Recognition using face-api.js, a JavaScript library built on TensorFlow.js. Users can either upload an image or use their webcam to detect facial expressions directly in the browser. The detected emotions are displayed on the HTML canvas in real-time.

The project is simple and runs entirely in the browser, with no need for backend services or model training. Models are fetched from face-api.js's online repository, making it lightweight and easy to set up.

## Features

- Real-Time Emotion Detection: Recognizes emotions using a webcam stream or uploaded images.
- Face Detection: Detects faces in the browser and overlays emotion labels.
- Multiple Emotions Supported: Detects happiness, sadness, anger, surprise, and neutral expressions.
- Client-Side Processing: No server needed; runs entirely on the client side using JavaScript.

## Table of Contents

- [Installation](#installation)
- [Usage](#usage)
- [How It Works](#how-it-works)
- [File Structure](#file-structure)
- [Technologies Used](#technologies-used)
- [Contributing](#contributing)
- [License](#license)

## Installation

### Prerequisites

- A modern web browser (Google Chrome, Firefox, etc.)
- Basic understanding of JavaScript, HTML, and face-api.js

### Steps to Run Locally

1. Clone the repository:

   ```bash
   git clone https://github.com/Web3snipe/facial-emotion-recognition.git
   cd facial-emotion-recognition
   ```

2. Open index.html in a browser: simply open the `index.html` file in your preferred web browser to start using the application.

3. Internet connection required: the project fetches the pre-trained models from GitHub, so make sure you have an active internet connection when running the application.

## Usage

### Upload an Image for Emotion Detection

1. On the webpage, click the "Upload Image" button.
2. Select an image that contains a face.
3. The system will detect the face and display the predicted emotions on the image canvas (a sketch of how such an upload handler can be written follows this list).
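
The snippet below is a minimal, illustrative sketch of an upload handler built on the face-api.js calls this project relies on; the element IDs (`imageUpload`, `overlay`) are assumptions for illustration and may not match the names used in `main.js`.

```javascript
// Hypothetical upload handler; element IDs are assumptions, not taken from main.js.
const imageUpload = document.getElementById('imageUpload');
const overlay = document.getElementById('overlay');

imageUpload.addEventListener('change', async () => {
  // Turn the selected file into an HTMLImageElement that face-api.js can process.
  const image = await faceapi.bufferToImage(imageUpload.files[0]);

  // Detect one face and its expression probabilities.
  const detection = await faceapi
    .detectSingleFace(image, new faceapi.TinyFaceDetectorOptions())
    .withFaceExpressions();

  if (!detection) return; // no face found in the image

  // Scale the result to the canvas size and draw the box plus emotion labels.
  const displaySize = { width: image.width, height: image.height };
  faceapi.matchDimensions(overlay, displaySize);
  const resized = faceapi.resizeResults(detection, displaySize);
  faceapi.draw.drawDetections(overlay, resized);
  faceapi.draw.drawFaceExpressions(overlay, resized);
});
```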

### Real-Time Webcam Emotion Detection

1. Click the "Start Webcam" button.
2. Grant permission to access your webcam.
3. The system will detect your face in real time and display the emotions on the canvas overlay (a sketch of how the webcam stream can be started follows this list).
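
A rough sketch of starting the webcam with the standard `getUserMedia` API; the `video` element ID and function name are assumptions for illustration:

```javascript
// Hypothetical "Start Webcam" handler; the element ID is an assumption.
const video = document.getElementById('video');

async function startWebcam() {
  try {
    // Request camera access; the browser prompts the user for permission.
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    video.srcObject = stream;
    await video.play();
  } catch (err) {
    console.error('Webcam access was denied or is unavailable:', err);
  }
}
```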

### Supported Emotions

The system can classify and display the following emotions:

- Happy 😀
- Sad 😢
- Angry 😡
- Surprised 😮
- Neutral 😐

## How It Works

This application uses face-api.js to load pre-trained models for facial detection and emotion recognition. The following steps outline the main functionality:

1. Face Detection: face-api.js detects faces in the uploaded image or webcam stream.
2. Expression Analysis: The facial expression recognition model analyzes each detected face and classifies it into one of several emotion categories (a small sketch of this classification step follows the list).
3. Visualization: The detected emotions are overlaid onto the image or video stream using an HTML canvas.
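
For example, each detected face comes with an `expressions` object that maps emotion labels to probabilities, from which the dominant emotion can be picked. The snippet below is an illustrative sketch, not code copied from `main.js`:

```javascript
// detection.expressions maps each emotion label to a probability, for example:
// { neutral: 0.01, happy: 0.97, sad: 0.00, angry: 0.00, surprised: 0.02, ... }
function dominantEmotion(expressions) {
  // Pick the label with the highest probability.
  return Object.entries(expressions)
    .reduce((best, current) => (current[1] > best[1] ? current : best))[0];
}

// Example: dominantEmotion(detection.expressions) might return "happy".
```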

### Key Functions in JavaScript (main.js)

Model Loading:

```javascript
faceapi.nets.tinyFaceDetector.loadFromUri('/models'),
faceapi.nets.faceExpressionNet.loadFromUri('/models')
```

Models for face detection and emotion recognition are loaded from a public GitHub repository.

Image Upload: When an image is uploaded, the face is detected, and the emotion is analyzed and displayed on the canvas.

Webcam Stream: The webcam feed is processed frame-by-frame to detect facial expressions in real-time.
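
A condensed sketch of what such a per-frame loop typically looks like with face-api.js; the 100 ms interval and the element IDs are assumptions for illustration:

```javascript
// Hypothetical per-frame detection loop; interval and element IDs are assumptions.
video.addEventListener('play', () => {
  const canvas = document.getElementById('overlay');
  const displaySize = { width: video.videoWidth, height: video.videoHeight };
  faceapi.matchDimensions(canvas, displaySize);

  setInterval(async () => {
    // Detect every face in the current frame together with its expressions.
    const detections = await faceapi
      .detectAllFaces(video, new faceapi.TinyFaceDetectorOptions())
      .withFaceExpressions();

    // Clear the previous overlay and draw the new boxes and emotion labels.
    const resized = faceapi.resizeResults(detections, displaySize);
    canvas.getContext('2d').clearRect(0, 0, canvas.width, canvas.height);
    faceapi.draw.drawDetections(canvas, resized);
    faceapi.draw.drawFaceExpressions(canvas, resized);
  }, 100);
});
```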

## File Structure

Here is the file structure for this project:

```
facial-emotion-recognition-face-api/
│
├── models/        # Folder to hold face-api models fetched from GitHub
├── index.html     # Main HTML page with buttons to upload an image or start the webcam
├── main.js        # JavaScript for face detection and emotion recognition
├── styles.css     # CSS for styling the page (optional)
└── README.md      # Project documentation (this file)
```

### index.html

This is the main entry point for the project. It contains buttons to start the webcam or upload images, and renders the HTML canvas for displaying emotions.
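
A minimal page along these lines might look as follows; the CDN script URL and the element IDs are assumptions for illustration and may differ from the markup in this repository:

```html
<!-- Hypothetical minimal index.html; CDN URL and element IDs are assumptions. -->
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8" />
    <script defer src="https://cdn.jsdelivr.net/npm/face-api.js/dist/face-api.min.js"></script>
    <script defer src="main.js"></script>
    <link rel="stylesheet" href="styles.css" />
  </head>
  <body>
    <input type="file" id="imageUpload" accept="image/*" />
    <button id="startWebcam">Start Webcam</button>
    <video id="video" autoplay muted playsinline></video>
    <canvas id="overlay"></canvas>
  </body>
</html>
```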

### main.js

Handles the logic for loading models, detecting faces, and classifying emotions. It also contains functions to capture the webcam feed or uploaded image, process it, and update the HTML canvas with emotion predictions.

```javascript
// Load face-api models from GitHub
Promise.all([
  faceapi.nets.tinyFaceDetector.loadFromUri('/models'),
  faceapi.nets.faceExpressionNet.loadFromUri('/models')
]).then(enableInteraction);

// Functionality for webcam or image-based emotion recognition...
```

### styles.css

(Optional) You can customize the appearance of the page by updating the CSS file.

## Technologies Used

- face-api.js: JavaScript library for facial recognition and emotion detection.
- TensorFlow.js: Used behind the scenes by face-api.js for model handling.
- HTML5: Structure of the webpage.
- JavaScript: Handles the logic for face detection and emotion recognition.
- CSS: (Optional) Used for styling the webpage.

## Contributing

Contributions are welcome! Here’s how you can get involved:

1. Fork the repository.
2. Create a new branch for your feature (`git checkout -b new-feature`).
3. Commit your changes (`git commit -m 'Add new feature'`).
4. Push to the branch (`git push origin new-feature`).
5. Open a pull request.

Feel free to raise issues for bugs, feature requests, or suggestions.

## License

This project is licensed under the MIT License - see the LICENSE file for details.
