Emotion Detection from Video using YOLOv8 and ResNet18
This project detects human emotions from uploaded videos using a combination of YOLOv8 for face detection and a ResNet18-based CNN classifier trained on the FER2013 dataset.
Features
Detect faces in videos using YOLOv8.
Predict emotions such as happy, sad, fear, and surprise using a ResNet18 model (a rough inference sketch follows this list).
Upload and visualize the results via a clean Streamlit web UI.
Export processed videos with annotated face boxes and predicted emotions.
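As a rough illustration of how the pieces fit together, here is a minimal per-frame sketch that chains YOLOv8 face detection with the ResNet18 classifier. The file paths, the `EMOTIONS` ordering, the 224×224 input size, and the assumption that `resnet18_emotion.pth` holds a state_dict are guesses based on this README, not the project's actual code (see `src/inference_pipeline.py` for the real implementation).

```python
# Illustrative sketch only; names and paths are assumptions, not the project's API.
import cv2
import torch
import torch.nn as nn
from torchvision import models, transforms
from ultralytics import YOLO

EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

# Face detector: YOLOv8 weights fine-tuned for faces.
face_detector = YOLO("models/yolov8n-face.pt")

# Emotion classifier: ResNet18 with a 7-class head (assumes a saved state_dict).
emotion_model = models.resnet18()
emotion_model.fc = nn.Linear(emotion_model.fc.in_features, len(EMOTIONS))
emotion_model.load_state_dict(torch.load("models/resnet18_emotion.pth", map_location="cpu"))
emotion_model.eval()

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Grayscale(num_output_channels=3),  # FER2013 images are grayscale
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def predict_emotions(frame):
    """Return (box, label) pairs for every face detected in a BGR video frame."""
    results = face_detector(frame, verbose=False)
    predictions = []
    for x1, y1, x2, y2 in results[0].boxes.xyxy.cpu().numpy().astype(int):
        face = frame[y1:y2, x1:x2]
        if face.size == 0:
            continue
        tensor = preprocess(cv2.cvtColor(face, cv2.COLOR_BGR2RGB)).unsqueeze(0)
        with torch.no_grad():
            label = EMOTIONS[emotion_model(tensor).argmax(dim=1).item()]
        predictions.append(((x1, y1, x2, y2), label))
    return predictions
```

In practice the Streamlit app would call something like this on each frame read from the uploaded video, draw the boxes and labels with OpenCV, and write the annotated frames back out as the exported video.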
Emotions Detected
Angry
Disgust
Fear
Happy
Sad
Surprise
Neutral
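Note that if the FER2013 folders are loaded with torchvision's `ImageFolder` (as the `datasets/` layout below suggests), the class-to-index mapping comes from the folder names sorted alphabetically, which may differ from the order listed above. A quick check, with the dataset path as an assumption:

```python
from torchvision import datasets

# Assumed dataset path; adjust to the actual FER2013 folder layout.
train_set = datasets.ImageFolder("datasets/train")
print(train_set.class_to_idx)
# e.g. {'angry': 0, 'disgust': 1, 'fear': 2, 'happy': 3, 'neutral': 4, 'sad': 5, 'surprise': 6}
```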
Project Structure
```
emotion_yolo_project/
├── models/
│   ├── yolov8n-face.pt           # YOLOv8 face detection model
│   └── resnet18_emotion.pth      # Trained ResNet18 model for emotion classification
├── datasets/                     # FER2013 dataset (organized by class folders)
├── src/
│   ├── emotion_model.py          # Emotion classification logic
│   ├── inference_pipeline.py     # Face detection + emotion pipeline
│   └── train_emotion_model.py    # Training script for ResNet18
├── streamlit_app.py              # Streamlit UI code
├── requirements.txt              # Python dependencies
└── Dockerfile                    # Docker build instructions
```
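For context on `train_emotion_model.py`, the sketch below shows one way to fine-tune ResNet18 for the seven emotion classes: a pretrained backbone with its final layer replaced, trained on the FER2013 class folders. The hyperparameters, paths, and grayscale-to-RGB preprocessing are illustrative assumptions, not the project's exact settings.

```python
# Hypothetical training sketch; values and paths are assumptions based on this README.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),  # FER2013 images are grayscale
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("datasets/train", transform=transform)
loader = DataLoader(train_set, batch_size=64, shuffle=True, num_workers=2)

# Pretrained ResNet18 with the 1000-class head swapped for the emotion classes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(10):
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1}: loss {loss.item():.4f}")

torch.save(model.state_dict(), "models/resnet18_emotion.pth")
```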
Installation
Option 1: Run Locally
```bash
# Clone the repository
git clone https://github.com/adityawalture/emotion_yolo_project.git
cd emotion_yolo_project
```
```bash
# Create virtual environment
python -m venv venv
source venv/bin/activate   # or venv\Scripts\activate on Windows
```