
Real-Time Seven Emotions Percentage Calculation using Python, HTML, CSS, and JavaScript

By Ashish Kumar

A sequential deep learning model is used to detect the user's emotions from faces scanned in real time by the live camera.

Emotion plays an important role in every human being's life, and successful people understand its importance well. There is also a well-known link between music and emotion, and emotion recognition is used in medical science and several other fields.

So, we have designed "Real-Time Seven Emotions Percentage Calculation using Python, HTML, CSS, and JavaScript". A sequential deep learning model is used to detect the user's emotions from faces scanned in real time by the live camera. The face is located in each frame using a Haar cascade classifier, the saved sequential model predicts the emotion from the scanned face, and the percentages of the seven emotions are updated immediately. HTML and CSS are used to design the user interface, JavaScript is used mainly to make asynchronous calls and update the percentages of the seven emotions, and Python is used mainly to load and execute the deep learning model.
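Python and JavaScript are connected through the Eel library. Below is a minimal sketch of how this bridge works; the function say_hello is a placeholder for illustration and is not part of this packet:

import eel

# Folder containing the HTML/CSS/JS files to serve
eel.init('user_interface')

@eel.expose  # makes this Python function callable from JavaScript
def say_hello(name):
    return 'Hello, ' + name

# Serves main.html; the JavaScript side can then call:
#   let reply = await eel.say_hello('world')();
eel.start('main.html')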

Programming Languages Used

Python

JavaScript

HTML

CSS

How to run the code?

First of all, download and unzip the attached file. Then, download the model file from the link "https://drive.google.com/file/d/18D6HOT1bsHUEGnkSnG8l-C0MSdlexEJm/view?usp=sharing" and move it into the fake_emotion folder. Then, open the Anaconda PowerShell Prompt, go to the fake_emotion folder, and type the command "python vedeo_pred.py" to run the system.
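Assuming the required packages are not already installed in the Anaconda environment, a typical sequence of commands would look like the following (the package names are the usual PyPI ones; versions may need adjusting for your setup):

pip install eel opencv-python keras tensorflow
cd fake_emotion
python vedeo_pred.py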

Explanation of the Important Parts of the Code:

import cv2
import numpy as np
import argparse
from keras.models import model_from_json
from keras.preprocessing import image
import eel
import os
import requests
import time

# Point Eel at the folder containing the HTML/CSS/JS user interface
eel.init('user_interface')

neutral = 1
happiness = 0
surprise = 0
sadness = 0
anger = 0
disgust = 0
fear = 0

@eel.expose
def detect_emotion():
    # Load the model architecture from the JSON file
    json_file = open('model.json', 'r')
    loaded_model_json = json_file.read()
    json_file.close()
    model = model_from_json(loaded_model_json)

    # Load the trained weights into the model
    model.load_weights('model.h5')

    # Haar cascade face detector and the default webcam
    face_haar_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')
    cap = cv2.VideoCapture(0)
    while True:
        ret, img = cap.read()
        if not ret:
            break

        gray_img = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        faces_detected = face_haar_cascade.detectMultiScale(gray_img, 1.1, 6, minSize=(150, 150))

        for (x, y, w, h) in faces_detected:
            cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), thickness=2)
            # Crop the face region and resize it to the 48x48 input the model expects
            roi_gray = gray_img[y:y + h, x:x + w]
            roi_gray = cv2.resize(roi_gray, (48, 48))
            img_pixels = image.img_to_array(roi_gray)
            img_pixels = np.expand_dims(img_pixels, axis=0)
            img_pixels /= 255.0

            predictions = model.predict(img_pixels)
            max_index = int(np.argmax(predictions))
            emotions = ['neutral', 'happiness', 'surprise', 'sadness', 'anger', 'disgust', 'fear']
            global predicted_emotion
            predicted_emotion = emotions[max_index]

            cv2.putText(img, predicted_emotion, (int(x), int(y)), cv2.FONT_HERSHEY_SIMPLEX, 0.75, (255, 255, 255), 2)

            resized_img = cv2.resize(img, (1000, 700))
            cv2.imshow('Facial Emotion Recognition', resized_img)
            # Return this frame's prediction so the JavaScript caller can update its tallies
            return predicted_emotion
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

eel.start('main.html')

The above Python code loads the model and scans the live camera feed for a face. The emotion is predicted, and a window shows the scanned image with a rectangle around the face and the predicted emotion as a text label. Finally, eel.start('main.html') starts a local server that serves the user interface from the main.html file.
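Note that model.predict already returns a score for each of the seven classes, not just the winning one. As a hypothetical side sketch, continuing from the code above and assuming the model's outputs follow the same class order as the emotions list, the raw per-class scores for one face could be printed like this:

# Hypothetical snippet, not part of the packet: print the per-class scores
# for one preprocessed face (img_pixels as prepared above)
emotions = ['neutral', 'happiness', 'surprise', 'sadness', 'anger', 'disgust', 'fear']
predictions = model.predict(img_pixels)  # shape (1, 7)
for label, score in zip(emotions, predictions[0]):
    print(label, round(float(score) * 100, 2), '%')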

 

let emotion = "";
let data = "";
let status = true;
let neutral = 0;
let happiness = 0;
let surprise = 0;
let sadness = 0;
let anger = 0;
let disgust = 0;
let fear = 0;
let total = 0;

async function run() {
    while (status) {
        // Ask Python for the next detected emotion (detect_emotion is exposed via Eel)
        emotion = await eel.detect_emotion()();
        if (emotion == 'neutral') {
            neutral = neutral + 1;
        } else if (emotion == 'happiness') {
            happiness = happiness + 1;
        } else if (emotion == 'surprise') {
            surprise = surprise + 1;
        } else if (emotion == 'sadness') {
            sadness = sadness + 1;
        } else if (emotion == 'anger') {
            anger = anger + 1;
        } else if (emotion == 'disgust') {
            disgust = disgust + 1;
        } else if (emotion == 'fear') {
            fear = fear + 1;
        }
        total = neutral + happiness + surprise + sadness + anger + disgust + fear;
        document.getElementById('detected_emotion').innerHTML = "Emotion detected most recently => " + emotion;
        if (total > 0) {
            document.getElementById('neutral').innerHTML = "Neutral % = " + (neutral * 100) / total;
            document.getElementById('happiness').innerHTML = "Happiness % = " + (happiness * 100) / total;
            document.getElementById('surprise').innerHTML = "Surprise % = " + (surprise * 100) / total;
            document.getElementById('sadness').innerHTML = "Sadness % = " + (sadness * 100) / total;
            document.getElementById('anger').innerHTML = "Anger % = " + (anger * 100) / total;
            document.getElementById('disgust').innerHTML = "Disgust % = " + (disgust * 100) / total;
            document.getElementById('fear').innerHTML = "Fear % = " + (fear * 100) / total;
        }
        await sleep(1000); // wait one second before the next detection
    }
}

// Toggle detection on and off
function pause() {
    if (status === true) {
        status = false;
    } else {
        status = true;
        run();
    }
}

// Promise-based delay; unlike a busy-wait loop, this does not block the event loop
function sleep(milliseconds) {
    return new Promise(resolve => setTimeout(resolve, milliseconds));
}

The above JavaScript code repeatedly calls the exposed Python function, keeps a running count of each detected emotion, and updates the percentage of each of the seven emotions on the page in real time. For example, if neutral has been detected 12 times out of total = 20 detections, the page shows Neutral % = 12 * 100 / 20 = 60.
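The same running-tally logic can also be sketched in Python (a hypothetical re-expression of the JavaScript above for clarity, not code from the packet):

# Hypothetical re-expression of the JavaScript tally logic
counts = {e: 0 for e in ['neutral', 'happiness', 'surprise', 'sadness', 'anger', 'disgust', 'fear']}

def update(emotion):
    # Record one detection and return the current percentage of each emotion
    counts[emotion] += 1
    total = sum(counts.values())
    return {e: c * 100 / total for e, c in counts.items()}

print(update('neutral'))    # neutral: 100.0, all others 0.0
print(update('happiness'))  # neutral: 50.0, happiness: 50.0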

The HTML and CSS code used is self-explanatory.

Thank You! 

 
