FacialEmotionAnalysis

This program identifies a person's emotion from a live video stream. The model is inspired by a research paper in which the authors propose an efficient network for facial expression and gender classification (https://arxiv.org/pdf/1710.07557.pdf).

Code Requirements

Execute the command below from your terminal to install all the required dependencies.

pip install -r requirement.txt

Model Description

This program uses a network based on the Xception architecture (https://arxiv.org/pdf/1710.07557.pdf), which combines depthwise separable convolutions with several residual blocks. I trained the network for 100 epochs, which took about 48 hours on a Mac with 16 GB of RAM. However, I strongly recommend training the model on Google Colab with a GPU, where the same training took me about 45 minutes. 48 hours down to 45 minutes :D
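The sketch below illustrates the kind of residual, depthwise-separable block that the referenced paper (mini-Xception) builds on. It is a minimal illustration only: the filter counts, the 48x48 grayscale input, and the 7 output classes are assumptions and may not match the exact configuration in training.py.

```python
# Minimal sketch of a mini-Xception-style model built from residual,
# depthwise-separable blocks. Layer sizes are illustrative assumptions,
# not necessarily the exact architecture in training.py.
from tensorflow.keras import layers, Model

def residual_sep_block(x, filters):
    # Projection shortcut so the residual addition has matching shapes
    shortcut = layers.Conv2D(filters, 1, strides=2, padding="same", use_bias=False)(x)
    shortcut = layers.BatchNormalization()(shortcut)

    x = layers.SeparableConv2D(filters, 3, padding="same", use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    x = layers.Activation("relu")(x)
    x = layers.SeparableConv2D(filters, 3, padding="same", use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    x = layers.MaxPooling2D(3, strides=2, padding="same")(x)
    return layers.Add()([x, shortcut])

inputs = layers.Input(shape=(48, 48, 1))       # FER-style grayscale face crops
x = layers.Conv2D(8, 3, padding="same", activation="relu")(inputs)
for f in (16, 32, 64, 128):                    # four residual blocks
    x = residual_sep_block(x, f)
x = layers.Conv2D(7, 3, padding="same")(x)     # 7 emotion classes (assumed)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Activation("softmax")(x)
model = Model(inputs, outputs)
model.summary()
```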

Procedure for using the code

  • To train the model yourself, run python training.py. However, I strongly recommend using the pre-trained model provided with this repository.
  • To use the model on a live stream, run python RealtimeAnalysis.py; it captures video from your webcam and displays the model's predictions (see the sketch below).
  • Note: use the GPU version of the trained model.
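The snippet below sketches what the live-stream step does: read webcam frames, detect faces with OpenCV's Haar cascade, and run the trained model on each face crop. The model filename, the 48x48 input size, and the emotion label list are assumptions for illustration; adjust them to match the files shipped with this repository.

```python
# Hedged sketch of real-time inference; not a copy of RealtimeAnalysis.py.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]
model = load_model("emotion_model.hdf5")  # hypothetical filename
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        # Crop the face, resize to the model's assumed input, and normalize
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
        label = EMOTIONS[int(np.argmax(probs))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("Facial Emotion Analysis", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```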

References

  • O. Arriaga, M. Valdenegro-Toro, P. Plöger, "Real-time Convolutional Neural Networks for Emotion and Gender Classification" (https://arxiv.org/pdf/1710.07557.pdf)
