Live Integrated Facial Observation (L.I.F.O.)

Open Access

Year: 2023 | Volume: 8 | Issue: 2 | Page: 34-43
By

  1. Anjali Kesarwani

  2. Shubhangi Saxena

  3. Ananmay Sinha

  4. Ayush Yadav

  5. Vaibhav Panwar

  1. Student, IILM Academy of Higher Learning, Uttar Pradesh, India
  2. Student, IILM Academy of Higher Learning, Uttar Pradesh, India
  3. Student, IILM Academy of Higher Learning, Uttar Pradesh, India
  4. Student, IILM Academy of Higher Learning, Uttar Pradesh, India
  5. Assistant Professor, IILM Academy of Higher Learning, Uttar Pradesh, India

Abstract

The human face is one of the most distinctive parts of the body and can uniquely identify a person. Using facial characteristics as a biometric, the LIFO system can be applied in many different ways. One of the most routine tasks in any organization is attendance marking. Earlier, people marked their presence on paper; with the advancement of technology, this system has been converted into digital form. In this project, a face recognition approach is implemented using OpenCV. The model integrates a camera that captures an image as input and an algorithm that detects a face in the input image, encodes it, and identifies it; the attendance record is then updated automatically in a CSV file. The system is trained on the faces of authorized students, and a database is created for the trained data. A database of cropped images is built along with the associated labels, and features are extracted using a Haar cascade classifier.
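The abstract names OpenCV and a Haar cascade but does not fix the rest of the toolchain; the sketch below assumes the open-source face_recognition package (built on dlib) for encoding and matching, along with illustrative names such as the known_faces/ folder and attendance.csv. It shows one plausible wiring of the capture, detect, encode, identify, and log steps, not the authors' exact implementation.

import csv
from datetime import datetime
from pathlib import Path

import cv2
import face_recognition  # assumed helper library; not named in the abstract

# Build the database of authorized faces: one encoding per enrolled image.
known_encodings, known_names = [], []
for img_path in Path("known_faces").glob("*.jpg"):   # hypothetical folder
    image = face_recognition.load_image_file(img_path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        known_encodings.append(encodings[0])
        known_names.append(img_path.stem)

# Capture one frame from the integrated camera.
cap = cv2.VideoCapture(0)
ret, frame = cap.read()
cap.release()
if not ret:
    raise SystemExit("camera frame could not be read")

# Detect faces with a Haar cascade classifier, as described in the abstract.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
rects = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Encode each detected face and log recognized students to the CSV record.
rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
locations = [(y, x + w, y + h, x) for (x, y, w, h) in rects]  # (top, right, bottom, left)
encodings = face_recognition.face_encodings(rgb, locations)
with open("attendance.csv", "a", newline="") as f:            # hypothetical file
    writer = csv.writer(f)
    for encoding in encodings:
        matches = face_recognition.compare_faces(known_encodings, encoding)
        if True in matches:
            name = known_names[matches.index(True)]
            writer.writerow([name, datetime.now().isoformat()])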
Beyond attendance, the system can also monitor student behavior through their faces. By analyzing facial expressions, teachers can judge whether students are following their lectures. This is done with a recently developed facial landmark detector, which is trained on large datasets and is robust to different angles relative to the camera. The degree of eye opening is estimated from the landmarks: the landmark detector yields the scalar eye aspect ratio (EAR), which characterizes how open the eyes are in each frame. A support vector machine then detects blinking as a pattern of EAR values and displays the result in a window; from this report, the teacher can record the observed behavior.
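The eye aspect ratio described above is commonly computed from six landmark points per eye: the sum of the two vertical point-to-point distances divided by twice the horizontal distance. The sketch below assumes dlib's 68-point landmark predictor (the shape_predictor_68_face_landmarks.dat model file, obtained separately); it illustrates how a per-frame EAR value could be produced before a support vector machine classifies windows of EAR values as blinks, and is not the authors' exact code.

import numpy as np
import dlib  # assumed landmark-detection library

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")  # assumed model file

def eye_aspect_ratio(eye):
    # eye: six (x, y) landmark points ordered around one eye contour
    a = np.linalg.norm(eye[1] - eye[5])   # first vertical distance
    b = np.linalg.norm(eye[2] - eye[4])   # second vertical distance
    c = np.linalg.norm(eye[0] - eye[3])   # horizontal distance
    return (a + b) / (2.0 * c)

def frame_ear(gray_frame):
    # Return the mean EAR of both eyes for the first detected face, or None.
    faces = detector(gray_frame, 0)
    if not faces:
        return None
    shape = predictor(gray_frame, faces[0])
    pts = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)])
    left, right = pts[42:48], pts[36:42]  # standard 68-point eye indices
    return (eye_aspect_ratio(left) + eye_aspect_ratio(right)) / 2.0

# The abstract feeds the EAR sequence to a support vector machine; one
# plausible setup is a linear SVC over fixed-length windows of EAR values:
# from sklearn.svm import SVC
# clf = SVC(kernel="linear").fit(ear_windows, blink_labels)  # hypothetical training data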
Emotion analysis is also part of this project and is very helpful for teachers: emotions are classified into six classes, namely anger, joy, surprise, disgust, sadness, and fear, from face image datasets. To carry out these operations, cameras can be installed in several settings, such as classrooms, offices, movie auditoriums, and in front of a car, to recognize the emotions of the people present, introducing a potent new form of artificial intelligence into education for monitoring classroom compliance. The project relies on no existing data about students' previous behavior and works on real-time expression tracking.
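The abstract does not specify the emotion model itself; as one illustration only, a small convolutional network over 48x48 grayscale face crops (a common setup for facial-expression datasets) could produce the six-class output described. The architecture, input size, and class order below are assumptions, not the authors' trained model.

from tensorflow.keras import layers, models  # assumed deep-learning framework

EMOTIONS = ["anger", "joy", "surprise", "disgust", "sadness", "fear"]  # six classes from the abstract

# Small illustrative CNN: two conv/pool stages, then a dense classifier head.
model = models.Sequential([
    layers.Input(shape=(48, 48, 1)),            # assumed 48x48 grayscale face crops
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(len(EMOTIONS), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, ...)  # requires a labelled expression dataset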

Keywords: Advancement of technology, artificial intelligence, LIFO, OpenCV, support vector machine

[This article belongs to Journal of Advancements in Robotics (joarb)]

How to cite this article: Anjali Kesarwani, Shubhangi Saxena, Ananmay Sinha, Ayush Yadav, Vaibhav Panwar. Live Integrated Facial Observation (L.I.F.O.). joarb 2023; 8:34-43
How to cite this URL: Anjali Kesarwani, Shubhangi Saxena, Ananmay Sinha, Ayush Yadav, Vaibhav Panwar. Live Integrated Facial Observation (L.I.F.O.). joarb 2023 [cited 2023 Jan 30]; 8:34-43. Available from: https://journals.stmjournals.com/joarb/article=2023/view=97436



Regular Issue Open Access Article
Volume 8
Issue 2
Received July 8, 2021
Accepted September 17, 2021
Published January 30, 2023