Chethana C Viraktamath,
K. Shashank,
Kshama S.D,
Madan N.R,
Sunil Kumar H.R.,
- Student, Department of Computer Science and Engineering, PES Institute of Technology and Management, Shivamogga, Karnataka, India
- Student, Department of Computer Science and Engineering, PES Institute of Technology and Management, Shivamogga, Karnataka, India
- Student, Department of Computer Science and Engineering, PES Institute of Technology and Management, Shivamogga, Karnataka, India
- Student, Department of Computer Science and Engineering, PES Institute of Technology and Management, Shivamogga, Karnataka, India
- Assistant Professor, Department of Computer Science and Engineering, PES Institute of Technology and Management, Shivamogga, Karnataka, India
Abstract
We present an AI-based mock interview platform that bridges the gap between preparation and real interviews. The system evaluates users along three dimensions: emotions, confidence, and knowledge. Emotions are analyzed with a deep convolutional neural network (CNN) that classifies facial expressions. Confidence, a key factor in interviews, is assessed from speech using speech recognition and natural language processing. Knowledge is assessed through online tests in which candidates answer questions and explain their answers. The platform aims to reduce stress and anxiety before actual interviews while improving candidate confidence and performance. The project represents a significant advance in interview preparation, AI technology, and personal development, with the potential to positively affect individuals' career success.
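As an illustration of the CNN-based emotion analysis the abstract describes, the sketch below runs a minimal convolution-plus-softmax forward pass over a face crop. The layer sizes, 48x48 input, emotion labels, and random weights are all illustrative assumptions, not the authors' actual model.

```python
import numpy as np

# Hypothetical emotion classes; the paper does not list the labels it uses.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a grayscale image with one kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(z):
    """Numerically stable softmax over the class logits."""
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(face, kernel, weights, bias):
    """Forward pass: conv -> ReLU -> global average pool -> dense -> softmax."""
    feat = np.maximum(conv2d(face, kernel), 0.0)  # convolution + ReLU
    pooled = feat.mean()                          # global average pooling
    logits = weights * pooled + bias              # dense layer, one logit per class
    return softmax(logits)

rng = np.random.default_rng(0)
face = rng.random((48, 48))            # assumed 48x48 grayscale face crop
kernel = rng.standard_normal((3, 3))   # one untrained 3x3 filter
weights = rng.standard_normal(len(EMOTIONS))
bias = np.zeros(len(EMOTIONS))

probs = classify(face, kernel, weights, bias)
predicted = EMOTIONS[int(np.argmax(probs))]
```

A real system would stack several trained convolutional layers and feed per-frame predictions into the overall interview score; this sketch only shows the shape of the classification step.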
Keywords: Emotion and confidence analysis, artificial intelligence, machine learning, CNN, interview preparation, knowledge assessment.
[This article belongs to Journal of Mechatronics and Automation]
Chethana C Viraktamath, K. Shashank, Kshama S.D, Madan N.R, Sunil Kumar H.R. AI Based Mock Interview Evaluator. Journal of Mechatronics and Automation. 2024; 11(02):1-5. Available from: https://journals.stmjournals.com/joma/article=2024/view=168783

Journal of Mechatronics and Automation
| Volume | 11 |
| Issue | 02 |
| Received | 06/05/2024 |
| Accepted | 20/05/2024 |
| Published | 23/08/2024 |
