System for Prediction of Human Emotions and Depression level with Recommendation of Suitable Therapy

Authors

  • Swati S. Chandurkar, Assistant Professor, Pimpri Chinchwad College of Engineering, Pune, India
  • Shailaja V. Pede, Assistant Professor, Pimpri Chinchwad College of Engineering, Pune, India
  • Shailesh A. Chandurkar, Senior Design Engineer, Sigma Electric Manufacturing Corporation, Pune, India

DOI:

https://doi.org/10.51983/ajcst-2017.6.2.1787

Keywords:

Emotions, Rule-Based Approach, Speech Synthesis, Pitch Detection, Depression, Musical Therapy, Blood Pressure, Voice Prosody, EEG Signals, Decision Fusion, Optimal Weighting, Prediction Algorithm

Abstract

In today’s competitive world, an individual needs to act smartly and take rapid steps to make a place in the competition. Young people outnumber the elderly and contribute substantially to the development of society. This paper presents a methodology to extract emotion from text in real time and add expression to the textual content during speech synthesis using a corpus, an emotion recognition module, and related components. Along with emotion recognition from human textual data, the system analyzes various human body signals, such as blood pressure, EEG signals, and vocal prosody, to predict the level of depression so that a suitable therapy can be suggested using a prediction algorithm. In text analysis, all emotional keywords and emotion modification words are manually defined. To verify the approach, a test was carried out on a set of textual sentences, and preliminary rules were written for 34 different emotions. These rules are used in an automated procedure that assigns emotional state values to words. These values are then used by the speech synthesizer to add emotion to the spoken form of the input sentence. A pitch detection algorithm has been implemented for pitch recognition.
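A minimal sketch of the two mechanisms named in the abstract: rule-based scoring of manually defined emotion keywords and modifier words, and pitch detection by autocorrelation. The lexicon entries, weights, and names (EMOTION_KEYWORDS, MODIFIERS, score_sentence, detect_pitch) are hypothetical placeholders for illustration, not the paper's actual rule set, which covers 34 emotions.

```python
# Illustrative sketch only: (1) rule-based assignment of emotional state
# values to words from manually defined keyword/modifier lists, and
# (2) pitch detection by autocorrelation. All lexicons and weights below
# are invented examples, not the rules used in the paper.

import numpy as np

# --- (1) Rule-based emotion scoring --------------------------------------
# Hypothetical keyword lexicon: word -> (emotion label, base intensity).
EMOTION_KEYWORDS = {
    "happy": ("joy", 0.8),
    "glad": ("joy", 0.6),
    "sad": ("sadness", 0.7),
    "hopeless": ("sadness", 0.9),
    "angry": ("anger", 0.8),
}
# Hypothetical modifier words that scale the intensity of the next word.
MODIFIERS = {"very": 1.5, "slightly": 0.5, "not": -1.0}

def score_sentence(sentence):
    """Accumulate emotion scores for one sentence using the keyword rules."""
    scores = {}
    scale = 1.0
    for word in sentence.lower().split():
        word = word.strip(".,!?")
        if word in MODIFIERS:
            scale = MODIFIERS[word]      # remember modifier for the next word
            continue
        if word in EMOTION_KEYWORDS:
            emotion, value = EMOTION_KEYWORDS[word]
            scores[emotion] = scores.get(emotion, 0.0) + value * scale
        scale = 1.0                      # a modifier does not carry further
    return scores

# --- (2) Autocorrelation pitch detection ----------------------------------
def detect_pitch(frame, sample_rate, fmin=50.0, fmax=500.0):
    """Estimate the fundamental frequency (Hz) of one voiced frame."""
    frame = frame - np.mean(frame)
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lag_min = int(sample_rate / fmax)
    lag_max = int(sample_rate / fmin)
    lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / lag

if __name__ == "__main__":
    print(score_sentence("I am very sad and slightly angry today"))
    # Synthetic 220 Hz tone as a stand-in for a voiced speech frame.
    sr = 16000
    t = np.arange(0, 0.04, 1.0 / sr)
    print(round(detect_pitch(np.sin(2 * np.pi * 220 * t), sr), 1))
```

Running the script prints the accumulated emotion scores for the sample sentence and a pitch estimate close to 220 Hz for the synthetic tone.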

References

"Analysis of Human Body Signals for Prediction of Depression Disorder," in IJCTA, vol. 10, no. 8, pp. 665-672, 2017.

"Healing Hands for Depressed People (D-HH) through Analysis of Human Body Signals to Predict the Level of Depression and Recommendation of Suitable Remedy," in Proceedings of ICCUBEA 2016.

J. Joshi, A. Dhall, R. Goecke, J. F. Cohn, "Relative Body Parts Movement for Automatic Depression Analysis," in Proceedings of IEEE 2013.

L.-S. A. Low, N. C. Maddage, M. Lech, L. Sheeber, N. Allen, "Content Based Clinical Depression Detection in Adolescents," in Proceedings of EUSIPCO 2009.

M. Mahmoud, P. Robinson, "Towards Automatic Analysis of Gestures and Body Expressions in Depression," in Proceedings of EAI International Conference 2016.

E. Moore II, M. Clements, J. Peifer, L. Weiss, "Analysis of Prosodic Speech Variation in Clinical Depression," in Proceedings of IEEE September 2013.

Y. Katyal, S. V. Alur, S. Dwivedi, M. R., "EEG Signal and Video Analysis Based Depression Indication," in Proceedings of IEEE 2014.

S. Alghowinem, R. Goecke, M. Wagner, G. Parker, M. Breakspear, "Eye Movement Analysis for Depression Detection," in Proceedings of IEEE 2013.

K. Kipli, A. Z. Kouzani, M. Joordens, "Computer-aided Detection of Depression from Magnetic Resonance Images," in Proceedings of ICME 2012.

K. E. B. Ooi, M. Lech, N. B. Allen, "Multichannel Weighted Speech Classification System for Prediction of Major Depression in Adolescents," in Proceedings of IEEE February 2013.

Y. Yang, C. Fairbairn, J. F. Cohn, "Detecting Depression Severity from Vocal Prosody," in Proceedings of IEEE 2013.

B. Sun, V. T. Y. Ng, "Analyzing Sentimental Influence of Posts on Social Networks," in Proceedings of IEEE 2014.

V. Knott, C. Mahoney, S. Kennedy, K. Evans, "EEG Power, Frequency, Asymmetry and Coherence in Male Depression," 2010.

A. J. Calder, A. M. Burton, P. Miller, A. W. Young, "A Principal Component Analysis of Facial Expressions," Vision Research, 2011.

J. Joshi, R. Goecke, S. Alghowinem, A. Dhall, M. Wagner, J. Epps, G. Parker, M. Breakspear, "Multimodal Assistive Technologies for Depression Diagnosis and Monitoring," Journal on Multimodal User Interfaces, 2013.

J. Joshi, R. Goecke, M. Breakspear, G. Parker, "Can Body Expressions Contribute to Automatic Depression Analysis?" 2013.

C. Mathers, T. Boerma, D. M. Fat, "The Global Burden of Disease, 2004 Update," 2004.

https://www.disabledworld.com/artman/publish/bloodpressurechart.shtml

S. D. Bhutekar, M. B. Chandak, "Corpus Based Emotion Extraction To Implement Prosody Feature In Speech Synthesis Systems," IJCA, International Journal of Computer & Electronic Research, vol. 1, no. 2, August 2012, pp. 67-75.

https://kdd.ics.uci.edu/databases/eeg/eeg.html

C. Guinn, R. Hubal, "Extracting Emotional Information from the Text of Spoken Dialog," RTI International, 2009.

Z.-J. Chuang, C.-H. Wu, "Multi-Modal Emotion Recognition from Speech and Text," Computational Linguistics and Chinese Language Processing, vol. 9, no. 2, August 2004, pp. 45-62.

Y. Yang, C. Fairbairn, J. F. Cohn, "Detecting Depression Severity from Vocal Prosody," IEEE Transactions on Affective Computing, vol. 4, no. 2, April-June 2013.

S. Bhutekar, M. Chandak, A. Agrawal, "Emotion Extraction: Machine Learning for Text-Based Emotion," in Proceedings published by International Journal of Computer Applications® (IJCA), MPGI National Multi-Conference 2012 (MPGINMC-2012), 7-8 April, 2012, "Recent Trends in Computing," pp. 20-23.

I. Chazanovitz, M. Greenwald, "Text Based Emotion Estimation," Ben-Gurion University of the Negev Department of Computer Science, September 2008.

X. Zhe, D. John, A. C. Boucouvalas, "Emotion Extraction Engine: Expressive Image Generator," Multimedia Communications Research Group, School of Design, Engineering and Computing, Bournemouth University.

Published

07-07-2017

How to Cite

Chandurkar, S. S., Pede, S. V., & Chandurkar, S. A. (2017). System for Prediction of Human Emotions and Depression level with Recommendation of Suitable Therapy. Asian Journal of Computer Science and Technology, 6(2), 5–12. https://doi.org/10.51983/ajcst-2017.6.2.1787