Method and apparatus for determining emotional arousal by speech analysis

Data processing: speech signal processing, linguistics, language – Speech signal processing – For storage or transmission


Details

Type: Reexamination Certificate
Status: active
Patent number: 7,606,701
Current U.S. classifications: C704S217000, C704S270000

ABSTRACT:
An apparatus for determining the emotional arousal of a subject by speech analysis, and an associated method. In the method, a speech sample is obtained and pre-processed into silent and active speech segments, and the active speech segments are divided into strings of equal-length blocks, the blocks having primary speech parameters including pitch and amplitude parameters. A plurality of selected secondary speech parameters indicative of characteristics of equal-pitch, rising-pitch, and falling-pitch trends in the strings of blocks are derived. The secondary speech parameters are compared with predefined, subject-independent values representing non-emotional speech to generate a processing result indicative of emotional arousal, and the generated processing result is output to an output device.
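
The abstract outlines a concrete pipeline: gate out silence, cut active speech into equal-length blocks, estimate per-block pitch, classify block-to-block trends as equal-, rising-, or falling-pitch, and compare the resulting trend profile against subject-independent reference values for non-emotional speech. The Python sketch below illustrates one possible reading of that pipeline. It is not the patented implementation: the frame size, energy threshold, trend tolerance, the autocorrelation pitch estimator, and the NEUTRAL_REF values are all illustrative assumptions, and only the pitch path (not the amplitude parameters) is sketched here.

```python
# Minimal sketch of the pipeline described in the abstract; all thresholds,
# block sizes, and reference values below are illustrative assumptions,
# not values taken from the patent.
import numpy as np

def split_active(signal, rate, frame_ms=20, energy_thresh=1e-4):
    """Keep only active speech blocks, dropping frames below an assumed energy gate."""
    n = int(rate * frame_ms / 1000)
    frames = [signal[i:i + n] for i in range(0, len(signal) - n, n)]
    return [f for f in frames if np.mean(f ** 2) > energy_thresh]

def block_pitch(frame, rate, fmin=75.0, fmax=400.0):
    """Crude autocorrelation pitch estimate (Hz) for one equal-length block."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(rate / fmax), int(rate / fmin)
    lag = lo + np.argmax(ac[lo:hi])
    return rate / lag

def pitch_trends(pitches, tol=0.02):
    """Label each block-to-block step as an equal-, rising-, or falling-pitch trend."""
    labels = []
    for p0, p1 in zip(pitches, pitches[1:]):
        r = (p1 - p0) / p0
        labels.append("equal" if abs(r) <= tol else "rising" if r > 0 else "falling")
    return labels

def secondary_parameters(labels):
    """Secondary parameters: the share of each trend across the string of blocks."""
    n = max(len(labels), 1)
    return {k: labels.count(k) / n for k in ("equal", "rising", "falling")}

# Hypothetical subject-independent trend profile for non-emotional speech.
NEUTRAL_REF = {"equal": 0.55, "rising": 0.22, "falling": 0.23}

def arousal_score(signal, rate):
    """Distance of the trend profile from the neutral reference as an arousal cue."""
    frames = split_active(np.asarray(signal, dtype=float), rate)
    pitches = [block_pitch(f, rate) for f in frames]
    params = secondary_parameters(pitch_trends(pitches))
    return sum(abs(params[k] - NEUTRAL_REF[k]) for k in NEUTRAL_REF)
```

For a mono float array `samples` recorded at 16 kHz, `arousal_score(samples, 16000)` returns the L1 distance of the measured trend profile from the assumed neutral profile; under these made-up reference values, a larger distance would be read as stronger emotional arousal.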

REFERENCES:
patent: 5642466 (1997-06-01), Narayan
patent: 5860064 (1999-01-01), Henton
patent: 5995924 (1999-11-01), Terry
patent: 6101470 (2000-08-01), Eide et al.
patent: 6151571 (2000-11-01), Petrushin
patent: 6173260 (2001-01-01), Slaney
patent: 6226614 (2001-05-01), Mizuno et al.
patent: 6477491 (2002-11-01), Chandler et al.
patent: 7222075 (2007-05-01), Petrushin
patent: 9520216 (1995-07-01), None
patent: 9931653 (1999-06-01), None
patent: 0116570 (2001-05-01), None
R. Cowie, E. Douglas-Cowie, N. Tsapatsoulis, G. Votsis, S. Kollias, W. Fellenz, and J. Taylor, "Emotion recognition in human-computer interaction", IEEE Signal Processing Magazine, vol. 18, pp. 32-80, Jan. 2001.
A. Batliner, K. Fischer, R. Huber, J. Spilker, and E. Noth, "Desperately seeking emotions, or: Actors, wizards, and human beings", in Proceedings of the ISCA Workshop on Speech and Emotion, pp. 195-200, Belfast, Sep. 2000.
S. McGilloway, R. Cowie, E. Douglas-Cowie, S. Gielen, M. Westerdijk, and S. Stroeve, "Automatic recognition of emotion from voice: a rough benchmark", in Proceedings of the ISCA Workshop on Speech and Emotion, pp. 207-212, Belfast, Sep. 2000.
G. Klasmeyer, "An automatic description tool for time contours and long-term average voice features in large emotional speech databases", in Proceedings of the ISCA Workshop on Speech and Emotion, pp. 66-71, Belfast, Sep. 2000.
A. Paeschke, M. Kienast, and W. F. Sendlmeier, "F0-Contours in Emotional Speech", in Proc. ICPhS, San Francisco, vol. 2, pp. 929-931, 1999.
A. Paeschke and W. F. Sendlmeier, "Prosodic characteristics of emotional speech: measurements of fundamental frequency movements", in Proceedings of the ISCA ITRW on Speech and Emotion, pp. 75-80, Newcastle, Sep. 2000.
F. Dellaert, T. Polzin, and A. Waibel, "Recognizing emotion in speech", in H. T. Bunnell and W. Idsardi, editors, Proc. ICSLP, vol. 3, pp. 1970-1973, Philadelphia, Oct. 1996.
L. Devillers, I. Vasilescu, and L. Vidrascu, "F0 and pause features analysis for anger and fear detection in real-life spoken dialogs", in Proc. Speech Prosody 2004, Nara, pp. 205-208, 2004.
