Toward a self-adapting resource-restricted voice-based Classification of Naturalistic Interaction Stages

Affiliation
Otto-von-Guericke-Universität Magdeburg
Weißkirchen, Norman; Böck, Ronald
For the implementation of a user-oriented assistance system, or an advanced human-computer interface, the computerized part needs a wide range of information about the current user. Adaptation and personalization of the user experience rest not only on the recognition of control commands, which current voice-controlled interfaces capable of interpreting the user's syntax can already handle, but also on the non-syntactic information conveyed through vocal affect or body language. Current research therefore aims to advance the ability of computer-assisted systems to recognize emotions as well as more general user states. This provides the anticipative and cooperative behavior needed for assistive applications, which can then adapt to the specific needs a user may have concerning their personal state and level of mental involvement. To facilitate this, classifiers must operate on highly relevant features suitable for discriminating user states. The features used can change considerably between classification tasks, which hinders the definition of a general feature set. At the same time, such systems often need to run as mobile applications, which restricts the available computational power, and to operate in real time, which limits the complexity of the classifier and of its applicable feature set, since resource-restricted hardware is used in such applications. Our aim is to provide an alternative approach that achieves results similar to those of more complex methods by using simpler architectures with a relatively small dataset. The approach is inspired by comparable studies on minimizing the feature set, as discussed in this paper.
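The feature-set minimization mentioned above can be illustrated with a simple filter method: rank a large pool of acoustic features by their relevance to the target user state and keep only the top few, so that a lightweight classifier on restricted hardware has less to compute. The sketch below uses a correlation-based ranking on synthetic data; the feature pool, labels, and the choice of correlation as the relevance measure are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

# Synthetic stand-in data: 200 utterances, each described by 50
# acoustic features (e.g., prosodic or spectral descriptors).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))

# Synthetic binary user state that depends only on features 3 and 7,
# so a good filter should recover exactly those as most relevant.
y = (X[:, 3] + X[:, 7] > 0).astype(float)

# Rank features by absolute Pearson correlation with the label.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
corr = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))

# Keep only the five most relevant features for the downstream classifier.
top5 = np.argsort(corr)[-5:]
print(sorted(top5.tolist()))
```

In a real pipeline, only the selected feature subset would be extracted and fed to the classifier at runtime, reducing both feature-extraction cost and model complexity on resource-restricted devices.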
Rights

Use and reproduction:
All rights reserved