Auditory mood detection for social and educational robots (2008, eng)
Pasadena, CA: IEEE, 05/2008. ISBN 978-1-4244-1646-2.

Abstract:

Social robots face the fundamental challenge of detecting the current social mood and adapting their behavior to it. For example, robots that assist teachers in early education must choose different behaviors depending on whether the children are crying, laughing, sleeping, or singing songs. Interactive robotic applications require perceptual algorithms that both run in real time and adapt to the challenging conditions of daily life. This paper explores a novel approach to auditory mood detection that grew out of our experience immersing social robots in classroom environments. We propose a new set of low-level spectral contrast features that extends a class of features which has proven very successful for object recognition in the modern computer vision literature. Features are selected and combined using machine learning approaches to make decisions about the ongoing auditory mood. We demonstrate excellent performance on two standard emotional speech databases (the Berlin Emotional Speech database [F. Burkhardt et al., 2005] and the ORATOR dataset [H. Quast, 2001]). In addition, we establish strong baseline performance for mood detection on a database collected from a social robot immersed in a classroom of 18- to 24-month-old children [J. Movellan et al., 2007]. The approach operates in real time at little computational cost and has the potential to greatly enhance the effectiveness of social robots in daily-life environments.
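
The abstract does not spell out the feature formulation; as a rough illustration only, the sketch below computes standard octave-based spectral contrast for one audio frame (log peak-to-valley magnitude difference per frequency band). The band edges, number of bands, and `alpha` fraction are illustrative defaults, not values from the paper.

```python
import numpy as np

def spectral_contrast(frame, sr, n_bands=6, alpha=0.02):
    """Peak-to-valley spectral contrast per octave band for one frame.

    `n_bands`, `alpha`, and the 200 Hz starting edge are illustrative
    assumptions, not parameters taken from the paper.
    """
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    # Octave-spaced band edges, clamped to the Nyquist frequency.
    edges = 200.0 * 2.0 ** np.arange(n_bands + 1)
    edges[-1] = min(edges[-1], sr / 2)
    contrasts = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = np.sort(spectrum[(freqs >= lo) & (freqs < hi)])
        if band.size == 0:
            contrasts.append(0.0)
            continue
        k = max(1, int(alpha * band.size))
        # Log-mean of the weakest and strongest alpha-fraction of bins.
        valley = np.log(band[:k].mean() + 1e-10)
        peak = np.log(band[-k:].mean() + 1e-10)
        contrasts.append(peak - valley)
    return np.array(contrasts)
```

Per-frame vectors like this could then be pooled over time and fed to a classifier, in the spirit of the feature-selection-plus-combination pipeline the abstract describes.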

Keywords: auditory mood detection; Computer vision; educational robot; Educational robots; Emotion recognition; emotional speech database; face detection; hearing; interactive robotic application; learning (artificial intelligence); Machine Learning; Mood Prototypes; object recognition; Robotics and Automation Robots; social mood; social robot; Speech; USA Councils
Authors: Ruvolo, P.; Fasel, I.; Movellan, J.
URL: https://rubi.ucsd.edu/content/auditory-mood-detection-social-and-educational-robots

Automatic cry detection in early childhood education settings (2008, eng)
Monterey, CA: IEEE, 08/2008. ISBN 978-1-4244-2661-4.

Abstract:

We present results on applying a novel machine learning approach for learning auditory moods in natural environments [1] to the problem of detecting crying episodes in preschool classrooms. The resulting system achieved levels of performance approaching those of human coders and significantly outperformed previous approaches to this problem [2].

Keywords: Acoustic noise; auditory moods; automatic cry detection; behavioural sciences computing; Deafness; early childhood education settings; education; Educational robots; Emotion recognition; human coders; Humans; learning (artificial intelligence); Machine Learning; Mood; preschool classrooms; Prototypes; Robustness; Working environment noise
Authors: Ruvolo, P.; Movellan, J.
URL: https://rubi.ucsd.edu/content/automatic-cry-detection-early-childhood-education-settings

Building a more effective teaching robot using apprenticeship learning (2008, eng)
Monterey, CA: IEEE, 08/2008. ISBN 978-1-4244-2661-4.

Abstract:

What defines good teaching? While attributes such as timing, responsiveness to social cues, and pacing of material clearly play a role, it is difficult to create a comprehensive specification of what it means to be a good teacher. On the other hand, it is relatively easy to obtain examples of expert teaching behavior by observing a real teacher. With this inspiration as our guide, we investigated apprenticeship learning methods [1] that use data recorded from expert teachers as a means of improving the teaching abilities of RUBI, a social robot immersed in a classroom of 18- to 24-month-old children. While this approach has achieved considerable success in mechanical control, such as automated helicopter flight [2], until now there has been little work on applying it to the field of social robotics. This paper explores two particular approaches to apprenticeship learning, and analyzes the models of teaching that each approach learns from the data of the human teacher. Empirical results indicate that the apprenticeship learning paradigm, though still nascent in its use in the social robotics field, holds promise, and that our proposed methods can already extract meaningful teaching models from demonstrations of a human expert.
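
As a minimal, hypothetical illustration of the apprenticeship-learning idea described above, the sketch below clones a toy "expert teacher" policy from demonstrations via behavioral cloning (the simplest baseline in this family). The state features, the expert rule, and every parameter are invented for this example; the paper's actual classroom-state representation and learning methods are richer.

```python
import numpy as np

# Behavioral cloning: fit a classifier mapping observed state features
# to the expert's action, then use it as the robot's policy.
rng = np.random.default_rng(0)

# Toy demonstrations (all invented). Feature 0: child engagement,
# feature 1: time since last prompt, both scaled to [0, 1].
X = rng.uniform(0, 1, size=(200, 2))
# Hypothetical expert: prompt (action 1) when engagement is low and
# enough time has passed; otherwise wait (action 0).
y = ((X[:, 0] < 0.4) & (X[:, 1] > 0.5)).astype(int)

# Fit a logistic-regression policy by batch gradient descent.
Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w -= 0.5 * Xb.T @ (p - y) / len(y)  # mean NLL gradient step

def policy(state):
    """Return the cloned action (0 or 1) for a 2-feature state vector."""
    s = np.append(state, 1.0)
    return int(1.0 / (1.0 + np.exp(-s @ w)) > 0.5)
```

On clear-cut states (e.g. very low engagement long after the last prompt) the cloned policy reproduces the expert's choice; more sophisticated apprenticeship methods additionally model the expert's goals rather than just imitating actions.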

Keywords: apprenticeship learning; automated helicopter flight; Automatic control; Data mining; Delay; education; Educational robots; expert teaching; Helicopters; Human-robot interaction; humanoid robots; Humans; Learning systems; mechanical control; robot teaching; Robotics and Automation; RUBI social robot; time 18 month to 24 month; timing
Authors: Ruvolo, P.; Whitehill, J.; Virnes, M.; Movellan, J.
URL: https://rubi.ucsd.edu/content/building-more-effective-teaching-robot-using-apprenticeship-learning

The RUBI Project: A Progress Report (2007, eng)
Authors: Tanaka, F.; Movellan, J.; Taylor, C.; Ruvolo, P.; Eckhardt, M.
URL: https://rubi.ucsd.edu/content/rubi-project-progress-report