First Integrated Companions

Heriot-Watt Showcases

Team Buddy

Video 1

Greet Task: In this first skeleton example video, the robot SARAH (Social Agent Robot to Aid Humans) performs the greet behaviour. The routine greet task requires the robot to sense a user entering the lab, approach the user while keeping a comfortable distance, and greet them. The competencies developed for this task are face detection and user proxemics control. The example demonstrates the integration of the generic software architecture developed for the LIREC project (D.9.3). The three layers of the architecture are explained below:

  1. Level 3 - FAtiMA: In this example FAtiMA (the mind) is a standalone version and does not perform any reasoning or planning; it has only one goal (to greet the user when the user is seen by the companion).

  2. Level 2 - CMION: Contains a set of proxy competencies that communicate with the SAMGAR modules, a blackboard that tracks the status of running competencies, competency manager rules for activating competencies, and a knowledge base that maintains the world model for FAtiMA.

  3. Level 1 - SAMGAR: Connects CMION to the SAMGAR modules, namely face detection, move (robot motion) and camera (video capture).

Figure 1 shows the flow and working of the greet example across the three layers of the architecture.
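To make the three-layer flow more concrete, the following is a minimal, hypothetical sketch of the greet example; the class and method names are illustrative only and do not correspond to the actual FAtiMA, CMION or SAMGAR APIs.

```python
# Hypothetical sketch of the greet-example flow across the three layers
# (illustrative only; these names are not the actual FAtiMA/CMION/SAMGAR APIs).

class Blackboard:
    """CMION-style blackboard tracking the status of running competencies."""
    def __init__(self):
        self.status = {}

    def update(self, competency, state):
        self.status[competency] = state
        print(f"[blackboard] {competency}: {state}")

class FaceDetectionModule:
    """Stand-in for the SAMGAR face-detection module."""
    def user_seen(self) -> bool:
        return True   # pretend a face was detected in the camera image

class GreetMind:
    """Stand-in for the standalone FAtiMA mind with its single 'greet' goal."""
    def decide(self, user_seen: bool):
        return "greet" if user_seen else None

def run_greet_example():
    blackboard = Blackboard()
    face = FaceDetectionModule()
    mind = GreetMind()

    # Level 1 (SAMGAR): sense the world through the face-detection module.
    seen = face.user_seen()
    blackboard.update("face_detection", "user_seen" if seen else "idle")

    # Level 3 (FAtiMA): the mind selects an action for its single goal.
    action = mind.decide(seen)

    # Level 2 (CMION): activate the proxy competencies that realise the action.
    if action == "greet":
        blackboard.update("proxemics_control", "approaching_user")
        blackboard.update("greet", "greeting_user")

run_greet_example()
```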

Video 2

Phone Carry Task: HWU has developed the first working prototype of an important task for the TeamBuddy: the phone-carry task shown in the video. The TeamBuddy (SARAH) is alerted by the phone ringing (detected using a light sensor), navigates autonomously (following ceiling patterns with the Stargazer sensor [1]) towards an assigned table where the user is seated, and stops near the user. The user picks up the phone from SARAH and answers the call.
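As a rough illustration of this task flow (a sketch only; the light-sensor threshold, state names and inputs are assumptions, not the HWU implementation), the behaviour can be read as a small state machine:

```python
# Hypothetical state-machine sketch of the phone-carry task
# (illustrative only; the threshold and state names are assumptions).

LIGHT_THRESHOLD = 0.7   # assumed level of the phone's ring-indicator light

def phone_carry_step(state: str, light_level: float,
                     at_user_table: bool, phone_on_tray: bool) -> str:
    """Advance the task: wait for the ring, drive to the user, wait for pickup."""
    if state == "idle" and light_level > LIGHT_THRESHOLD:
        return "navigate_to_user"      # phone ring detected via the light sensor
    if state == "navigate_to_user" and at_user_table:
        return "wait_for_pickup"       # Stargazer-guided navigation has finished
    if state == "wait_for_pickup" and not phone_on_tray:
        return "idle"                  # user has taken the phone and answered
    return state

# Example run: ring detected -> drive to the assigned table -> user picks up.
state = "idle"
state = phone_carry_step(state, 0.9, at_user_table=False, phone_on_tray=True)
state = phone_carry_step(state, 0.1, at_user_table=True, phone_on_tray=True)
state = phone_carry_step(state, 0.1, at_user_table=True, phone_on_tray=False)
print(state)   # "idle": the task is complete
```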

Video 3

HWU has developed a user-proxemics control competence for the robot: autonomous sensing and control of user proxemics based on face detection. This competence is also used in the greet task shown above; proxemic sensing and distance estimation are achieved by relating the area of the detected face's bounding box to the size of the captured image. The video illustrates how the TeamBuddy (SARAH) adjusts its distance from the user and maintains a threshold distance (80-100 cm), following proxemic studies [2, 3, 4].
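As an illustration of how such a competence can work (a simplified sketch assuming a pinhole-camera model; the focal-length and face-width constants are placeholders, not the values used on SARAH), the user's distance can be estimated from the width of the face bounding box and used to keep the robot inside the 80-100 cm band:

```python
# Hypothetical sketch of face-based proxemics control (not the HWU code).
# Assumes a pinhole-camera model:
#   distance ~ focal_length * real_face_width / pixel_face_width.

FOCAL_LENGTH_PX = 600.0      # assumed camera focal length in pixels
REAL_FACE_WIDTH_CM = 15.0    # assumed average face width
MIN_DIST_CM, MAX_DIST_CM = 80.0, 100.0   # comfort band from the proxemic studies

def estimate_distance_cm(face_box_width_px: float) -> float:
    """Estimate user distance from the width of the face bounding box."""
    return FOCAL_LENGTH_PX * REAL_FACE_WIDTH_CM / face_box_width_px

def proxemics_command(face_box_width_px: float) -> str:
    """Decide whether the robot should advance, retreat or hold position."""
    d = estimate_distance_cm(face_box_width_px)
    if d > MAX_DIST_CM:
        return "move_forward"   # user too far: approach
    if d < MIN_DIST_CM:
        return "move_backward"  # user too close: back off
    return "stop"               # within the comfortable 80-100 cm band

# Example: a 110-pixel-wide face detection maps to roughly 82 cm -> "stop".
print(proxemics_command(110.0))
```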

References:

[1] http://www.robotshop.com/hagisonic-stargazer-localization-system-3.html

[2] D. S. Syrdal, K. L. Koay, M. L. Walters, K. Dautenhahn, "A Personalised Robot Companion? - The Role of Individual Differences on Spatial Preferences in HRI Scenarios", Proceedings of the 16th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2007), Korea, 26-29, (2007).

[3] K. L. Koay, D. S. Syrdal, M. L. Walters, K. Dautenhahn, "Living with Robots: Investigating the Habituation Effect in Participants' Preferences During a Longitudinal Human-Robot Interaction Study", Proceedings of the 16th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2007), South Korea, 564-569, (2007).

[4] M. L. Walters, K. L. Koay, K. Dautenhahn, R. te Boekhorst, D. S. Syrdal, "Human Approach Distances to a Mechanical-Looking Robot with Different Robot Voice Styles", Proceedings of the 17th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2008), Germany, 707-712, (2008).

In the Wild

This video shows the Spirit of the Building, SARAH, in the "In the Wild" scenario. SARAH is displayed as a graphical character on a screen in the crush area of Heriot-Watt University's School of Computer Science, a casual meeting space for students. There she can have conversations with students and provide them with information. This video shows the first autonomous prototype of this system, using face detection to determine whether a user is present and to gaze at the user. Users interact with the system by sending text messages. When the companion sees no user it enters a sleep mode.

In this video the face tracking competency of the companion is shown. A user moves in front of the screen and the character follows the user with its gaze. The user is identified through face detection software.
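A minimal sketch of how face tracking can drive the character's gaze is given below (an assumed simplification, not the actual HWU implementation; the image width and field-of-view values are placeholders):

```python
# Hypothetical sketch: map the detected face position to a gaze angle
# for the on-screen character (illustrative; not the actual HWU code).

IMAGE_WIDTH_PX = 640          # assumed camera resolution
CAMERA_FOV_DEG = 60.0         # assumed horizontal field of view

def gaze_yaw_deg(face_centre_x_px: float) -> float:
    """Convert the face centre's horizontal pixel position to a yaw angle.

    0 degrees means the face is straight ahead; negative values look left,
    positive values look right, scaled by the camera's field of view.
    """
    normalized = (face_centre_x_px - IMAGE_WIDTH_PX / 2) / (IMAGE_WIDTH_PX / 2)
    return normalized * (CAMERA_FOV_DEG / 2)

def update_character(face_centre_x_px=None):
    """Gaze at the user if a face is detected, otherwise enter sleep mode."""
    if face_centre_x_px is None:
        return "sleep"                       # no user seen: sleep mode
    return f"gaze_yaw={gaze_yaw_deg(face_centre_x_px):.1f}deg"

print(update_character(480.0))   # face on the right -> positive yaw
print(update_character())        # no face -> sleep
```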

University of Hertfordshire Showcases

The video demonstrates an example episode of a daily activity that may occur in the robot house. The aim is to demonstrate one of the first integrated skeleton models of the LIREC companion in the UH (University of Hertfordshire) Robot House showcase. The video shows the companion autonomously detecting the user's activity (i.e. opening the fridge door) and approaching the user to offer physical assistance. The video also illustrates successful migration (abridged) of the companion between the Pioneer and AIBO robot embodiments. When the companion migrates to the AIBO robot, it acts as a social mediator to enhance social communication between the owner and a distant friend or family member. Competences integrated and illustrated in the video include autonomous navigation, sensing the user's location and possible activity, task-oriented migration (abridged) and accepting direct commands from the user.

INESC-ID First Integrated Companions – My Friend Showcase

The My Friend showcase includes scenarios in which the companion's behaviour is fully autonomous and endowed with social mechanisms (e.g. affect sensitivity, theory of mind) that allow the companion to react and respond to the user in an intelligent and socially appropriate manner. By creating simple tasks or games that users and the companion can perform or play together, we will evaluate whether social and affective relationships can be established and maintained over time.

Video 1: Migration in the Game Companion Scenario

In this scenario the user is able to play chess with the companion. The match can be played at home using a real chessboard, in which case the companion is a robot. The companion is able to express emotions during the match. The emotion expressed is determined by the companion's expectation of how the match is going to progress (from its perspective) and how it actually has progressed.

While playing against a robotic companion using a real chessboard is an engaging experience, the user might need to go to another place. In this situation the companion migrates to a mobile phone, taking with it its memory of past interactions with the user and the current setup of the chessboard. This way the user is able to resume the interrupted chess match on the mobile phone, with a virtual version of the chessboard and the companion. When the user comes back home the companion is able to migrate back to the robot.
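As an illustration of the kind of state the companion has to carry across embodiments (a hypothetical sketch; the payload format and field names are assumptions, not the actual INESC-ID implementation), the interaction memory and the current board position could be serialised on the robot and restored on the phone as follows:

```python
# Hypothetical sketch of migration state transfer (illustrative only;
# the data format and field names are assumptions).
import json

def pack_migration_state(interaction_memory, board_fen, side_to_move):
    """Bundle what the companion needs to carry to the new embodiment."""
    return json.dumps({
        "memory": interaction_memory,   # past interactions with the user
        "board": board_fen,             # current chessboard setup (FEN string)
        "side_to_move": side_to_move,
    })

def unpack_migration_state(payload):
    """Restore the companion's state on the receiving embodiment (the phone)."""
    state = json.loads(payload)
    return state["memory"], state["board"], state["side_to_move"]

# Example: the robot packs its state, the phone unpacks it and resumes the match.
payload = pack_migration_state(
    interaction_memory=["user prefers aggressive openings"],
    board_fen="rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1",
    side_to_move="white",
)
memory, board, side = unpack_migration_state(payload)
print(board, side)
```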

Video 2 – iCat’s expressive behaviours

This video shows some of the expressive behaviours the iCat displays while playing chess with the user. The iCat's affective state is influenced by the state of the chess game and contains two main parts: emotional reactions and mood [1]. Emotional reactions are triggered after every move played by the user and, despite being of short duration, they are quite explicit. They are based on an anticipatory mechanism named the emotivector [2], which returns one of nine affective signals resulting from the mismatch between an expected and a sensed value. Mood, on the other hand, acts as a background affective state, less intense but always present. It is represented by a valence variable with polarity (positive or negative) and intensity.
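To make the two components more concrete, the following is a heavily simplified, hypothetical sketch rather than the actual emotivector model of [2]: the nine signals are approximated here by crossing the expected change with the sensed outcome, and mood is modelled as a decaying valence nudged by each reaction.

```python
# Heavily simplified, hypothetical sketch of an anticipation-based affective
# state (illustrative only; not the actual emotivector model from [2]).

MOOD_DECAY = 0.9   # assumed: mood slowly fades back towards neutral

def emotional_reaction(expected: float, sensed: float, previous: float) -> str:
    """Pick one of nine illustrative signals from the mismatch between the
    expected and the sensed value of the game state."""
    expectation = "up" if expected > previous else "down" if expected < previous else "flat"
    outcome = "better" if sensed > expected else "worse" if sensed < expected else "as_expected"
    return f"{expectation}/{outcome}"   # 3 x 3 = 9 possible signals

def update_mood(mood: float, sensed: float, expected: float) -> float:
    """Mood: a background valence (sign + intensity) nudged by each move."""
    return MOOD_DECAY * mood + (sensed - expected)

# Example: the iCat expected its position to improve, but the user's move
# made the game state worse than predicted.
print(emotional_reaction(expected=0.4, sensed=-0.2, previous=0.1))  # up/worse
mood = update_mood(mood=0.0, sensed=-0.2, expected=0.4)
print(mood)  # negative valence
```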

Video 3 – iCat observing the game of two players

To evaluate the influence of different empathic behaviours on users' perceptions of the companions, we developed a scenario where the iCat robot observes the game of two humans playing chess, reacting emotionally and commenting on their moves. The iCat treats the two players differently: it exhibits empathic behaviours towards one of them (the companion) and behaves in a neutral way towards the other player (the opponent). These behaviours are reflected in the robot's facial expressions and utterances, and were inspired by theoretical work on the characteristics of empathic teachers [3].

The results of a preliminary experiment suggest that participants with whom the iCat behaved in an empathic manner considered the robot friendlier. We are currently analysing the results of another experiment conducted with 40 subjects. This video was recorded during one of the evaluations, and contains parts of the interaction between users and the iCat.

References:

[1] Leite, I., Pereira, A., Martinho, C., and Paiva, A. Are Emotional Robots More Fun to Play With? In Proceedings of IEEE RO-MAN 2008 Conference. Munich, Germany, 2008.

[2] Martinho, C. and Paiva, A. Using Anticipation to Create Believable Behaviour. In Proceedings of the 21st National Conference on Artificial Intelligence and the 18th Innovative Applications of Artificial Intelligence Conference, Boston, MA, USA. AAAI Press, (2006), 175-180.

[3] Cooper, B., Brna, P. and Martins, A. Effective affective in intelligent systems - building on evidence of empathy in teaching and learning. In A. Paiva, editor, IWAI, volume 1814 of Lecture Notes in Computer Science, pages 21-34. Springer, 1999.

Wrocław University of Technology

The video shows a simple example of the integration of autonomous navigation and migration functionalities implemented on the FLASH balancing platform and the robotic face Samuel. Lucas is working at his computer while his agent resides on Samuel. The agent manifests its presence on the robot by displaying Lucas's photo on the screen. When the agent obtains information about a parcel waiting for Lucas in the institute secretary's office, it informs Lucas about this fact. The agent offers to fetch the parcel for Lucas, which Lucas accepts. Lucas's agent migrates to FLASH and navigates autonomously to the secretary's office. Meanwhile, Samuel is inactive because the agent is absent. After obtaining the parcel, FLASH autonomously navigates back to Lucas's desk. The agent then migrates back to Samuel and informs Lucas that the parcel has arrived. Lucas takes the parcel from FLASH. Finally, FLASH navigates back to its initial position.

Photos

www.flickr.com