Middle Layer Incorporating Software Robot and Mobile Robot

2006 IEEE Conference on Systems, Man, and Cybernetics October 8-11, 2006, Taipei, Taiwan

Tae-Hun Kim, Seung-Hwan Choi, and Jong-Hwan Kim

Abstract— One of the key components of a ubiquitous robot (Ubibot) is the software robot (Sobot), which can communicate with the embedded robot (Embot) and the mobile robot (Mobot). Sobot is a virtual robot that can move to any place or connect to any device through a network in order to overcome spatial limitations. Embot can sense the surroundings, interpret the context of the environment, and communicate with Sobot and Mobot. A Mobot, on the other hand, provides integrated mobile services, though it has spatial limitations. To incorporate Sobot, Embot, and Mobot reliably into a Ubibot, a middle layer is needed to arbitrate the different protocols among them. This paper focuses on incorporating Sobot and Mobot to provide seamless and context-aware services to human beings. The basic concept and structure of the middle layer are presented, and the effectiveness of the middle layer for Sobot and Mobot is demonstrated through real experiments.

I. INTRODUCTION

The ubiquitous robot, Ubibot, is composed of Sobot, Embot, and Mobot. Sobot is a virtual robot that can move to any place through networked devices. Embot is implanted in the environment or on Mobots and gathers information from various sensors. Mobot is a mobile robot that can perform behaviors in the real world using real actuators [1], [2]. Ubibot represents the cutting edge of technology with the advent of the ubiquitous era. To implement the concept of "ubiquitous," the importance of Sobot is increasing. It can reside in any desktop computer, PDA (Personal Digital Assistant), or mobile phone, so human beings can interact with it very easily. Since Sobot is a software-based robot, its graphical appearance can easily be changed according to preferences and thus made human friendly. Sobot can therefore be used for entertainment, education, psychological treatment, and so on.

In HRI (Human-Robot Interaction) there are four paradigms, as described in [3]: a robot can be viewed as a tool, a cyborg extension, a sociable partner, or an avatar. It can be used as a tool to perform a simple task. It can be regarded as a cyborg extension, which can be a part of the human body for disabled people or for enhancing the physical abilities of human beings. It can also be a partner with sociable abilities such as emotions and intelligence. In this

Authors are with EECS Department, KAIST, Guseong-dong, Yuseong-gu, Daejeon-shi, 305-701, Republic of Korea (e-mails: {thkim, shchoi, johkim} @rit.kaist.ac.kr). This work was supported by the Ministry of Information and Communications, Korea, under the Information Technology Research Center (ITRC) Support Program.

1-4244-0100-3/06/$20.00 ©2006 IEEE

paradigm, a robot must be able to interact with people by understanding what they want and by providing social cues that let them understand its intention. Lastly, human beings can communicate with people at a remote site by using a remote robot as an avatar, or a software robot that represents them as if they were at the remote site. A software robot-based "detect and alert" system called NewsAlert [4] delivers internet alerts to the desktops of managers and executives in the form of a personalized electronic newspaper. Verbots [5] are avatars combining artificial intelligence, natural language processing, and creativity; they allow users to create an engaging virtual personality that can talk with users while changing the agent's voice, pitch, and speed. However, since software-based robots live in a virtual environment, they have physical limitations: they cannot serve human beings in real situations, interact physically and naturally with them, or perform their own physical behaviors. To overcome these limitations, Sobot must be able to use a Mobot, which has a physical body, actuators, and sensors.

This paper proposes a middle layer for the incorporation of Sobot and Mobot. The middle layer is an interface between Sobot and Mobot; its key components are the sensor mapper and the behavior mapper. The sensor mapper helps Sobot get physical sensor information from Mobot, so that Sobot in the virtual environment can use physical information. The behavior mapper helps Sobot make physical behaviors using Mobot in the real environment, so that Sobot can show physical behaviors and interact physically with human beings. This paper also presents a control arbiter. Since a user, Sobot, and Mobot itself can all try to control a single body (the Mobot) at the same time, arbitration of control commands is needed; the control arbiter performs this arbitration.

This paper is organized as follows: Section II presents the middle layer. Sections III and IV propose the sensor mapper and the behavior mapper in detail. Section V describes why the control arbiter is essential. In Section VI, Sobot gets physical information from Mobot and makes its internal emotion expressions through Mobot, and the importance of the control arbiter is demonstrated. Finally, concluding remarks follow in Section VII.

II. MIDDLE LAYER

Sobot has virtual sensors and makes virtual behaviors in a virtual environment. With its virtual sensors it can get virtual information on time, temperature, light intensity, and so on, and it can show behaviors such as dancing or walking using a virtual motor system. In contrast, Mobot has real sensors and shows behaviors in the real environment: it obtains information from real sensors and executes behaviors through a real motor system.

Fig. 1 Middle layer (sensor mapper and behavior mapper) is located between Sobot in the virtual environment and Mobot in the real environment

The middle layer is located between the virtual environment and the real environment, as shown in Fig. 1. It consists of a sensor mapper and a behavior mapper. The sensor mapper helps Sobot get real information from the real sensors attached to Mobot, and the behavior mapper enables Sobot to make real behaviors using Mobot's real actuators. As a result, the middle layer enables Sobot to provide physical services to human beings through Mobot.
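The structure of Fig. 1 can be summarized in code. The sketch below is one possible, minimal reading of the middle layer as two mappers behind a single interface; all class and attribute names are assumptions rather than the paper's implementation (the mappers are fleshed out in Sections III and IV).

```python
# Illustrative skeleton of the middle layer of Fig. 1: two mappers
# behind a single interface between Sobot and Mobot.

class SensorMapper:
    """Maps real (Mobot) sensor readings onto virtual (Sobot) sensors."""
    def register(self, virtual_sensors, physical_sensors):
        self.virtual, self.physical = virtual_sensors, physical_sensors

class BehaviorMapper:
    """Maps virtual (Sobot) behavior commands onto real (Mobot) behaviors."""
    def register(self, virtual_behaviors, physical_behaviors):
        self.virtual, self.physical = virtual_behaviors, physical_behaviors

class MiddleLayer:
    """Sits between the virtual and the real environment."""
    def __init__(self):
        self.sensor_mapper = SensorMapper()
        self.behavior_mapper = BehaviorMapper()

    def attach(self, sobot, mobot):
        # Registration happens when Sobot moves to Mobot (Sections III-A, IV-A).
        self.sensor_mapper.register(sobot.virtual_sensors, mobot.real_sensors)
        self.behavior_mapper.register(sobot.virtual_behaviors,
                                      mobot.real_behaviors)
```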

III. SENSOR MAPPER

The role of the sensor mapper is to map physical sensor information from Mobot to virtual sensor information of Sobot when Sobot moves to Mobot. There are three procedures.

A. Sensor registration

When Sobot moves to Mobot, Sobot registers its virtual sensor information with the sensor mapper, and Mobot likewise registers its real sensor information. The data structure of sensor information is shown in Fig. 2; each item is described in the following.

Fig. 2 Data structure of sensor information

1) ID: Since Sobot and Mobot have various sensors, each sensor must be identified so that the sensor mapper can use it. Even when several sensors of the same kind are attached to a Mobot, they must still be distinguished, so ID numbers are assigned to all sensors.

2) Type: This defines the basic function of a sensor. This paper uses the 'DISTANCE' and 'VISION' sensor types: a 'DISTANCE' sensor measures the distance between the robot and an obstacle, and a 'VISION' sensor performs facial recognition, with which a robot can recognize its master. The set of sensor types can be extended with types such as 'TEMPERATURE', 'PRESSURE', and so on.

3) Location: Readings from some sensors, such as a temperature sensor, do not depend strongly on the mounting location, because the temperature varies little within the robot's vicinity. Readings from other sensors, such as an ultrasonic sensor, depend strongly on it. For example, if an ultrasonic sensor mounted on the front of a robot measures 10 m, an obstacle may be 10 m away in front of the robot; if a rear-mounted sensor returns the same value, the obstacle may be 10 m behind the robot. The sensor location is therefore very important.

4) Value: This is the physically measured value from the sensor, in MKS units.

B. Sensor mapping

Once the sensor mapper has obtained the sensor information of Sobot and Mobot, it starts to map physical sensor information onto virtual sensor information. The sensor mapper has two comparators, a type comparator and a location comparator, as shown in Fig. 3. The inputs of the type comparator are one virtual sensor type and one physical sensor type. If the two types do not match, the comparison is terminated and the type comparator moves on to another pair of sensor types. If they match, the output of the type comparator is 'True', which is used to validate the values in the mapping table (Table I), and the type comparator then activates the location comparator, whose inputs are the virtual and physical sensor locations. If the two locations match, the output of the location comparator is 'True'; otherwise it is 'False'. In this way, the sensor mapper compares virtual and physical sensors in sequence, and once it completes all possible comparisons, a mapping table is created.
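As a concrete illustration of the registration record and the two-comparator mapping pass, the following is one possible sketch of Fig. 2 and Fig. 3 in Python. The `SensorInfo` fields mirror Fig. 2; the function name, field names, and dictionary-based location encoding are assumptions, not the authors' implementation.

```python
# Sketch of the registration record (Fig. 2) and the two-comparator
# mapping pass (Fig. 3). Names and encodings are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorInfo:
    id: int
    type: str                  # e.g. 'DISTANCE', 'VISION', 'TEMPERATURE'
    location: Optional[dict]   # e.g. {'radius': 0.15, 'angle': 45}
    value: object = None       # latest measured value, in MKS units

def build_mapping_table(virtual, physical):
    """One table entry per physical sensor, as in Table I."""
    table = []
    for p in physical:
        entry = {'physical_id': p.id, 'virtual_id': None,
                 'valid_type': False, 'valid_location': False}
        for v in virtual:
            if p.type != v.type:   # type comparator: mismatch, try next pair
                continue
            entry['virtual_id'] = v.id
            entry['valid_type'] = True
            # Location comparator. Here only the angle is compared, as in
            # the experiment of Section VI-A; a full match ends the search,
            # while a type-only match keeps searching for a better one.
            if (p.location and v.location
                    and p.location.get('angle') == v.location.get('angle')):
                entry['valid_location'] = True
                break
        table.append(entry)
    return table
```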

Fig. 3 Sensor Mapper and Behavior Mapper (registration of virtual and physical sensors and behaviors, the type, location, and behavior comparators, and the resulting mapping tables)

The mapping table (Table I) comprises three parts.

TABLE I
SENSOR MAPPING TABLE

Physical sensor ID   Mapped virtual sensor ID   Valid type   Valid location
1                    3                          True         True
2                    1                          True         False
3                    None                       False        False

1) Mapped virtual sensor ID: the ID of the virtual sensor mapped to a physical sensor ID. The entry may be 'None' when there is no virtual sensor with the same function as the physical sensor; in that case Sobot cannot get physical information from that sensor.

2) Valid type: 'True' if the output of the type comparator is true. If this entry is 'True', Sobot can get physical information from that sensor.

3) Valid location: 'True' if the output of the location comparator is true, meaning that both the type and the location of the physical sensor are the same as those of the virtual sensor. Sobot can then get the physical sensor information.

C. Sensor data transfer

After the sensor mapper creates the mapping table, the table is not updated anymore; the sensor mapper makes no further comparisons and simply looks up the table whenever it needs to transfer physical sensor information to Sobot.

If there is a mapped virtual sensor ID and both the valid type and valid location entries are 'True', the sensor mapper transfers the measured value and the location from the physical sensor to the mapped virtual sensor. This happens as soon as one of the physical sensors makes a new measurement, so Sobot can get real information from the real environment. If, however, the valid type entry is 'True' but the valid location entry is 'False', the mapper transfers only the measured value and sends 'NULL' as the location information, because wrong location information would make the physical sensor information wrong, and wrong information causes serious problems. In this case, Sobot should use only sensors whose locations are not important, such as a temperature sensor or a gas sensor.
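The transfer rule then reduces to a pure table lookup. The sketch below continues the illustrative names from the mapping sketch above; `virtual_sensors` is assumed to be a dict keyed by virtual sensor ID, which is not specified in the paper.

```python
# Sketch of Section III-C: forward a new physical measurement to the
# mapped virtual sensor, or suppress the location when it is invalid.
def transfer_measurement(physical_sensor, table, virtual_sensors):
    for entry in table:
        if entry['physical_id'] != physical_sensor.id:
            continue
        if entry['virtual_id'] is None or not entry['valid_type']:
            return  # no usable mapping: Sobot cannot see this sensor
        target = virtual_sensors[entry['virtual_id']]
        target.value = physical_sensor.value
        # A wrong location would corrupt the reading, so 'NULL' (None)
        # is sent instead of the physical location when it is invalid.
        target.location = (physical_sensor.location
                           if entry['valid_location'] else None)
        return
```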

IV. BEHAVIOR MAPPER

Sobot can dance and move in a virtual environment, but this is visible only on the computer monitor, so human beings see only graphical behaviors. In other words, Sobot has physical limitations in providing physical services and making its own physical behaviors. To overcome this limitation, Sobot has to use Mobot, which has a physical motor system. The role of the behavior mapper is to map a virtual behavior to a physical behavior which Mobot can make.

The type of Sobot can differ from the type of Mobot, because Sobot can move to various types of Mobot. Behaviors of Mobot depend on the Mobot and its hardware configuration, so the behaviors of Sobot and Mobot cannot be exactly the same, and it is also impossible for Sobot to control every actuator in Mobot. To solve this problem, Mobot must have abstract modular behaviors: each behavior module independently controls the actuators needed for that behavior using its own control methods, as shown in Fig. 4. Sobot thus does not have to know about every actuator attached to Mobot; it only needs to know the behavior module interface. This paper therefore assumes that the internal mechanism of a behavior module is abstracted and strongly dependent on the developer of the Mobot.

Fig. 4 Sobot Behaviors and Abstracted Mobot Behaviors
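An abstracted behavior module in the sense of Fig. 4 might look as follows. The parsing of the information string and the `GoForward` internals are assumptions used only to illustrate that Sobot sees nothing below the module interface.

```python
# Sketch of an abstracted behavior module (Fig. 4): Sobot calls run(),
# and the module drives whichever actuators it needs with its own methods.
class BehaviorModule:
    name = ''
    def run(self, information: str):
        raise NotImplementedError

class GoForward(BehaviorModule):
    name = 'GO FORWARD'
    def run(self, information):
        # e.g. information = 'DIRECTION: 45, DISTANCE: 10, SPEED: 0.2'
        params = {key.strip(): float(val)
                  for key, val in (item.split(':')
                                   for item in information.split(','))}
        # ... drive the wheel actuators using the module's own kinematics
        # and control method; these internals are hidden from Sobot.
```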

A. Behavior registration

The behavior mapper must have virtual and physical behavior information from Sobot and Mobot in order to map virtual behaviors to physical behaviors. Therefore, when Sobot moves to Mobot, both robots register their behavior information with the behavior mapper, and a data structure of behavior information is needed for this registration.

Fig. 5 Data structure of behavior information: (a) basic data structure (Behavior, Information); (b) 'GO FORWARD' behavior with 'DIRECTION: 45, DISTANCE: 10, SPEED: 0.2'; (c) 'EMOTION' behavior with 'Happy'

The data structure of behavior information consists of Behavior and Information, as shown in Fig. 5 (a).

1) Behavior: the name of the behavior module, a string such as 'GO FORWARD', 'GO BACKWARD', or 'EMOTION'. Since it is used for comparison, it must be pre-defined.

2) Information: the information that the behavior module uses. For example, in Fig. 5 (b), the 'GO FORWARD' behavior carries the direction Mobot has to face, the distance it has to travel in that direction, and the speed it has to keep. In Fig. 5 (c), the 'EMOTION' behavior carries the emotion to be expressed. As described earlier, since Mobot's behavior modules are abstracted and dependent on the Mobot, Sobot cannot know how Mobot makes an emotion expression in the real world.
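In code, the record of Fig. 5 reduces to a module name plus an information string. The dataclass below is an illustrative assumption, with the field values taken from the paper's examples.

```python
# Sketch of the behavior information structure of Fig. 5.
from dataclasses import dataclass

@dataclass
class BehaviorInfo:
    behavior: str      # pre-defined module name, used for comparison
    information: str   # module-specific parameters

go_forward = BehaviorInfo('GO FORWARD',
                          'DIRECTION: 45, DISTANCE: 10, SPEED: 0.2')
emotion = BehaviorInfo('EMOTION', 'Happy')
```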

B. Behavior mapping

The behavior mapper uses the registered behavior information to map virtual behaviors to physical behaviors. One of its key components is the behavior comparator shown in Fig. 3, which takes two inputs: a virtual Behavior from Sobot and a physical Behavior from Mobot. If the two Behaviors are the same, the behavior mapper maps the virtual Behavior onto the physical Behavior in the behavior mapping table. Once the behavior comparator completes all possible comparisons, a behavior mapping table is created (Table II).

TABLE II
BEHAVIOR MAPPING TABLE

Virtual Behavior   Mapped Physical Behavior
EMOTION            EMOTION
WANDERING          WANDERING
GO FORWARD         GO FORWARD

C. Behavior information transfer

Whenever Sobot wants to make a virtual behavior happen in the real environment, it sends a command to the behavior mapper, which looks up the behavior mapping table. If there is a mapped physical behavior for the virtual behavior, the behavior mapper transfers the Information to the corresponding behavior module in Mobot, and Mobot makes the physical behavior using that module. The behavior module has its own kinematics and control methods for that behavior. In this way, Sobot can realize its virtual behaviors as physical behaviors through Mobot in the real environment.
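A minimal sketch of the name-matching comparator and the command transfer, reusing the illustrative `BehaviorInfo` record above; `mobot.run_module` is an assumed interface, not the paper's API.

```python
# Sketch of Sections IV-B and IV-C: map behaviors by name, then forward
# commands only when a mapped physical behavior module exists.
def build_behavior_table(virtual_behaviors, physical_behaviors):
    physical_names = {b.behavior for b in physical_behaviors}
    return {v.behavior: (v.behavior if v.behavior in physical_names else None)
            for v in virtual_behaviors}

def send_behavior(command, table, mobot):
    mapped = table.get(command.behavior)
    if mapped is not None:
        # The module applies its own kinematics and control methods.
        mobot.run_module(mapped, command.information)
```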

V. CONTROL ARBITER

In the real environment, real behavior can be made only by the single hardware body of Mobot. Since Sobot, Mobot, and the user all try to control this single body for their own purposes, control collisions can occur among them, so arbitration of control is needed. This paper presents a control arbiter that arbitrates the controls from Sobot, Mobot, and a user with priority-based arbitration, as shown in Fig. 6. The priority of Mobot control is assumed to be the highest, because Mobot owns the hardware body and has to protect itself from dangerous situations such as collisions and falling down; to enable this kind of protection, a reactive system with a very short response time and high robustness is generally adopted to control Mobot [7]. The priority of user control is the second highest: the user can control Mobot for his or her own purposes, for example to get remote images from a USB camera attached to Mobot. Finally, the priority of Sobot control is the lowest, because Sobot basically makes behaviors for users; if the priority of user control were lower than that of Sobot control, the user could not be satisfied with Sobot.

Fig. 6 Control arbiter arbitrates commands from Sobot, Mobot, and a user.

Based on these priorities, the control arbiter can arbitrate the controls from Sobot, Mobot, and a user without any control collision. If control commands from Sobot, Mobot, and a user are invoked at the same time, the command with higher priority suppresses the commands with lower priority, so only one control command is delivered to the behavior module at a time.
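The priority rule reduces to a one-line selection. In this sketch the numeric priorities and the command encoding are illustrative assumptions that mirror the ordering above (Mobot above user above Sobot).

```python
# Sketch of priority-based arbitration (Fig. 6): the highest-priority
# command suppresses the others; only one reaches the behavior module.
PRIORITY = {'MOBOT': 3, 'USER': 2, 'SOBOT': 1}

def arbitrate(commands):
    return max(commands, key=lambda c: PRIORITY[c['source']], default=None)

# e.g. a user 'GO FORWARD' colliding with Mobot's avoidance (Section VI-C):
winner = arbitrate([{'source': 'USER',  'behavior': 'GO FORWARD'},
                    {'source': 'MOBOT', 'behavior': 'AVOID OBSTACLE'}])
# -> the Mobot command wins, so the obstacle is avoided
```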

VI. EXPERIMENT

Three kinds of experiments are carried out to demonstrate the effectiveness of the proposed scheme: one on the sensor mapper, one on the behavior mapper, and one on the control arbiter. A Sobot called Rity (Fig. 7 (a)) and a Mobot called Mybot (Fig. 7 (b)) are used. Rity is a dog-type pet robot in a virtual environment; it has 4 legs and 6 kinds of virtual sensors, can perform 77 virtual behaviors, and can move among networked devices [8], [9]. Mybot is a wheeled mobile robot with 6 ultrasonic sensors and 1 vision sensor; it is driven on a differential drive platform, has 2 arms to express emotion, and has 5 behavior modules for 5 emotions.

Fig. 7 Sobot called Rity (a) and Mobot called Mybot (b)

A. Sensor mapper

In this experiment, Rity gets physical information from the real sensors attached to Mybot. After Rity moves to Mybot, both Rity and Mybot register their sensor information with the sensor mapper, which then creates the mapping table. The distributions of physical and virtual sensor IDs are shown in Fig. 8; sensors 1 to 6 are ultrasonic sensors. Tables III and IV show the data structures used for mapping. In this paper, the vertical locations of sensors are not considered, and the locations of virtual sensors are assumed to be unimportant while their directions are important to Rity. In this experiment, the direction of a virtual sensor is the angle in the location entry, so the sensor mapper uses only angles for location comparison.

Fig. 8 Sensors and their IDs are distributed.

TABLE III
DATA STRUCTURES OF MYBOT

ID   Type       Location                    Value
1    DISTANCE   Radius: 0.15, Angle: 180    •
2    DISTANCE   Radius: 0.15, Angle: 135    •
3    DISTANCE   Radius: 0.15, Angle: 90     •
4    DISTANCE   Radius: 0.15, Angle: 45     •
5    DISTANCE   Radius: 0.15, Angle: 0      •
6    DISTANCE   Radius: 0.15, Angle: 270    •

TABLE IV
DATA STRUCTURES OF RITY

ID   Type       Location                    Value
1    DISTANCE   Radius: 0.15, Angle: 180    •
2    DISTANCE   Radius: 0.15, Angle: 135    •
3    DISTANCE   Radius: 0.15, Angle: 90     •
4    DISTANCE   Radius: 0.15, Angle: 45     •
5    DISTANCE   Radius: 0.15, Angle: 0      •
6    DISTANCE   Radius: 0.15, Angle: 315    •
7    DISTANCE   Radius: 0.15, Angle: 270    •
8    DISTANCE   Radius: 0.15, Angle: 225    •

Table V shows the mapping table created by the sensor mapper.


TABLE V
MAPPING TABLE IN EXPERIMENT

Physical Sensor ID   Mapped Virtual Sensor ID   Valid Type   Valid Location
1                    1                          True         True
2                    2                          True         True
3                    3                          True         True
4                    4                          True         True
5                    5                          True         True
6                    7                          True         True
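Under the angle-only location comparison, the earlier mapping sketch would reproduce Table V from the data of Tables III and IV, for instance mapping physical sensor 6 (angle 270) to virtual sensor 7. This is an illustration using the assumed `SensorInfo` and `build_mapping_table` names from Section III, not the authors' code.

```python
# Reproducing Table V from Tables III and IV with the earlier sketch.
mybot_sensors = [SensorInfo(i, 'DISTANCE', {'radius': 0.15, 'angle': a})
                 for i, a in [(1, 180), (2, 135), (3, 90),
                              (4, 45), (5, 0), (6, 270)]]
rity_sensors = [SensorInfo(i, 'DISTANCE', {'radius': 0.15, 'angle': a})
                for i, a in [(1, 180), (2, 135), (3, 90), (4, 45),
                             (5, 0), (6, 315), (7, 270), (8, 225)]]
table = build_mapping_table(rity_sensors, mybot_sensors)
# table[5] -> {'physical_id': 6, 'virtual_id': 7,
#              'valid_type': True, 'valid_location': True}
```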

Fig. 9 Rity is getting physical information from real sensors (the markers indicate the locations of the real obstacle and of the ultrasonic sensors).

In Fig. 9, Rity is getting physical information from real ultrasonic sensors, and the virtual environment displays the corresponding obstacles based on that physical information.

B. Behavior mapper

This experiment shows how Rity makes emotion expressions in the real world using Mybot. Rity has 5 emotions: happiness, sadness, neutrality, fear, and anger. Mybot has 5 behavior modules, one for each emotion. The behavior mapper maps the virtual emotion expressions to the real behavior modules. Table VI shows the data structures used for behavior mapping, and Table VII shows the behavior mapping table created by the behavior mapper.

TABLE VI
DATA STRUCTURES FOR BEHAVIOR MAPPER OF RITY AND MYBOT

Behavior   Information
EMOTION    Neutral
EMOTION    Happy
EMOTION    Sad

TABLE VII
BEHAVIOR MAPPING TABLE

Virtual Behavior   Mapped Physical Behavior
EMOTION            EMOTION

Whenever Rity wants to make an emotion expression, it sends a command to the behavior mapper, which looks up the behavior mapping table. If there is a mapped physical behavior in the table, the behavior mapper provides the physical behavior module with information about which expression should be made, and the module makes the corresponding emotion expression.

Fig. 10 Rity shows emotion expression behaviors in the real world: (a) neutral, (b) happy, (c) sad.

Fig. 10 shows only 3 of the emotions: neutral, happy, and sad. Mybot is expressing the emotion of Rity using its two arms in the real world; that is, Rity is expressing its emotion through Mybot.

C. Control arbiter

This experiment demonstrates the role of the control arbiter: arbitrating control commands from Rity, Mybot, and a user. In Fig. 11 (a), the control arbiter is not activated, and a user commands Mybot to go forward even though there is an obstacle in front of it. A few seconds later, Mybot activates a collision avoidance behavior module to avoid the obstacle. Two control commands then try to move a single hardware robot at the same time; the commands collide, and there is no certainty about which of them is delivered to the motor system. As a result, Mybot continues to move forward and collides with the obstacle. In Fig. 11 (b), the situation is the same except that the control arbiter is activated. When the distance between Mybot and the obstacle falls below the safety limit, Mybot invokes a collision avoidance command. At this instant two control commands are active with their own objectives, but they are arbitrated by the control arbiter based on their priorities. Since the priority of Mybot control is higher than that of user control, the Mybot control command suppresses the user control command; the Mybot command is delivered to the motor system, and Mybot avoids the obstacle.

Fig. 11 Demonstrations with and without control arbiter: (a) control arbiter is not activated; (b) control arbiter is activated.

VII. CONCLUSION

Sobots, which are key components of Ubibot, are becoming common and their number is increasing. They have various appearances and various roles, such as entertainment and education, and they can do many things for human beings. Unfortunately, their common limitation is that they exist and behave only in virtual environments. In other words, they have no physical bodies and cannot act in the real environment, so they cannot provide physical services to human beings or interact with them physically; as social and situated robots, they cannot perform their social roles and duties. To overcome this problem, this paper presented a middle layer for incorporating Sobot and Mobot when Sobot moves onto Mobot. The presented middle layer consists of a sensor mapper and a behavior mapper: the sensor mapper helps Sobot get physical information from real sensors, and the behavior mapper helps Sobot realize its virtual behaviors as physical ones through Mobot in the real world. The experiments demonstrated that the middle layer properly incorporates Sobot and Mobot.

When the middle layer is applied, commands from Sobot and Mobot can collide with each other, and since a user can also control Mobot, a third control command enters the same contention. Such collisions of control commands cause serious problems, so this paper also presented a control arbiter to prevent them and demonstrated its importance with experiments.

In the coming ubiquitous era, Ubibots will be around human beings, and most interactions will be made through Sobot because of its friendliness and high accessibility. Sobot therefore requires the ability to provide physical services to human beings, and hence should be able to control Mobot. To this end, the presented middle layer will play an important role as an interface between Sobot in the virtual environment and Mobot in the real environment.

REFERENCES

[1] Jong-Hwan Kim, Yong-Duk Kim, and Kang-Hee Lee, "The Third Generation of Robotics: Ubiquitous Robot," in Proceedings of the International Conference on Autonomous Robots and Agents, Palmerston North, New Zealand, Dec. 2004 (keynote speech paper).
[2] Jong-Hwan Kim, "Ubiquitous Robot," in Proceedings of the 8th Fuzzy Days International Conference, Dortmund, Germany, Sep. 2004 (keynote speech paper); also in B. Reusch (Ed.), Computational Intelligence, Theory and Applications, Springer, pp. 451-459, 2004.
[3] C. Breazeal, "Social Interactions in HRI: The Robot View," IEEE Trans. on Systems, Man, and Cybernetics, Part C, vol. 34, no. 2, pp. 181-186, May 2004.
[4] D. King and K. Jones, "Competitive intelligence, software robots and the Internet: the NewsAlert prototype," in Proceedings of the 28th Annual Hawaii International Conference on System Sciences, vol. 4, pp. 624-631, 1995.
[5] Verbot, http://www.verbots.com.
[6] R. C. Arkin, Behavior-Based Robotics, MIT Press, pp. 205-234, 1998.
[7] Y.-D. Kim and J.-H. Kim, "Implementation of Artificial Creature based on Interactive Learning," in Proceedings of the FIRA Robot World Congress, pp. 369-374, 2002.
[8] Y.-D. Kim, Y.-J. Kim, and J.-H. Kim, "Behavior Selection and Learning for Synthetic Character," in Proceedings of the IEEE Congress on Evolutionary Computation, pp. 898-903, 2004.
