MINDBOT: DESIGN AND IMPLEMENTATION OF A MIND-CONTROLLED EDUCATIONAL ROBOT TOY FOR DISABLED CHILDREN
Sibar J. Khalid a*, Ismael A. Ali a
a CCNP Research Lab, Department of Computer Science, Faculty of Science, University of Zakho, Zakho, Kurdistan Region, Iraq - sibar.khalid@stud.uoz.edu.krd - ismael.ali@uoz.edu.krd
Received: 26 Aug., 2023 / Accepted: 22 Oct., 2023 / Published: 1 Jan., 2024. https://doi.org/10.25271/sjuoz.2024.12.1.1195
ABSTRACT:
The mindBot robot is a new educational robot toy that can be controlled by brain signals and voice commands. It was evaluated with children with disabilities, as well as healthy children, as its potential users. The most significant challenge was adjusting the adult-sized Emotiv Insight electroencephalogram (EEG) headset on the children's heads. Despite these challenges, mindBot is a promising technology that can be both fun and educational for disabled children. The 11 participants took 36 minutes on average to finish all tasks, including the time spent configuring the robot for the first time, putting on the headset, learning how to use the robot, and using the main educational features. The robot's System Usability Scale (SUS) score is 71.13, which is considered good. Future stages of improving mindBot include adding more mobility capabilities and an educational assessment feature.
KEYWORDS: mindBot, Brain Computer Interface, Robotics, Robot Toy, EEG headset, Emotiv Insight, Raspberry Pi, Human Robot Interaction.
1. INTRODUCTION
Social interactive robots are becoming more common. The field of robotics studies the design, development, and use of robots that can perform tasks such as manufacturing, nursing, and even teaching. Human-robot interaction (HRI) studies effective communication between humans and robots, both local and remote (Graetz and Michaels n.d.; Goodrich and Schultz 2007; Tsardoulias and Mitkas 2017; Vilar 2010). Children with disabilities can benefit from brain-computer interface (BCI) technologies that let them control their environments using their brain signals (Chau and Fairley 2016; Heuvel et al. 2016; Kinney-Lang, Auyeung, and Escudero 2016). BCI research for children is growing and offers potential benefits, but further advancement is needed to ensure effective applications and implementations. While BCIs can help children with disabilities, there is a gap between inventors and users. Recent research shows that children can learn basic BCI skills, offering hope for improved lives (Kinney-Lang, Auyeung, and Escudero 2016; Te'eni, Carey, and Zhang 2007).
2. BACKGROUND
2.1 Robotics
Robotics involves creating machines that can perform tasks by interacting with their surroundings. Robots are guided by software written by programmers, which includes various modules for tasks such as moving or recognising images. Building advanced robots is hard because it requires a diverse team and a set of specialised hardware and software tools (Tsardoulias and Mitkas 2017). Robotic technologies include sensors, actuators, communication methods, and other components: sensors help robots detect their environment, and communication tools let them share data (LaValle 2006; Luo, Yih, and Su 2002; Siciliano and Khatib 2016; Vermesan et al. 2017). Voice recognition further aids human-robot interaction by providing a natural way of communicating with robots (Batth, Nayyar, and Nagpal 2018; Lv, Zhang, and Li 2008). At the core of robotics, machine learning algorithms allow robots to learn and adapt as they are used (S. Khalid 2021; Mosavi and Varkonyi 2017).
2.2 Brain-Computer Interface
BCI technology lets people control devices using their brainwaves and thoughts (Bahri, Abdulaal, and Buallay 2014; Studio n.d.). A BCI system takes brain signals as input and produces actions as output; the specific actions that the system can perform depend on the algorithms used to process the brain signals (S. J. Khalid and Ali 2022), as shown in Figure 1. There are two main types of BCI technologies: invasive and non-invasive. Invasive BCIs involve implanting electrodes directly into the brain, while non-invasive BCIs use external sensors to capture brain signals (Abdulwahab, Khleaf, and Jassim 2020; Arichi et al. 2012).
Figure 1: The basic design of a Brain-Computer Interface system (S. J. Khalid and Ali 2022)
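To make the pipeline in Figure 1 concrete, the sketch below walks one window of EEG data through the acquire-extract-classify-act cycle. It is purely illustrative: the random signal, the power feature, and the threshold are stand-ins for a real system's signal processing, not the paper's implementation.

```python
# Illustrative walk-through of the Figure 1 loop: acquire a window of EEG,
# extract a feature, classify it into a command, and act on it. The random
# signal, power feature, and threshold are stand-ins for a real pipeline.
import random

def acquire_window(n_samples=128):
    # Stand-in for reading one window of raw EEG samples from a headset.
    return [random.gauss(0.0, 1.0) for _ in range(n_samples)]

def extract_power(window):
    # Crude mean signal power; real BCIs use per-band spectral features.
    return sum(x * x for x in window) / len(window)

def classify(power, threshold=1.2):
    # Threshold rule as a stand-in for a trained classifier.
    return "forward" if power > threshold else "neutral"

def act(command):
    print(f"robot command: {command}")

for _ in range(5):  # five acquisition-to-action cycles
    act(classify(extract_power(acquire_window())))
```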
Electroencephalography (EEG) is a technique for measuring the brain's electrical activity. It is non-invasive and can track brain activity in real time (Yasin, Pasila, and Lim 2018). mindBot uses EEG to receive brain signals and translate them into commands.
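For the Emotiv headset specifically, brain-derived commands are typically read through Emotiv's Cortex service, a JSON-RPC API served over a local WebSocket. The condensed Python sketch below is an assumption about how such a client could look, not mindBot's published code; CLIENT_ID and CLIENT_SECRET are placeholders from an Emotiv developer account, and the one-time requestAccess approval, headset connection step, and error handling are omitted.

```python
# Condensed, assumption-level sketch of reading Emotiv mental commands via
# the Cortex API (JSON-RPC over a local WebSocket).
import json, ssl
from websocket import create_connection  # pip install websocket-client

CORTEX_URL = "wss://localhost:6868"
CLIENT_ID, CLIENT_SECRET = "your-client-id", "your-client-secret"  # placeholders

def call(ws, method, params, rid=1):
    # Send one JSON-RPC request and return the decoded response.
    ws.send(json.dumps({"jsonrpc": "2.0", "id": rid,
                        "method": method, "params": params}))
    return json.loads(ws.recv())

ws = create_connection(CORTEX_URL, sslopt={"cert_reqs": ssl.CERT_NONE})
token = call(ws, "authorize", {
    "clientId": CLIENT_ID, "clientSecret": CLIENT_SECRET,
})["result"]["cortexToken"]
headset = call(ws, "queryHeadsets", {})["result"][0]["id"]
session = call(ws, "createSession", {
    "cortexToken": token, "headset": headset, "status": "active",
})["result"]["id"]
# "com" carries trained mental commands; "fac" would carry facial expressions.
call(ws, "subscribe", {"cortexToken": token, "session": session,
                       "streams": ["com"]})
while True:
    msg = json.loads(ws.recv())
    if "com" in msg:                    # e.g. {"com": ["push", 0.73], ...}
        action, power = msg["com"][:2]
        print(f"mental command: {action} (power {power:.2f})")
```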
2.3 Human Robot Interaction
HRI aims to investigate and create social robots that fit well into society by studying how humans and robots can optimally interact with each other. This involves a mix of robotics, human factors, cognitive science, and human-computer interaction, with researchers exploring topics such as understanding emotions and designing for user behaviour and expectations (Goodrich and Schultz 2007).
2.4 Children and Disability
Children with disabilities face challenges in various ways. Conditions like Cerebral Palsy, Spina Bifida, Spinal Muscular Atrophy (SMA), and Duchenne Muscular Dystrophy affect their physical and cognitive development, and these children often need specialised care, therapies, and support (Campbell and Bonnett 1975; CDC 2022; D'Amico et al. 2011; Liptak and Samra 2010; Marchetti et al. 2022). Play therapy is crucial for their learning, social integration, and overall well-being: play helps them acquire new skills, communicate, and interact with others. For disabled children, robots can offer a unique way to engage in meaningful play and education (Dautenhahn 2007).
3. RELATED WORKS
The mindBot project is developing a new generation of robot toy for educational purposes. The project conducted a literature review of therapeutic and educational robot toys, as well as commercial educational robot toys, to understand the state of the art in these areas and to identify the key challenges and limitations that need to be addressed.
● PlayROB: The PlayROB device shown in Figure 2 is a therapeutic robot designed specifically for manipulating LEGO blocks: children can select a brick, guide it to the proper location, and place it. It can be controlled by joystick, keyboard, pointing, and sip-puff input devices, and it has a 3-DOF (degrees-of-freedom) Cartesian gripper for grasping. In a trial with three children without disabilities and three children with disabilities, the study concluded that the children enjoyed the game, but that mapping the required input movement to the desired robot movement was challenging for users who rely on scanning to choose inputs. As a result, this method may be hard to understand for children who have multiple disabilities (Kronreif, Kornfeld, et al. 2005).
Figure 2: PlayROB robot (Kronreif, Prazak-Aram, et al. 2005)
● LEGO Mindstorms Robots: This robot toy kit is special because it lets kids build and program robots, so it is educational and encourages creativity, Figure 3. However, handling complex programs can require extra effort because of limitations in its programming environment, and its communication with multiple robots within range in direct-control mode leaves room for improvement (Klassner and Anderson 2003).
Figure 3: Lego Mindstorms robot kit (Klassner and Anderson 2003)
● CosmoBot: CosmoBot is designed for both therapy and play, with a unique gesture interface that shows potential for engaging kids during treatment, Figure 4. Its target age range of 5 to 12 years may restrict its applicability to that age group, and its three modes of operation may limit the range of activities and interactions it can provide (Lathan, Brisben, and Safos 2005).
Figure 4: CosmoBot robot (Lathan, Brisben, and Safos 2005)
● Thymio: Thymio stands out for its varied sensors and actuators, which enable complex designs, and for its versatile programming environments, Figure 5. However, Thymio occasionally experiences connectivity problems, and its programming can be challenging to understand (Mondada et al. 2017).
Figure 5: Thymio robot (Specifications - Thymio & Aseba n.d.)
● Robota: Robota's humanoid design and ability to emulate emotions make it uniquely suited to engaging autistic children, Figure 6. The Robota project is a valuable educational tool for children between the ages of 6 and 12, but it may not be suitable for younger children. Additionally, while the robot offers various modes of control, its programming requires knowledge of ANSI C and Visual C++, which can be a barrier for some users (Billard 2003; Billard et al. 2007).
Figure 6: Robota doll (Billard 2003)
● mBot Coding Robot Kit: This commercial kit aims to teach coding and logical thinking to kids with a beginner-friendly approach, Figure 7. It needs separate batteries for the remote and the robot, its driving speed is slow, and its wiring and electrical components are exposed on the body, leaving them open to misuse and damage by young users (Robot Kits for Kids n.d.).
Figure 7: mBot robot kit (Makeblock n.d.)
● CogniToys Dino: CogniToys Dino is an educational toy that adapts to children's needs and grows with them, offering interactive experiences with stories, games, and educational content, Figure 8. It is operated through voice commands. However, it sometimes misunderstands what users say, and there are privacy concerns associated with its cloud connection (CogniToys n.d.).
Figure 8: Cognitoys Dino smart toy (CogniToys n.d.)
● Dash: Dash is a robot that can respond to sound, move around, dance, and avoid obstacles, Figure 9. It can be controlled through a mobile app or voice commands and is aimed at kids aged 6 and above. It has connectivity issues with some devices, its battery life is relatively short, and its accessories may require additional purchases that add up.
Figure 9: Dash robot (Kolodny, L n.d.)
● Fisher-Price Smart Teddy Bear: The Fisher-Price Smart Teddy Bear is a toy that can talk, learn, and play games with children aged 3 to 8, Figure 10. It has voice recognition and can recognise the smart cards that come with it. However, the bear is battery-operated and its batteries can be hard to replace, and it may be less engaging for older children (Ihamäki and Heljakka 2018).
Figure 10: Fisher-Price smart teddy bear (Amazon.com: Fisher-Price Smart Bear : Toys & Games n.d.)
● Milo: The Milo robot is a humanoid robot designed to connect with individuals who have autism, helping them focus and learn quickly, Figure 11. It is inspired by the methods of dedicated psychologists who have used real-world techniques and evidence-based therapies. Designed for kids and teenagers aged 5 to 17, it offers gentle speech, playful interactions, teaching moments, expressive faces and voices, games, and dancing. However, Milo is expensive at $6,500 (Gandomi 2018).
Figure 11: Milo robot (LLC n.d.)
● Leka: Leka is created for children with special needs, serving both as an educational toy and a supportive tool for treatment, Figure 12. It comes with various sensors, lights, and a screen, and it reacts to how children interact with it, observing and recording their actions. Leka can be controlled in different ways: over Wi-Fi or Bluetooth, through an app, with a remote control, or manually. This cloud-connected robot is recommended for children with disabilities aged 3 and up. However, Leka is priced at $699 (Leka n.d.).
Figure 12: Leka robot (Leka n.d.)
● QTrobot: QTrobot, created by LuxAI, is an educational and therapeutic robot with a humanoid shape. This 63 cm tall robot is designed for learning and social interaction: it has a display that can show facial expressions, and it can move its upper body in expressive ways. QTrobot is equipped with a 3D camera, a microphone array, and an Intel NUC CPU. It serves as a real-life social worker in various situations, such as helping children with autism in their rehabilitation and care. While QTrobot has great potential, like other cloud-connected devices it may carry privacy risks (S. J. Khalid and Ali 2022; Vulpe et al. 2021).
Figure 13: QTrobot (QTrobot, an engaging educational robot for children with autism and special needs education n.d.)
● Moxie: Moxie is a robot made for kids aged 5 to 15 who face mental, behavioural, and developmental disorders, Figure 14. Parents control Moxie through a dedicated app. The robot can recognise the user's face, voice, objects, and location, and it even infers how the user feels from their expressions and voice. Moxie is not suitable for physically disabled children (Hurst et al. 2020).
Figure 14: Moxie robot (Hurst et al. 2020)
● Sphero: (Golestan, Soleiman, and Moradi 2017) explored using the small Sphero robot, which can listen to voices and follow commands, to help kids with autism, Figure 15. Sphero can change colours, dance, and perform other engaging actions. The authors tested the robot with four children aged 4 to 7 who have autism; the children learned to command it with their voices, and the study found that they became more involved and talked more while playing with it. The researchers concluded that Sphero could be useful in therapy for children with autism: robots like it can create a safe space for learning and practising new skills and can boost communication, social skills, and problem-solving.
Figure 15: The Sphero robot toy (Golestan, Soleiman, and Moradi 2017)
● Bee-Bot: Bee-Bot is a fun and educational robot that can be used to teach kids coding. It is easy to control and can be programmed to move forward, backward, left, and right. Bee-Bot is affordable and suitable for children aged 5-6; however, it has limited motion, no sensors, and a small size (Di Lieto et al. 2020). The Bee-Bot robot is shown in Figure 16.
Figure 16: Bee-Bot robot (Di Lieto et al. 2020)
● Buddy: Buddy is a robot for therapy and education. It has a touchscreen, sensors, and speakers, and it can be controlled by voice, remote, or tablet. Buddy can move, speak, interact, and be programmed. It costs $1,500 and is intended for children aged 4 and up; both healthy and disabled users can use it. Buddy has some drawbacks: it is expensive and complex, and its speech recognition needs improvement (BUDDY PRO - Your Robot User Interaction Solutions for Your Brand Image n.d.). Figure 17 shows the Buddy robot toy.
Figure 17: The Buddy Robot (BUDDY PRO - Your Robot User Interaction Solutions for Your Brand Image n.d.)
● Furby: Furby is a fun toy that can speak, sing, and dance, Figure 18. It is controlled by buttons and sensors and has a microphone, speaker, tilt switch, and motors. Furby can learn the user's voice and respond to commands. It is educational and can be played with by both healthy and disabled children. However, Furby can be noisy when excited and has limited intelligence and interactivity (The Future Is Furby | M/C Journal n.d.).
Figure 18: Furby Robot (Furby Gets Bluetooth, Reacts to Watching Videos with Kids n.d.)
Table 1 summarises the state-of-the-art investigation of research-based and commercial therapeutic and educational robot toys, covering both brain-signal operation and other control methods. The main finding is that the robot toys examined in the literature were mostly not controlled by brain signals, except for the Thymio robot: one system used Steady-State Visually Evoked Potential (SSVEP) EEG signals to let a user select one Thymio out of three others by looking at it, then control its movements with an infrared remote control. In this context, mindBot can be considered a promising educational robot toy for children with physical disabilities: it can be controlled by brain signals, so children with limited mobility can still interact with it and learn from it, and it can also be controlled by voice commands to navigate the robot's different features and capabilities.
4. PROPOSED DESIGN AND IMPLEMENTATION OF MINDBOT
4.1 mindBot System Architecture
mindBot is a robot toy that can be controlled by the user's brain signals. It is designed for children with physical disabilities, to help them interact with their environment and learn in a fun and interactive way. The mindBot system has three main components: the brain-computer interface (BCI), the robot body, and the learning activities.
The BCI allows the user to control the robot's movements with their thoughts. The robot body is designed to be safe and easy to use for children with disabilities. The learning activities are designed to be fun and engaging and to help children develop cognitive abilities and academic skills. Figure 19 depicts the suggested conceptual architecture for the mindBot robot system.
Figure 19: The conceptual architecture of the mindBot system
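As an illustration of how the three components could be glued together on the robot's Raspberry Pi, the sketch below maps a decoded BCI or voice command onto a hypothetical two-motor differential drive; the GPIO pin numbers and motor layout are assumptions, not the published wiring.

```python
# Hypothetical glue between the BCI output and the robot body on a
# Raspberry Pi: a decoded command is mapped onto a two-motor differential
# drive. Pin numbers and wiring are assumptions, not the published design.
import RPi.GPIO as GPIO  # available on Raspberry Pi OS

LEFT_FWD, LEFT_BWD, RIGHT_FWD, RIGHT_BWD = 17, 18, 22, 23  # assumed BCM pins

def setup():
    GPIO.setmode(GPIO.BCM)
    for pin in (LEFT_FWD, LEFT_BWD, RIGHT_FWD, RIGHT_BWD):
        GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def drive(command):
    # Each command sets the four direction pins of the motor driver.
    states = {
        "forward":  (1, 0, 1, 0),
        "backward": (0, 1, 0, 1),
        "left":     (0, 1, 1, 0),
        "right":    (1, 0, 0, 1),
        "stop":     (0, 0, 0, 0),
    }[command]
    for pin, state in zip((LEFT_FWD, LEFT_BWD, RIGHT_FWD, RIGHT_BWD), states):
        GPIO.output(pin, state)
```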
Table 1. A Review of the Literature on Therapeutic and Educational Robotic Toys (S. J. Khalid and Ali 2022)
| Reference | Robot Name | Disability Type / Robot Category | Control Method | Technical Specifications | Capabilities | Price | Recommended Age | Subjects | Limitations |
|---|---|---|---|---|---|---|---|---|---|
| (Kronreif, Prazak-Aram, et al. 2005) | PlayROB | Severe physical disability / Therapeutic | Joystick, keyboard, pointing, sip-puff input devices | 3-DOF (degrees-of-freedom) Cartesian gripper for grasping | Handles LEGO bricks | Research-based robot | 5 to 11 years | 3 non-disabled children, 3 disabled children | Hard to understand for children with multiple disabilities |
| (Klassner and Anderson 2003) | MINDSTORMS EV3 | Normal children / Educational | Remote control | Programmable control unit (RCX); infrared (IR) transmitter/receiver; sensors, motors, LEGO | Programming; walking; communicating; playing games; chores | $799.99 | 10 to 15 years | Not mentioned | Does not support integrated infrared point-to-point wireless protocols |
| (Lathan, Brisben, and Safos 2005) | CosmoBot | Developmental disability / Therapeutic | Gestures; speech | Mission Control; pressure-sensitive buttons (aFFx Activators); microphone; gestural sensors | Recording sound; lifting its arms; voice commands; movement; rotation | Research-based robot | 5 to 12 years | Children with/without impairments | Only three modes of operation |
| (Mondada et al. 2017) | Thymio | Normal children / Educational | Remote controller | Sensors and actuators; microphone; remote-control receiver; SD-card slot; five capacitive buttons | Obstacle avoidance; line following; responding to freefall/shocks | $162 | All ages | Both genders, all ages | Program difficult to understand; connection problems |
| (Billard 2003; Billard et al. 2007) | Robota | Autistic disability / Therapeutic | Speech; vision; body imitation | 3 superimposed boards; sensors; speech synthesis (ELAN); QuickCam camera | Moving eyes; speech synthesis; motion tracking | Research-based robot | 6 to 12 years | Tested with 7 children with autism | Does not support physically disabled children; limited age range; requires programming knowledge |
| (Robot Kits for Kids n.d.) | mBot | Normal children / Educational | Mobile application; remote control; button | Light sensor, line follower; buzzer and Bluetooth; compatible with Makeblock and LEGO blocks; 500 g weight | Coding; line following; extension interfaces; playing music | $69.99 | 8 and up | Normal children and adults | Batteries for the remote control and the robot not included |
| (CogniToys n.d.) | CogniToys Dino | Normal children / Educational | Speech commands | Wi-Fi-enabled; IBM Watson; Elemental Path's Friendgine | Speech-enabled; cloud-based | $60 | 5 or more | Normal children | Sometimes misunderstands users; privacy concerns |
| (Buy Wonder Workshop - Dash for CAD 219.99 \| Toys R Us Canada n.d.) | Dash Robot | Normal children / Educational | Mobile application; speech | 12 white LED segments; LEGO brick connector; wheels and infrared eye; Bluetooth and microphone; sensors | Sound reaction; obstacle avoidance; dancing; moving around objects; cloud-based | $150 | 6 or more | Normal children | Connectivity difficulties with gadgets; battery life; many add-on accessory purchases |
| (Ihamäki and Heljakka 2018) | Smart Teddy Bear | Normal children / Educational | Speech | Tiny camera; speaker | Voice and image recognition; speaking | $121 | 3 to 8 | Normal children | Adult help needed; interruptions with voice commands |
| (Gandomi 2018) | Milo | Autistic disability / Therapeutic | Mobile application; speech; facial expression | Internal HD camera; internal computer; microphones; touch sensors; motion sensors | Voice expression; facial expression; speaking slowly; playing games | $6,500 (for schools) | 5 to 17 | 10 autistic students | Expensive |
| (Leka n.d.) | Leka | Autistic + developmental disability / Therapeutic and Educational | Wi-Fi, Bluetooth; application; remote, manual control | Sensors; lights; LCD screen; Arduino | Predictable; adaptive; games, music; daily tasks; moving around; changing colours; cloud-based | $699 | 3 and up | Children with disabilities | Does not support physically disabled children |
| (QTrobot, an engaging educational robot for children with autism and special needs education n.d.) | QTrobot | Autistic disability / Therapeutic and Educational | Wi-Fi application | Intel Core i5/i7 CPU and RealSense camera; 4 high-performance microphones; stereo 2.8 W CD speaker; up to 32 GB RAM, up to 512 GB SSD; WLAN, USB-C, USB 3.0, Ethernet and HDMI ports through a USB-C adaptor | Pose tracking; image, emotion, face, and speech detection and recognition; sound detection and localization; multilingual | $1,977 | 4 to 14 | 15 children diagnosed with ASD | Security risks |
| (Hurst et al. 2020) | Moxie | Mental, behavioural, and developmental disorders / Therapeutic and Educational | Parent application | Microphones; camera; sensors | Face, object, location, emotion, and voice recognition | $1,500 | 5 to 10 | Children with ASD | Does not support physically disabled children; expensive |
| (Golestan, Soleiman, and Moradi 2017) | Sphero | Autistic disability / Therapeutic | Voice commands; smartphones or tablets | Bluetooth; microphones; speaker | Voice recognition; changing colour; jumping; dancing; rotating | Research-based robot | 4 to 7 | 4 children with autism | Operator must be present in sessions |
| (Di Lieto et al. 2020) | Bee-Bot | Educational | Programming buttons on its back | Microcontroller; actuators; power source; communication interface | Motion; teaching coding | $50 | 5 to 6 | Normal children | Limited motion; no sensors; small size |
| (BUDDY PRO - Your Robot User Interaction Solutions for Your Brand Image n.d.) | Buddy | Therapeutic / Educational | API; remote control via tablets; voice commands; touchscreen | 8-inch touchscreen; sensors; ultrasonic distance sensor; speaker; camera; hub; charging pads; on/off button; lithium battery; power connector; heart LED; microphones | Movement; speech; interaction; programmability | $1,500 | 4 and up | Normal and disabled people | High cost; complexity, especially for young children; speech recognition needs improvement and may cause misunderstandings |
| (The Future Is Furby \| M/C Journal n.d.) | Furby | Educational | Buttons and sensors | Buttons; sensors; battery; microphone; speaker; tilt switch; motors | Voice recognition; speaking; singing; dancing | $60 | 6 and up | Normal and disabled children | Limited intelligence and interactivity; noisy when excited |
| 2023 (this study) | mindBot | Physically disabled / Educational | Mind control; voice commands | Camera; microphone; EEG headset; LCD screen | Mind control; face and voice recognition | Research-based robot | 6 to 10 | 2 children with Spina Bifida and 9 healthy children | Standard headset size does not fit all children, due to individual head-size variations |
4.2 mindBot Features
The mindBot robot toy has special abilities controlled by brain signals, such as moving in different directions based on facial expressions detected by the Emotiv Insight EEG headset; the robot learns to associate expressions with movements through training. It also responds to voice commands, which is helpful for kids with disabilities who may have limited mobility. The robot can recognise speech, transform it into text, and even speak back to the user. It can detect faces and emotions, and it performs educational activities such as:
- Scientific facts.
- Learning ABCs, colours, and numbers.
- Video-based and read-aloud stories.
- Games of (1) general knowledge questions, (2) riddles, and (3) secret words.
Overall, mindBot offers a range of features that make it an engaging and educational toy for children, especially those with disabilities. Figure 20 illustrates the various educational features available for children.
Figure 20: The educational features of mindBot
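A hypothetical sketch of how voice commands could be routed to the educational features of Figure 20 follows, using the common SpeechRecognition and pyttsx3 packages as stand-ins for the robot's actual speech stack; the keywords and canned replies are illustrative only.

```python
# Hypothetical routing of recognized speech to the educational features in
# Figure 20; keywords and replies are illustrative, not mindBot's content.
import speech_recognition as sr  # pip install SpeechRecognition (needs PyAudio)
import pyttsx3                   # pip install pyttsx3

tts = pyttsx3.init()

def say(text):
    tts.say(text)
    tts.runAndWait()

FEATURES = {
    "story":   lambda: say("Once upon a time..."),
    "numbers": lambda: say("Let's count together: one, two, three."),
    "riddle":  lambda: say("What has keys but cannot open locks?"),
}

def listen_once():
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio).lower()  # needs internet
    except sr.UnknownValueError:
        return ""

heard = listen_once()
for keyword, feature in FEATURES.items():
    if keyword in heard:
        feature()
        break
else:
    say("Please repeat the command.")
```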
5. MINDBOT IMPLEMENTATION
The visual appearance of the robot is crucial for young users of robotics. Therefore, a survey was conducted to collect information from children about their preferences for the robot's face and body, and the results were used to choose the robot's final design.
5.1 Proposed Robot Face and Body
(Kalegina et al. 2018) report that people are more likely to perceive robots with facial features as friendly, intelligent, and trustworthy. This is important for robot designers, as the design of a robot's face can significantly affect how people interact with it. In their study, they found that people prefer robot faces with a mouth, circular eyes, and simple features. Robots in the "human-inspired" category are designed to look like humans, taking inspiration from human body parts such as skin, shape, and facial features; they can be creepy if they are too realistic, so it is important to design them with this in mind (Baraka, Alves-Oliveira, and Ribeiro 2020).
In this study, two questionnaires about robot faces and bodies were administered to primary school children. In the first, children saw two different robot faces, and in the second, four different robot bodies. Ninety-one children aged 6 to 12 were asked to pick their favourite face and body. Table 2 shows the distribution of student age groups and face preferences for the two faces shown in Figure 21.
Figure 21: The rounded and square faces
Table 2: Distribution of student age groups and robot face preferences
| Robot Face | Age 6 | Age 7 | Age 8 | Age 9 | Age 10 | Age 11 | Age 12 | Total (of 91) |
|---|---|---|---|---|---|---|---|---|
| Face 1 | 3 | 14 | 9 | 19 | 6 | 2 | 4 | 57 |
| Face 2 | 3 | 13 | 7 | 9 | 1 | 1 | 0 | 34 |
Figure 22: The proposed robot bodies
Table 3: Distribution of student age groups and robot body preferences
| Robot Body | Age 6 | Age 7 | Age 8 | Age 9 | Age 10 | Age 11 | Age 12 | Total (of 91) |
|---|---|---|---|---|---|---|---|---|
| Body 1 | 0 | 2 | 1 | 2 | 0 | 0 | 0 | 5 |
| Body 2 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
| Body 3 | 1 | 3 | 0 | 2 | 3 | 0 | 0 | 9 |
| Body 4 | 4 | 24 | 14 | 21 | 4 | 5 | 4 | 76 |
5.2 3D Design and Modelling of mindBot
Because the available 3D printer was small, the robot had to be made in five pieces - head, arms, body, feet, and plate - which were printed separately and then assembled with glue. Figure 23 presents the 3D model and the printed model of mindBot.
Figure 23: 3D and printed model of mindBot
6. TESTING AND EVALUATION OF MINDBOT
This section presents detailed user experiments with the mindBot robot, in which participants performed tasks such as controlling the robot's movement, playing games, and using its voice recognition and other educational activities.
The testing process started with obtaining the approval of parents and caregivers for both the disabled and the healthy children, after which the mindBot user manual and headset were given to the participants. About 36 to 52 minutes were spent using the robot to test its functions. Eleven primary school students of both sexes, aged between 6 and 11 years, participated in the study; 2 were physically disabled and 9 were healthy, and four had previous robot experience (Table 4). It was possible to find and enrol only two disabled children in the city: recruiting disabled children through their families proved socially difficult, so healthy children, who are also potential users, were recruited as well to test the usability of mindBot.
Table 4 presents the demographics of the participants. The experiments were carried out at the children's homes, ensuring that the children were comfortable and that the consent form was signed by the parents. Experiments were conducted in a single session to minimise emotional stress. Participants were given the freedom to use the robot, and the order of use was chosen randomly to avoid bias. Figure 24 shows a child using the mindBot robot.
Table 4: Demographics of study participants
| Participant | Age | Sex | Level of Education | Health Condition / Type of Disability | Previous Robot Experience |
|---|---|---|---|---|---|
| P1 | 7 | Female | G2 | Physically Disabled (Spina Bifida) | No |
| P2 | 9 | Female | G4 | Physically Disabled (Spina Bifida) | No |
| P3 | 6 | Male | G1 | Healthy | Yes |
| P4 | 6 | Female | G1 | Healthy | No |
| P5 | 7 | Female | G2 | Healthy | Yes |
| P6 | 8 | Female | G3 | Healthy | No |
| P7 | 8 | Male | G3 | Healthy | Yes |
| P8 | 8 | Male | G3 | Healthy | No |
| P9 | 10 | Male | G4 | Healthy | Yes |
| P10 | 10 | Female | G5 | Healthy | No |
| P11 | 11 | Female | G6 | Healthy | No |
Figure 24: A child using mindBot
Evaluating the performance of brain-controlled mobile robots is notably difficult. To facilitate comparisons between different robots, it is therefore important to have a standardised performance score that takes into account the various factors that can affect performance, such as the types of objects, tasks, and environments (Bi, Fan, and Liu 2013).
Mission Completion Time, also known as Task Completion Time, is the total time spent using the robot across its different stages (Zhao et al. 2014). This includes the time to configure the robot for the first time, set up the headset, train the user, and test the robot's different features. The EEG headset setup includes creating an Emotiv ID, creating a Cortex app, fitting the headset, adjusting it for the best reading of the brain signals, and setting up and training the Emotiv BCI profile. The user is then trained on how to use the robot and its features, and learnability is measured by how easily the user learns to do so. Three tasks are performed to test the mindBot robot:
T1: Drive mindBot assesses the robot's mobility features.
T2: Guess Number assesses the robot's cognitive features.
T3: Science and Good Habits Video Watching with mindBot assesses the robot's ability to engage with children.
The concentration-time and learnability metrics are ergonomic metrics that measure the user's condition rather than system efficiency. Workload is an often-used ergonomic measure that quantifies the mental activity required of users while interacting with brain-controlled robot systems.
Table 5: Actual time spent per sub-task completion in minutes

| User | 1st Config. Time | Headset Setup Time | Training Time | Test Usage Time | Total Task Completion Time |
|---|---|---|---|---|---|
| P1 | 4 | 17 | 6 | 12 | 39 |
| P2 | 3 | 15 | 5 | 13 | 36 |
| P3 | 3 | 14 | 7 | 15 | 39 |
| P4 | 4 | 15 | 7 | 15 | 41 |
| P5 | 3 | 3 | 6 | 13 | 25 |
| P6 | 4 | 10 | 8 | 14 | 36 |
| P7 | 3 | 14 | 6 | 12 | 35 |
| P8 | 4 | 14 | 8 | 13 | 39 |
| P9 | 3 | 15 | 7 | 12 | 37 |
| P10 | 3 | 14 | 6 | 14 | 37 |
| P11 | 3 | 15 | 5 | 13 | 36 |
| Average | 3 | 13 | 6 | 13 | 36 |
Table 5 presents the actual time spent per sub-task in minutes. The average first-configuration time was 3 minutes; this is the time it took participants to connect the mindBot and enter the required information. The average first-time headset setup took 13 minutes; this is the time it took participants to put on the headset and adjust it comfortably. The average training time was 6 minutes; this is the time it took participants to learn the robot's controls and the tasks required for the test. The average test usage time was 13 minutes; this is the time it took participants to complete the test tasks.
The total task completion time, the sum of the first configuration, headset setup, training, and test usage times, averaged 36 minutes and ranged from 25 to 41 minutes. As for user training, mindBot took an average of roughly 6.5 minutes to be trained by each participant, which confirms that the user interface and system operation flow designed and implemented in mindBot are easy to use and learn and emotionally engaging. The headset setup time was also relatively short, averaging 13 minutes to wear and adjust.
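The reported averages follow directly from the per-participant values in Table 5, as the short computation below shows (the table's average row reports the rounded values 3, 13, 6, 13, and 36 minutes).

```python
# Recomputing the Table 5 averages from the per-participant sub-task times
# (values in minutes, copied from the table).
times = {
    "P1": (4, 17, 6, 12), "P2": (3, 15, 5, 13), "P3": (3, 14, 7, 15),
    "P4": (4, 15, 7, 15), "P5": (3, 3, 6, 13),  "P6": (4, 10, 8, 14),
    "P7": (3, 14, 6, 12), "P8": (4, 14, 8, 13), "P9": (3, 15, 7, 12),
    "P10": (3, 14, 6, 14), "P11": (3, 15, 5, 13),
}
labels = ("first config", "headset setup", "training", "test usage")
n = len(times)
for i, label in enumerate(labels):
    print(label, round(sum(t[i] for t in times.values()) / n, 2))
# -> 3.36, 13.27, 6.45, 13.27
print("total", round(sum(sum(t) for t in times.values()) / n, 2))  # -> 36.36
```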
Time can show how long children use a robot and how engaged they are. The SUS questionnaire helps determine whether a robot is easy to use and well suited to humans. It has 10 statements that users rate on a scale from "strongly agree" to "strongly disagree", measuring aspects such as how easy the robot is to learn and use. To calculate the SUS raw score, each item's response is first converted to a 0-4 contribution: for odd-numbered items, the contribution is the scale position minus 1, and for even-numbered items, it is 5 minus the scale position. The contributions of the 10 items are summed, and the raw score is multiplied by 2.5 to get the SUS usability score (Brooke n.d.). A SUS score of 68 is considered above average, a score of 71.4 or higher is considered good, and a score of 50 or below is considered poor (Bangor 2009; Brooke n.d.; Chacón, Ponsa, and Angulo 2021). Figure 25 shows the conversion of SUS scores.
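For reference, the standard Brooke scoring rule can be written in a few lines of Python; the example responses below are illustrative, not a participant's actual answers.

```python
# Standard SUS scoring (Brooke): odd-numbered items contribute (response - 1),
# even-numbered items contribute (5 - response); the 0-40 raw sum is then
# multiplied by 2.5 to give a 0-100 usability score.
def sus_score(responses):
    """responses: the ten 1-5 Likert answers, item 1 first."""
    assert len(responses) == 10
    raw = sum(r - 1 if i % 2 == 0 else 5 - r  # index 0, 2, ... = odd items
              for i, r in enumerate(responses))
    return raw * 2.5

# Illustrative answers only.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```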
Figure 25: The rankings of SUS scores (Chacón, Ponsa, and Angulo 2021)
Based on the robotics and robot-toy literature, it is sufficient to evaluate the usability of the proposed robot using the SUS metric, since in most cases the purpose and features of a proposed robot differ from those of the robots it would be compared with (Barradas et al. 2019; Chacón, Ponsa, and Angulo 2021; ElGibreen et al. 2022; Jafari et al. 2018; Usability Evaluation and User Acceptance of Cobot 2022).
The participants were asked to complete a SUS satisfaction form after they had completed the experience. Table 6 presents the results of the System Usability Scale (SUS) questionnaire administered to the 11 children who interacted with mindBot.
Table 6: Reactions to specific statements within the System Usability Scale (SUS)
| User | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 | Q9 | Q10 | SUS Raw Score | SUS Final Score |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| P1 | 4 | 3 | 3 | 4 | 4 | 4 | 4 | 1 | 4 | 3 | 26 | 65 |
| P2 | 4 | 2 | 3 | 5 | 3 | 3 | 4 | 2 | 4 | 4 | 24 | 60 |
| P3 | 3 | 3 | 5 | 3 | 3 | 2 | 4 | 2 | 5 | 4 | 28 | 70 |
| P4 | 4 | 2 | 4 | 5 | 4 | 3 | 4 | 2 | 4 | 3 | 27 | 67.5 |
| P5 | 4 | 3 | 4 | 3 | 4 | 2 | 4 | 2 | 5 | 2 | 31 | 77.5 |
| P6 | 4 | 3 | 4 | 5 | 4 | 2 | 4 | 1 | 4 | 2 | 29 | 72.5 |
| P7 | 5 | 2 | 3 | 3 | 5 | 4 | 4 | 2 | 4 | 3 | 29 | 72.5 |
| P8 | 4 | 3 | 3 | 4 | 3 | 4 | 4 | 1 | 4 | 3 | 25 | 62.5 |
| P9 | 5 | 2 | 4 | 4 | 5 | 3 | 4 | 1 | 5 | 4 | 31 | 77.5 |
| P10 | 4 | 1 | 5 | 4 | 4 | 3 | 5 | 2 | 5 | 3 | 32 | 80 |
| P11 | 3 | 2 | 4 | 3 | 5 | 2 | 5 | 3 | 4 | 2 | 31 | 77.5 |
| Average | 4 | 2 | 3 | 3 | 4 | 2 | 4 | 1 | 4 | 3 | 28.45 | 71.13 |
The participants' average SUS score for mindBot was 71.13, which is considered good. This indicates that the participants found mindBot to be generally usable and emotionally engaging.
Figure 26: Results of the mindBot robot usability in terms of SUS
A score of 42 to 80 is considered within the normal range for usability evaluations of robots. mindBot's score of 71.13 therefore indicates a usable robot that could still be improved (Chan et al. 2022; Chrif et al. 2022; Ranzani et al. 2021).
Several factors can affect the robot's SUS score, including the EEG headset. The headset's electrodes were designed for adults and were too large for the children who participated in the usability evaluation, making it difficult to place them in the correct locations and degrading the quality of the EEG signal. The poor contact quality between the electrodes and the skin resulted in a noisy EEG signal, which made it difficult to identify the waveforms of interest and slowed down the training process. The long training and task completion times were largely due to the time needed to troubleshoot the poor contact quality and obtain a clean EEG signal.
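One practical mitigation is to gate training on the headset's reported contact quality. The sketch below is an assumption built on the Cortex "dev" stream, whose samples are documented to carry per-sensor contact-quality levels (0 = no contact up to 4 = good); it reuses the `ws`/`call` helpers from the earlier Cortex sketch.

```python
# Hypothetical contact-quality gate before BCI training, assuming the Cortex
# "dev" stream format (per-sensor contact-quality levels at index 2).
import json

GOOD = 3  # minimum acceptable per-sensor contact-quality level

def wait_for_contact(ws, token, session):
    call(ws, "subscribe", {"cortexToken": token, "session": session,
                           "streams": ["dev"]})
    while True:
        msg = json.loads(ws.recv())
        if "dev" not in msg:
            continue
        quality = msg["dev"][2]  # assumed: list of per-sensor levels
        if min(quality) >= GOOD:
            print("Contact quality OK - start BCI training.")
            return
        print("Re-seat the headset; sensor levels:", quality)
```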
Additionally, the children who participated in the EEG testing initially perceived the procedure as somewhat mundane or lacking in engagement, most likely because they were not familiar with the EEG headset or the testing process until they received the needed training sessions.
Figure 27: Total task completion time per participant
Figure 27 shows the distribution of total task completion times per participant. Averaged over the participants, completing all of the tasks in the study took about 36 minutes; the longest total took 41 minutes and the shortest took 25 minutes (Table 5).
The tasks that involved controlling the robot with the mind and voice, watching educational videos, and playing games took a little extra time to complete because the participants were more engaged with the robot and wanted to explore all of its features. This was evident in how they spent more time interacting with the robot, asking questions about how it worked, and trying different things; they also seemed to enjoy the interaction, which may have contributed to the longer completion times. The cognitive abilities of the disabled participants were comparable to those of the healthy children, as their disabilities only affected walking or movement. The absence of cognitive impairment among the disabled participants therefore did not significantly affect the tests performed or the results obtained, and the sample of two disabled children was sufficient once it was enlarged with healthy children.
After two or three attempts, most users were able to use the robot more effectively. However, the robot's comprehension and responses were affected by noise, internet connection speed, and the user's level of spoken and understood English. Users with a technology background and sufficient English skills found the robot easier to use. Because the robot was designed to understand and respond to natural language commands in English, users with a strong command of English could give more detailed and complex commands, which the robot understood and executed more effectively; users with a technology background were also more familiar with the kinds of commands the robot could understand, which made it easier for them to use the robot effectively.
7. CONCLUSIONS
This study presented the work of the mindBot project: the development of a mind-controlled educational robot toy for disabled children. The robot is designed to be controlled by the user's brainwaves, captured using an EEG headset, as well as by voice commands. This allows children with physical disabilities, as well as healthy children, to interact with the robot and control its movement, play games, and access educational content without having to use their limbs. The concluded findings of this study are as follows:
● mindBot can help children with disabilities maintain a positive outlook on life and stay motivated through age-appropriate, fun educational learning content.
● The robot's functionality was impacted by internet service speed during testing sessions.
● Task completion times varied, with some tasks taking longer due to the participants' interest in exploring the robot's features more deeply.
● The disabled participants had cognitive abilities similar to those of the healthy children, indicating that the robot is suitable for children of different abilities, including healthy children.
● Users' effectiveness in operating the robot improved with practice, but its comprehension and response were affected by external factors such as noise, internet speed, and the user's English-speaking and understanding skills.
Alongside these challenges, the current mindBot version points to several future improvements, such as:
● Introducing additional degrees of purpose-based movement freedom for the head, legs, and hands.
● Designing and developing, or adopting a commercial, child-friendly EEG headset.
● Adding more educational content, such as educational assessment, so parents stay aware of the child's cognitive and educational progress.
● Integrating ChatGPT voice chatting, which would enable mindBot to engage in more natural, age-appropriate interactive conversations.
REFERENCES
Abdulwahab, Samaa, Hussain Khleaf, and Manal Jassim. 2020. “A Systematic Review of Brain-Computer Interface Based EEG.” Iraqi Journal for Electrical and Electronic Engineering 16(2): 1–10.
Arichi, Tomoki et al. 2012. “Development of BOLD Signal Hemodynamic Responses in the Human Brain.” NeuroImage 63(2): 663–73.
Bahri, Zouhir, Sara Abdulaal, and Mariam Buallay. 2014. “Sub-Band-Power-Based Efficient Brain Computer Interface for Wheelchair Control.” In 2014 World Symposium on Computer Applications Research (WSCAR), 1–7.
Baraka, Kim, Patrícia Alves-Oliveira, and Tiago Ribeiro. 2020. “An Extended Framework for Characterizing Social Robots.” In Human-Robot Interaction: Evaluation Methods and Their Standardization, Springer Series on Bio- and Neurosystems, eds. Céline Jost et al. Cham: Springer International Publishing, 21–64. https://doi.org/10.1007/978-3-030-42307-0_2 (September 8, 2022).
Barradas, Rolando, José Lencastre, Salviano Soares, and António Valente. 2019. “Usability Evaluation of an Educational Robot for STEM Areas:” In Proceedings of the 11th International Conference on Computer Supported Education, Heraklion, Crete, Greece: SCITEPRESS - Science and Technology Publications, 218–25. http://www.scitepress.org/DigitalLibrary/Link.aspx?doi=10.5220/0007675102180225 (June 23, 2023).
Batth, Ranbir Singh, Anand Nayyar, and Amandeep Nagpal. 2018. “Internet of Robotic Things: Driving Intelligent Robotics of Future - Concept, Architecture, Applications and Technologies.” In 2018 4th International Conference on Computing Sciences (ICCS), 151–60.
Bi, Luzheng, Xin’an Fan, and Yili Liu. 2013. “EEG-Based Brain-Controlled Mobile Robots: A Survey.” IEEE Transactions on Human-Machine Systems 43: 161–76.
Billard, Aude. 2003. “Robota: Clever Toy and Educational Tool.” Robotics and Autonomous Systems 42(3): 259–69.
Billard, Aude, Ben Robins, Jacqueline Nadel, and Kerstin Dautenhahn. 2007. “Building Robota, a Mini-Humanoid Robot for the Rehabilitation of Children With Autism.” Assistive technology : the official journal of RESNA 19: 37–49.
Brooke, John. “SUS - A Quick and Dirty Usability Scale.”
“BUDDY PRO - Your Robot User Interaction Solutions for Your Brand Image.” BUDDY The Emotional Robot. https://buddytherobot.com/en/buddy-pro/ (October 3, 2023).
Campbell, J, and C Bonnett. 1975. “Spinal Cord Injury in Children.” Clinical orthopaedics and related research (112): 114–23.
CDC. 2022. “Duchenne Muscular Dystrophy Care Considerations | CDC.” Centers for Disease Control and Prevention. https://www.cdc.gov/ncbddd/musculardystrophy/care-considerations.html (December 14, 2022).
Chacón, Alejandro, Pere Ponsa, and Cecilio Angulo. 2021. “Usability Study through a Human-Robot Collaborative Workspace Experience.” Designs 5(2): 35.
Chan, Wesley P. et al. 2022. “Design and Evaluation of an Augmented Reality Head-Mounted Display Interface for Human Robot Teams Collaborating in Physically Shared Manufacturing Tasks.” ACM Transactions on Human-Robot Interaction 11(3): 1–19.
Chau, Tom, and Jillian Fairley. 2016. Paediatric Rehabilitation Engineering: From Disability to Possibility. CRC Press.
Chrif, Farouk et al. 2022. “Usability Evaluation of an Interactive Leg Press Training Robot for Children with Neuromuscular Impairments.” Technology and Health Care 30(5): 1183–97.
“CogniToys: Internet-Connected Smart Toys That Learn and Grow.” Kickstarter. https://www.kickstarter.com/projects/cognitoys/cognitoys-internet-connected-smart-toys-that-learn (October 6, 2021).
D’Amico, Adele, Eugenio Mercuri, Francesco D. Tiziano, and Enrico Bertini. 2011. “Spinal Muscular Atrophy.” Orphanet Journal of Rare Diseases 6(1): 71.
Dautenhahn, Kerstin. 2007. “Socially Intelligent Robots: Dimensions of Human–Robot Interaction.” Philosophical Transactions of the Royal Society B: Biological Sciences 362(1480): 679–704.
Di Lieto, Maria Chiara et al. 2020. “Empowering Executive Functions in 5- and 6-Year-Old Typically Developing Children Through Educational Robotics: An RCT Study.” Frontiers in Psychology 10. https://www.frontiersin.org/articles/10.3389/fpsyg.2019.03084 (October 3, 2023).
ElGibreen, Hebah et al. 2022. “Telepresence Robot System for People with Speech or Mobility Disabilities.” Sensors 22(22): 8746.
“Furby Gets Bluetooth, Reacts to Watching Videos with Kids.” PCMAG. https://www.pcmag.com/news/furby-gets-bluetooth-reacts-to-watching-videos-with-kids (October 7, 2023).
Golestan, Shadan, Pegah Soleiman, and Hadi Moradi. 2017. “Feasibility of Using Sphero in Rehabilitation of Children with Autism in Social and Communication Skills.” In 2017 International Conference on Rehabilitation Robotics (ICORR), London: IEEE, 989–94. https://ieeexplore.ieee.org/document/8009378/ (October 2, 2023).
Goodrich, Michael A., and Alan C. Schultz. 2007. “Human-Robot Interaction: A Survey.” Foundations and Trends® in Human-Computer Interaction 1(3): 203–75.
Heuvel, Renée J. F. van den et al. 2016. “Robots and ICT to Support Play in Children with Severe Physical Disabilities: A Systematic Review.” Disability and Rehabilitation: Assistive Technology 11(2): 103–16.
Hurst, Nikki et al. 2020. “Social and Emotional Skills Training with Embodied Moxie.” arXiv:2004.12962 [cs]. http://arxiv.org/abs/2004.12962 (February 16, 2022).
Jafari, Nooshin et al. 2018. “Usability Testing of a Developed Assistive Robotic System with Virtual Assistance for Individuals with Cerebral Palsy: A Case Study.” Disability and Rehabilitation: Assistive Technology 13(6): 517–22.
Kalegina, Alisa et al. 2018. “Characterizing the Design Space of Rendered Robot Faces.” In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, HRI ’18, New York, NY, USA: Association for Computing Machinery, 96–104. https://doi.org/10.1145/3171221.3171286 (September 7, 2022).
Khalid, Sibar. 2021. “Internet of Robotic Things: A Review.” Journal of Applied Science and Technology Trends 2(03): 78–90.
Khalid, Sibar Jameel, and Ismael Ali Ali. 2022. “Mind Controlled Educational Robotic Toys for Physically Disabled Children: A Survey.” In 2022 International Conference on Computer Science and Software Engineering (CSASE), 348–54.
Kinney-Lang, E., B. Auyeung, and J. Escudero. 2016. “Expanding the (Kaleido)Scope: Exploring Current Literature Trends for Translating Electroencephalography (EEG) Based Brain-Computer Interfaces for Motor Rehabilitation in Children.” Journal of Neural Engineering 13(6): 061002.
Klassner, Frank, and Scott D. Anderson. 2003. “MindStorms: Not Just for K-12 Anymore.” IEEE Robotics & Automation Magazine: 12.
Kronreif, Gernot, M. Kornfeld, et al. 2005. “Playing Assistant for Physically Handicapped Children.” In IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM).
Kronreif, Gernot, Barbara Prazak-Aram, et al. 2005. “PlayROB - Robot-Assisted Playing for Children with Severe Physical Disabilities.” In IEEE 9th International Conference on Rehabilitation Robotics.
Lathan, Corinna, Amy Brisben, and Charlotte Safos. 2005. “CosmoBot Levels the Playing Field for Disabled Children.” Interactions 12(2): 14–16.
LaValle, Steven M. 2006. Planning Algorithms. Cambridge: Cambridge University Press. https://www.cambridge.org/core/product/identifier/9780511546877/type/book (July 27, 2021).
“Leka.” https://leka.io/ (October 12, 2021).
Liptak, Gregory S., and Ahmad El Samra. 2010. “Optimizing Health Care for Children with Spina Bifida.” Developmental Disabilities Research Reviews 16(1): 66–75.
Luo, R.C., Chih-Chen Yih, and Kuo Lan Su. 2002. “Multisensor Fusion and Integration: Approaches, Applications, and Future Research Directions.” IEEE Sensors Journal 2(2): 107–19.
Lv, Xiaoling, Minglu Zhang, and Hui Li. 2008. “Robot Control Based on Voice Command.” In 2008 IEEE International Conference on Automation and Logistics, 2490–94.
Marchetti, Antonella, Cinzia Di Dio, Federico Manzi, and Davide Massaro. 2022. “Robotics in Clinical and Developmental Psychology.” Reference module in neuroscience and biobehavioral psychology.
Mondada, Francesco et al. 2017. “Bringing Robotics to Formal Education: The Thymio Open-Source Hardware Robot.” IEEE Robotics Automation Magazine 24(1): 77–85.
Mosavi, Amir, and Annamaria Varkonyi. 2017. “Learning in Robotics.” International Journal of Computer Applications 157(1): 8–11.
“QTrobot, an Engaging Educational Robot for Children with Autism and Special Needs Education.” LuxAI S.A. https://luxai.com/robot-for-teaching-children-with-autism-at-home/ (February 16, 2022).
Ranzani, Raffaele et al. 2021. “Towards a Platform for Robot-Assisted Minimally-Supervised Therapy of Hand Function: Design and Pilot Usability Evaluation.” Frontiers in Bioengineering and Biotechnology 9. https://www.frontiersin.org/articles/10.3389/fbioe.2021.652380 (July 26, 2023).
“Robot Kits for Kids : mBot | Makeblock - Global STEAM Education Solution Provider.” Makeblock. https://www.makeblock.com/steam-kits/mbot (January 11, 2022).
Siciliano, Bruno, and Oussama Khatib. 2016. “Robotics and the Handbook.” In Springer Handbook of Robotics, Springer Handbooks, eds. Bruno Siciliano and Oussama Khatib. Cham: Springer International Publishing, 1–6. https://doi.org/10.1007/978-3-319-32552-1_1 (July 27, 2021).
Studio, Play. “Applications.” Neuralink. https://neuralink.com/applications/ (December 15, 2022).
“Usability Evaluation and User Acceptance of Cobot: Case Study of Universal Robots CB Series.” 2022. In Proceedings of the International Conference on Industrial Engineering and Operations Management, Istanbul, Turkey: IEOM Society International, 1999–2006. https://index.ieomsociety.org/index.cfm/article/view/ID/214 (June 23, 2023).
Vilar, Polona. 2010. “Designing the User Interface: Strategies for Effective Human‐Computer Interaction (5th Edition).” Journal of the American Society for Information Science and Technology 61(5): 1073–74.
Vulpe, Alexandru et al. 2021. “Enabling Security Services in Socially Assistive Robot Scenarios for Healthcare Applications.” Sensors 21(20): 6912.
Yasin, Timothius Victorio, Felix Pasila, and Resmana Lim. 2018. “A Study of Mobile Robot Control Using EEG Emotiv Epoc Sensor.” MATEC Web of Conferences 164: 01044.1-01044.11. https://www.matec-conferences.org/articles/matecconf/abs/2018/23/matecconf_icesti2018_01044/mateccon (July 26, 2021).
Zhao, Jing et al. 2014. “SSVEP-Based Hierarchical Architecture for Control of a Humanoid Robot with Mind.” In Proceedings of the 11th World Congress on Intelligent Control and Automation, 2401–6.