November 18, 2020
Telling humans apart and following them as they move could be two highly valuable skills for service robots. Combined, these capabilities would allow a robot to follow specific people while interacting with them or offering assistance.
Researchers at Monash University, JDQ Systems and University of British Columbia recently developed a service robot designed to assist residents at elderly care homes or patients at other healthcare facilities. In a paper pre-published on arXiv, they presented a computational technique that allows their robot to identify and track people in its vicinity, following specific users as they are assisting them.
“Our team has been developing a socially assistive robot platform, Aether, for providing daily routine assistance to staff and residents at elderly care and assisted living facilities,” Wesley P. Chan, one of the researchers who carried out the study, told TechXplore. “Working with our partnering facilities, we identified a few important skills for our robot, which include escorting a resident to the dining hall for meals, following a staff member to the next location where it is needed or playing a game of tag with residents to encourage exercise. All these skills require our robot to be able to identify and locate the people in its surroundings, as well as follow them.”
Scientists have developed face recognition tools, which allow robots to identify people but not follow their movements, and anonymous person tracking techniques, which allow robots to track a person’s movements without knowing who they are. In order to follow specific people, however, a robot would need to simultaneously determine who they are and track their movements.
To achieve this, Chan and his colleagues combined state-of-the-art face recognition with anonymous person tracking tools. They fused these techniques using what is known as a sequential nearest neighbor algorithm with added thresholding selection, a computational method that can be used to tackle a variety of classification tasks.
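The paper itself does not include code, but the core idea of the fusion step can be sketched as follows: face recognition yields named detections, anonymous tracking yields numbered tracks, and each identity is sequentially assigned to the nearest unmatched track, with a distance threshold rejecting implausible matches. All function names, coordinates, and the threshold value below are illustrative assumptions, not the authors' implementation.

```python
import math

def fuse_identities(face_detections, tracks, max_dist=50.0):
    """Hypothetical sketch of sequential nearest-neighbor fusion with thresholding.

    face_detections: list of (name, (x, y)) from a face recognizer.
    tracks: dict mapping anonymous track id -> (x, y) position.
    Returns a dict mapping track id -> identity for matches within max_dist.
    """
    assignments = {}
    unmatched = set(tracks)
    # Sequentially assign each recognized face to its nearest unmatched track.
    for name, (fx, fy) in face_detections:
        best_id, best_d = None, max_dist
        for tid in unmatched:
            tx, ty = tracks[tid]
            d = math.hypot(fx - tx, fy - ty)
            if d < best_d:
                best_id, best_d = tid, d
        # Thresholding: faces farther than max_dist from every track stay unassigned,
        # which is what lets the method tolerate occlusions and missed detections.
        if best_id is not None:
            assignments[best_id] = name
            unmatched.discard(best_id)
    return assignments

# Toy example: two recognized faces, three anonymous tracks.
faces = [("Alice", (100.0, 120.0)), ("Bob", (400.0, 110.0))]
tracks = {1: (105.0, 118.0), 2: (600.0, 300.0), 3: (395.0, 115.0)}
print(fuse_identities(faces, tracks))  # {1: 'Alice', 3: 'Bob'}
```

Once a track is labeled with an identity, the robot can keep following that track even during frames where the face is not visible, which is how such a fusion copes with temporary target loss.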
“Our algorithm is capable of coping with occlusions, extremely poor or variable lighting and temporary lost-target recovery, all of which present great challenges to robots and are often encountered in real-world situations,” Chan said. “The algorithm allows our robot to operate reliably in the real world, for instance, when deployed in our partner care facilities.”
Chan and his colleagues tested their user-following method in a series of experiments in which the Aether robot had to identify, track and follow users within five different scenarios. They monitored the position of the robot and that of people in its surroundings using a motion capture system called Vicon. The initial tests conducted by the researchers yielded highly promising results, with the new technique outperforming the existing face recognition and user-tracking tools it was compared to.
In the future, the tool devised by this team of researchers could be used to enhance the capabilities of both existing and newly developed service robots, allowing them to better assist people in their surroundings. Meanwhile, Chan and his colleagues plan to continue working toward the creation of a robot that can best serve both staff and residents in assisted living or healthcare facilities.
“Continuing our collaboration with our partners, we are now working on further skills for Aether, including navigating safely around people, understanding social gestures people use, performing routine room inspections, carrying out conversations and providing entertainment,” Chan said. “Our greater mission is to enable robots to become socially capable and acceptable partners in society, so that they can help to improve people’s productivity and quality of life.”
Chan et al., Autonomous person-specific following robot. arXiv:2010.08017 [cs.RO]. arxiv.org/abs/2010.08017
© 2020 Science X Network
A robot that can track specific people and follow them around (2020, November 18)