17 November 2021 – Assistant Professor Harold Soh and his collaborators have won the Best Paper Award at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2021, held in October. IROS is a flagship academic conference in robotics and one of the largest international robotics events.
Dr Soh and his collaborators – Computer Science PhD students Tasbolat Taunyazov and Eugene Lim, Computer Science Master’s student Shui Song Luar, SGInnovate research engineer Hian Hian See, research assistant David Lee from NUS Engineering and the Institute for Health Technology & Innovation, and Assistant Professor Benjamin Tee from the Department of Materials Science and Engineering at NUS Engineering – won the award for their paper, “Extended Tactile Perception: Vibration Sensing through Tools and Grasped Objects”.
In the paper, the team proposed a fast neuromorphic tactile sensor for robots, together with a machine learning framework for interpreting micro-vibro tactile signals. The team was inspired by how humans infer information about objects through the tools they are using.
“A human, holding a tool such as a fork, can tell if an object like a piece of apple comes into contact with the fork purely from touch (without vision or sound). This is possible because mechanoreceptors in our skin sense micro-vibrations that travel along the tool and our brain makes sense out of the signals. We have demonstrated that similar micro-vibro tactile sensing can be used in robot-human handover and food classification tasks with high accuracy,” explained Taunyazov on behalf of the team.
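To make the idea concrete, here is a minimal, hypothetical sketch (not the team’s actual sensor pipeline or code) of the kind of classification the quote describes: micro-vibration snippets travelling along a tool are summarised by a simple frequency feature and matched to the nearest known contact class. The class names, frequencies, and all parameters below are invented for illustration.

```python
import math
import random

# Hypothetical illustration only: distinguishing two invented tool-contact
# types from micro-vibration snippets. Real tactile pipelines use learned
# models on neuromorphic event data; this sketch uses a crude spectral proxy.

def synth_signal(freq_hz, fs=1000.0, duration=0.5, noise=0.02, seed=0):
    """Toy vibration snippet: a damped sinusoid plus Gaussian noise."""
    rnd = random.Random(seed)
    n = int(fs * duration)
    return [math.exp(-3.0 * i / fs) * math.sin(2.0 * math.pi * freq_hz * i / fs)
            + noise * rnd.gauss(0.0, 1.0)
            for i in range(n)]

def zero_crossing_rate(sig):
    """Fraction of adjacent sample pairs that change sign -- a crude
    frequency proxy that separates low- from high-frequency vibrations."""
    return sum(1 for a, b in zip(sig, sig[1:]) if a * b < 0) / (len(sig) - 1)

# Two invented contact classes with different dominant vibration frequencies.
TRAIN = {"soft_contact": 40.0, "hard_contact": 120.0}

# "Train" a nearest-centroid classifier on a few synthetic snippets per class.
centroids = {
    label: sum(zero_crossing_rate(synth_signal(f, seed=s)) for s in range(10)) / 10
    for label, f in TRAIN.items()
}

def classify(sig):
    """Assign a snippet to the class whose centroid feature is closest."""
    zcr = zero_crossing_rate(sig)
    return min(centroids, key=lambda label: abs(zcr - centroids[label]))

print(classify(synth_signal(120.0, seed=42)))  # a high-frequency snippet
```

In practice the team’s framework operates on event-based neuromorphic sensor data with learned models, but the sketch shows the basic structure: vibration snippet in, feature out, contact class predicted.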
With Extended Tactile Perception, the goal is to make robots more proficient with tools.
“This has broad applicability in various fields. For example, in medicine, a medical robot with a surgical tool can infer information about a human’s body even with no visual perception during surgery, and a robot in a household kitchen can better handle a ladle or a knife to prepare food,” Taunyazov added.
Said Dr Soh of their award: “The Awards Chair commented that our work was notable in several respects, from the design of the sensor to the thoroughness of the experiments and the insights provided. The idea of tactile perception through tools is relatively underexplored in robotics, but has the potential to change how we enable robots to use tools to help people perform tasks.”
Dr Soh also pointed out that robots are gaining more capabilities and are increasingly being deployed in real-world situations alongside humans.
“Many tasks require the use of tools. While we could custom-build attachments for robots, it would be much more natural for them to simply pick up and use the tools that we humans do. I think that’s the key promise of our work: that robots can sense the world through held tools and through this ability, better perform tasks whether at home cooking a meal, or at hospitals performing surgery,” he added.
The team comprises experts from different fields, ranging from materials science to embedded systems, and from robot manipulation to machine learning.
“This was a true exercise in teamwork. The team had to overcome numerous hurdles throughout the research. We tested many iterations of the sensors, machine learning methods, and experiment design to get things right. Tasbolat and Shui Song spent many late nights in the lab with our robot. Hian Hian and David spent countless hours fabricating and debugging the tactile sensor. Eugene wrangled with the large event datasets and machine learning methods. Everyone played a part in bringing together the various parts of this work,” recalled Dr Soh.
“NUS Computing has been tremendously supportive by providing us with a unique environment to conduct our research. We would also like to thank our funding agency, the National Robotics Programme,” he added.