This paper addresses the multi-faceted problem of robot grasping, where multiple criteria may conflict and differ in importance. We introduce Grasp Ranking and Criteria Evaluation (GRaCE), a novel approach that employs hierarchical rule-based logic and a rank-preserving utility function to optimize grasps based on various criteria such as stability, kinematic constraints, and goal-oriented functionalities. Additionally, we propose GRaCE-OPT, a hybrid optimization strategy that combines gradient-based and gradient-free methods to effectively navigate the complex, non-convex utility function. Experimental results in both simulated and real-world scenarios show that GRaCE requires fewer samples to achieve comparable or superior performance relative to existing methods. The modular architecture of GRaCE allows for easy customization and adaptation to specific application needs.
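The rank-preserving idea can be illustrated with a small sketch: weight each criterion's success probability by 2 raised to its priority rank, so that a satisfied higher-priority criterion always outweighs all lower-priority criteria combined. The function name and weighting below are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch of a rank-preserving utility (the weighting scheme
# is an assumption for illustration, not GRaCE's exact formulation).
def rank_preserving_utility(probs, ranks):
    """probs[k]: estimated probability that a grasp satisfies criterion k.
    ranks[k]: integer priority (higher = more important).
    The weight 2**r guarantees a satisfied rank-r criterion outweighs all
    lower ranks combined, since 2**r > 2**0 + ... + 2**(r-1)."""
    return sum(p * (2 ** r) for p, r in zip(probs, ranks))

# Two candidate grasps scored on [stability, reachability, handover]:
ranks = [2, 1, 0]  # stability is the most important criterion
g1 = rank_preserving_utility([0.9, 0.2, 0.1], ranks)
g2 = rank_preserving_utility([0.2, 1.0, 1.0], ranks)
print(g1 > g2)  # the more stable grasp wins despite weaker lower-priority scores
```

The hierarchy is encoded purely in the weights, so ranking grasps reduces to sorting by a single scalar while still respecting criterion priorities.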
2023
Patent
Event-driven visual-tactile sensing and learning for robots
Tee, Chee Keong, See, Hian Hian, Lim, Brian, Soh, Soon Hong Harold, Taunyazov, Tasbolat, Sng, Weicong, Kuan, Sheng Yuan Jethro, and Ansari, Abdul Fatir
US Patent App. 18/010,656 Oct 2023
IROS
Refining 6-DoF Grasps with Context-Specific Classifiers
Taunyazov, Tasbolat, Zhang, Heng, Eala, John Patrick, Zhao, Na, and Soh, Harold
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) Oct 2023
In this work, we present GraspFlow, a refinement approach for generating context-specific grasps. We formulate the problem of grasp synthesis as a sampling problem: we seek to sample from a context-conditioned probability distribution of successful grasps. However, this target distribution is unknown. As a solution, we devise a discriminator gradient-flow method to evolve grasps obtained from a simpler distribution in a manner that mimics sampling from the desired target distribution. Unlike existing approaches, GraspFlow is modular, allowing grasps that satisfy multiple criteria to be obtained simply by incorporating the relevant discriminators. It is also simple to implement, requiring minimal code given existing auto-differentiation libraries and suitable discriminators. Experiments show that GraspFlow generates stable and executable grasps on a real-world Panda robot for a diverse range of objects. In particular, in 60 trials on 20 different household objects, the first attempted grasp was successful 94% of the time, and 100% grasp success was achieved by the second grasp. Moreover, incorporating a functional discriminator for robot-human handover improved the functional aspect of the grasp by up to 33%.
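The refinement idea, taking gradient steps that increase discriminator scores, can be sketched in a few lines. The toy quadratic "stability" discriminator and its hand-coded gradient below are assumptions for illustration; the actual method uses learned classifiers with auto-differentiation.

```python
import numpy as np

# Minimal sketch of discriminator-guided grasp refinement. The toy
# discriminator and step sizes are assumptions, not the paper's models.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def refine(grasp, discriminators, steps=100, eta=0.1):
    """Move a grasp parameter vector along the gradient of the summed
    log-probabilities of the discriminators. Each discriminator is a
    (score_fn, grad_fn) pair returning a logit and d(logit)/d(grasp)."""
    g = np.array(grasp, dtype=float)
    for _ in range(steps):
        grad = np.zeros_like(g)
        for score, dscore in discriminators:
            s = score(g)
            # d/dg log sigmoid(s) = (1 - sigmoid(s)) * ds/dg
            grad += (1.0 - sigmoid(s)) * dscore(g)
        g += eta * grad
    return g

# Toy "stability" discriminator: prefers grasps near a target pose t.
t = np.array([0.5, -0.2, 0.3])
stab = (lambda g: -np.sum((g - t) ** 2),   # logit
        lambda g: -2.0 * (g - t))          # gradient of the logit
refined = refine(np.zeros(3), [stab])
print(np.round(refined, 3))  # converges toward t
```

Modularity shows up directly: adding another criterion is just appending another (score, gradient) pair to the list of discriminators.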
2021
IROS Best Paper
Extended Tactile Perception: Vibration Sensing through Tools and Grasped Objects
Taunyazov, Tasbolat, Song, Luar Shui, Lim, Eugene, See, Hian Hian, Lee, David, Tee, Benjamin CK, and Soh, Harold
In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) Oct 2021
Humans display the remarkable ability to sense the world through tools and other held objects. For example, we are able to pinpoint impact locations on a held rod and tell apart different textures using a rigid probe. In this work, we consider how we can enable robots to have a similar capacity, i.e., to embody tools and extend perception using standard grasped objects. We propose that vibro-tactile sensing using dynamic tactile sensors on the robot fingers, along with machine learning models, enables robots to decipher contact information that is transmitted as vibrations along rigid objects. This paper reports on extensive experiments using the BioTac micro-vibration sensor and a new event-based tactile sensor, the NUSkin, capable of multi-taxel sensing at 4 kHz. We demonstrate that fine localization on a held rod is possible using our approach (with errors less than 1 cm on a 20 cm rod). Next, we show that vibro-tactile perception can lead to reasonable grasp stability prediction during object handover, and accurate food identification using a standard fork. We find that multi-taxel vibro-tactile sensing at a sufficiently high sampling rate led to the best performance across the various tasks and objects. Taken together, our results provide both evidence and guidelines for using vibro-tactile sensing to extend perception, which we believe will lead to enhanced competency with tools and better physical human-robot interaction.
2020
R:SS
Event-driven visual-tactile sensing and learning for robots
Taunyazov, Tasbolat, Sng, Weicong, See, Hian Hian, Lim, Brian, Kuan, Jethro, Ansari, Abdul Fatir, Tee, Benjamin CK, and Soh, Harold
This work contributes an event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning. Our neuromorphic fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Likewise, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using the NeuTouch and Prophesee event camera) on two robot tasks: container classification and rotational slip detection. On both tasks, we observe good accuracies relative to standard deep learning methods. We have made our visual-tactile datasets freely available to encourage research on multi-modal event-driven robot perception, which we believe is a promising approach towards intelligent power-efficient robot systems.
Sensors
An open-source 7-DOF wireless human arm motion-tracking system for use in robotics research
Shintemirov, Almas, Taunyazov, Tasbolat, Omarali, Bukeikhan, Nurbayeva, Aigerim, Kim, Anton, Bukeyev, Askhat, and Rubagotti, Matteo
To extend the choice of inertial motion-tracking systems freely available to researchers and educators, this paper presents an alternative open-source design of a wearable 7-DOF wireless human arm motion-tracking system. Unlike traditional inertial motion-capture systems, the presented system employs a hybrid combination of two inertial measurement units and one potentiometer for tracking a single arm. The sequence of three design phases described in the paper demonstrates how the general concept of a portable human arm motion-tracking system was transformed into an actual prototype, by employing a modular approach with independent wireless data transmission to a control PC for signal processing and visualization. Experimental results, together with an application case study on real-time robot-manipulator teleoperation, confirm the applicability of the developed arm motion-tracking system for facilitating robotics research. The presented arm-tracking system also has the potential to be employed in mechatronic system design education and related research activities. The system CAD design models and program code are publicly available online and can be used by robotics researchers and educators as a design platform to build their own arm-tracking solutions for research and educational purposes.
IROS
TactileSGNet: A spiking graph neural network for event-based tactile object recognition
Gu, Fuqiang, Sng, Weicong, Taunyazov, Tasbolat, and Soh, Harold
In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) Oct 2020
Tactile perception is crucial for a variety of robot tasks, including grasping and in-hand manipulation. New advances in flexible, event-driven electronic skins may soon endow robots with touch perception capabilities similar to humans. These electronic skins respond asynchronously to changes (e.g., in pressure, temperature), and can be laid out irregularly on the robot’s body or end-effector. However, these unique features may render current deep learning approaches, such as convolutional feature extractors, unsuitable for tactile learning. In this paper, we propose a novel spiking graph neural network for event-based tactile object recognition. To make use of the local connectivity of taxels, we present several methods for organizing the tactile data in a graph structure. Based on the constructed graphs, we develop a spiking graph convolutional network. The event-driven nature of spiking neural networks makes them arguably more suitable for processing event-based data. Experimental results on two tactile datasets show that the proposed method outperforms other state-of-the-art spiking methods, achieving high accuracies of approximately 90% when classifying a variety of different household objects.
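One simple way to organize taxels into a graph, offered here as an illustrative assumption rather than the paper's exact construction, is radius-based connectivity over the taxel coordinates:

```python
import numpy as np

# Sketch of building a tactile graph from an irregular taxel layout
# (radius-based connectivity; one of several possible constructions).
def radius_graph(taxel_xy, radius):
    """Connect taxels whose Euclidean distance is within `radius`.
    Returns a symmetric 0/1 adjacency matrix with no self-loops."""
    d = np.linalg.norm(taxel_xy[:, None, :] - taxel_xy[None, :, :], axis=-1)
    adj = (d <= radius) & (d > 0)
    return adj.astype(int)

# Irregular 2-D taxel layout, as on a flexible electronic skin:
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [3.0, 3.0]])
adj = radius_graph(xy, radius=1.5)
print(adj.sum(axis=1))  # the isolated taxel at (3, 3) has degree 0
```

A graph convolution can then aggregate spike events over these local neighborhoods, which is what makes irregular, non-grid layouts tractable.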
IROS
Supervised autoencoder joint learning on heterogeneous tactile sensory data: Improving material classification performance
Gao, Ruihan, Taunyazov, Tasbolat, Lin, Zhiping, and Wu, Yan
In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) Oct 2020
The sense of touch is an essential sensing modality for a robot to interact with the environment, as it provides rich and multimodal sensory information upon contact. It enriches the perceptual understanding of the environment and closes the loop for action generation. One fundamental area of perception in which touch dominates other sensing modalities is understanding the materials it interacts with, for example, glass versus plastic. However, unlike the senses of vision and audition, which have standardized data formats, the format of tactile data is largely dictated by the sensor manufacturer, which makes it difficult to perform large-scale learning on data collected from heterogeneous sensors and limits the usefulness of publicly available tactile datasets. This paper investigates the joint learnability of data collected from two tactile sensors performing a touch sequence on some common materials. We propose a supervised recurrent autoencoder framework that performs a joint material classification task to improve training effectiveness. The framework is implemented and tested on two sets of tactile data collected in sliding motion on 20 material textures, using the iCub RoboSkin tactile sensors and the SynTouch BioTac sensor respectively. Our results show that learning efficiency and accuracy improve for both datasets with joint learning, compared to independent training on each dataset. This suggests the usefulness of sharing large-scale open tactile datasets collected with different sensors.
IROS
Fast texture classification using tactile neural coding and spiking neural network
Taunyazov, Tasbolat, Chua, Yansong, Gao, Ruihan, Soh, Harold, and Wu, Yan
In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) Oct 2020
Touch is arguably the most important sensing modality in physical interactions. However, tactile sensing has been largely under-explored in robotics applications, owing to the complexity of making perceptual inferences, until recent advances in machine learning, and deep learning in particular. Touch perception is strongly influenced by both its temporal dimension, similar to audition, and its spatial dimension, similar to vision. While spatial cues can be learned episodically, temporal cues compete against the system’s response/reaction time to provide accurate inferences. In this paper, we propose a fast tactile-based texture classification framework that uses a spiking neural network to learn from a neural coding of conventional tactile sensor readings. The framework is implemented and tested on two independent tactile datasets collected in sliding motion on 20 material textures. Our results show that the framework is able to make much more accurate inferences ahead of time than the state-of-the-art learning approaches.
2019
RA-L presented @ R:SS
Semi-autonomous robot teleoperation with obstacle avoidance via model predictive control
Rubagotti, Matteo, Taunyazov, Tasbolat, Omarali, Bukeikhan, and Shintemirov, Almas
This paper proposes a model predictive control approach for semi-autonomous teleoperation of robot manipulators: the focus is on avoiding obstacles with the whole robot frame, while exploiting predictions of the operator’s motion. The hand pose of the human operator provides the reference for the end effector, and the robot motion is continuously replanned in real time, satisfying several constraints. An experimental case study is described regarding the design and testing of the proposed framework on a UR5 manipulator: the experimental results confirm the suitability of the proposed method for semi-autonomous teleoperation, both in terms of performance (tracking capability and constraint satisfaction) and computational complexity (the control law is calculated well within the sampling interval).
ICRA
Towards effective tactile identification of textures using a hybrid touch approach
Taunyazov, Tasbolat, Koh, Hui Fang, Wu, Yan, Cai, Caixia, and Soh, Harold
In International Conference on Robotics and Automation (ICRA) Oct 2019
The sense of touch is arguably the first human sense to develop. Empowering robots with the sense of touch may augment their understanding of interacted objects and the environment beyond standard sensory modalities (e.g., vision). This paper investigates the effect of hybridizing touch and sliding movements for tactile-based texture classification. We develop three machine-learning methods within a framework to discriminate between surface textures; the first two methods use hand-engineered features, whilst the third leverages convolutional and recurrent neural network layers to learn feature representations from raw data. To compare these methods, we constructed a dataset comprising tactile data from 23 textures gathered using the iCub platform under a loosely constrained setup, i.e., with nonlinear motion. In line with findings from neuroscience, our experiments show that a good initial estimate can be obtained via touch data, which can be further refined via sliding; combining both touch and sliding data results in 98% classification accuracy over unseen test data.
2017
IEEE ASME ToM
Constrained orientation control of a spherical parallel manipulator via online convex optimization
This paper introduces a new framework for the closed-loop orientation control of spherical parallel manipulators (SPMs) based on the online solution of a convex optimization problem. The aim of solving a constrained optimization problem is to define a reference position for the SPM that remains as close as possible to the ideal reference (i.e., the one for which the top mobile platform has the desired orientation), at the same time keeping the SPM within the set of configurations in which collisions between links and singular configurations are avoided (the so-called feasible workspace). The proposed approach relies on a recently introduced method for obtaining unique inverse kinematics for SPMs, and on a newly proposed method for generating an approximation of the feasible workspace suitable for fast online optimization. The proposed control scheme is experimentally tested on an Agile Wrist SPM prototype, confirming the performance expected from the theoretical formulation.
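The "closest feasible reference" idea can be sketched with a box approximation of the feasible workspace, in which case the constrained minimizer is a simple clip. The paper solves a more general convex problem online; the bounds and angle parametrization below are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch: project an ideal orientation reference onto a box
# approximation of the feasible workspace. For a box, the minimizer of
# ||x - ideal||^2 is elementwise clipping; the bounds are assumptions.
def project_reference(ideal, lower, upper):
    """Closest point to `ideal` (Euler angles, rad) inside [lower, upper]."""
    return np.clip(ideal, lower, upper)

lower = np.array([-0.5, -0.5, -np.pi])
upper = np.array([0.5, 0.5, np.pi])
ref = project_reference(np.array([0.8, 0.1, 2.0]), lower, upper)
print(ref)  # the first angle saturates at the workspace bound
```

With a general convex approximation of the feasible workspace, the same projection becomes a small quadratic program solved at every control step.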
HRI late-breaking report
Real-time predictive control of a UR5 robotic arm through human upper limb motion tracking
Omarali, Bukeikhan, Taunyazov, Tasbolat, Bukeyev, Askhat, and Shintemirov, Almas
In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction Oct 2017
This paper reports the authors’ initial results on developing a real-time teleoperation system for a Universal Robots robotic arm through human motion capture, with a visualization utility built on the Blender Game Engine open-source platform. A linear explicit model predictive robot controller (EMPC) is implemented for online generation of optimal robot trajectories matching the operator’s wrist position and orientation, whilst adhering to the robot’s constraints. The EMPC proved to be superior to open-loop and naive PID controllers in terms of accuracy and safety.
2016
BioRob
A novel low-cost 4-DOF wireless human arm motion tracker
A human arm can be described as a five-degrees-of-freedom (DOF) serial manipulator. The fifth degree, rotation around the forearm axis, contributes only to the wrist orientation. Hence, if it is ignored, the elbow and wrist joint positions can be tracked using the upper arm orientation and the elbow joint angle. The paper presents a novel low-cost design of a 4-DOF human arm wearable tracker system for wireless dynamic tracking of upper limb position and orientation. The proposed design utilizes a single inertial measurement unit, coupled with an Unscented Kalman filter for estimating the upper arm orientation quaternion, and a potentiometer sensor for estimating the elbow joint angle. The presented arm tracker prototype implements wireless communication with the control PC for sensor data transmission and real-time visualization using the Blender open-source 3D computer graphics software, and was verified against an Xsens MVN motion tracking system.
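The tracking principle, an orientation quaternion for the upper arm plus a single elbow angle, determines the elbow and wrist positions through simple forward kinematics. The link lengths and frame conventions in this sketch are assumptions for illustration, not the system's calibrated values.

```python
import numpy as np

# Sketch of recovering elbow and wrist positions from an upper-arm
# orientation quaternion and an elbow angle (link lengths and frame
# conventions are illustrative assumptions).
def quat_rotate(q, v):
    """Rotate vector v by a unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def arm_positions(q_upper, elbow_angle, l_upper=0.3, l_fore=0.25):
    """Elbow = shoulder + R(q)[0, 0, -l_upper]; the forearm bends about
    the local x-axis by `elbow_angle` before rotating into the world."""
    elbow = quat_rotate(q_upper, np.array([0.0, 0.0, -l_upper]))
    c, s = np.cos(elbow_angle), np.sin(elbow_angle)
    fore_local = np.array([0.0, l_fore * s, -l_fore * c])  # bend about x
    wrist = elbow + quat_rotate(q_upper, fore_local)
    return elbow, wrist

# Identity upper-arm orientation, elbow bent 90 degrees:
elbow, wrist = arm_positions(np.array([1.0, 0.0, 0.0, 0.0]), np.pi / 2)
print(np.round(elbow, 3), np.round(wrist, 3))
```

In the actual system the quaternion would come from the Unscented Kalman filter over IMU data and the elbow angle from the potentiometer; the kinematic step above is all that remains to visualize the arm.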
2015
SSI
System integration of a solar sensor and a spherical parallel manipulator for a 3-axis solar tracker platform design
Omarali, Bukeikhan, Taunyazov, Tasbolat, Nyetkaliyev, Aibek, and Shintemirov, Almas
In IEEE/SICE International Symposium on System Integration (SII) Oct 2015
This paper presents the authors’ ongoing work on designing a novel 3-axis solar tracker platform utilizing a 3-DOF spherical parallel manipulator (SPM) with revolute joints and a solar sensor. The selected solar sensor and the SPM configuration are described in detail. A novel approach is proposed for estimating the SPM platform orientation from solar sensor measurements, employing trigonometric identities and the quaternion rotation representation. The proposed approach is experimentally verified using a 3D-printed SPM prototype equipped with a solar sensor and an orientation sensor. It is envisioned that the proposed concept for a novel 3-axis solar tracker platform can be further applied to the design of novel mobile solar tracking systems.