GenForce

Training Tactile Sensors to
Learn Force Sensing from Each Other

Zhuo Chen1*, Ni Ou1, Xuyang Zhang1, Zhiyuan Wu1, Yongqiang Zhao1, Yupeng Wang1,
Nathan Lepora2, Lorenzo Jamone3, Jiankang Deng4*, Shan Luo1*

  • 1 King’s College London, London, United Kingdom
  • 2 University of Bristol, Bristol, United Kingdom
  • 3 University College London, London, United Kingdom
  • 4 Imperial College London, London, United Kingdom
  • * Corresponding Authors

Humans achieve stable and dexterous object manipulation by coordinating grasp forces across multiple fingers and palms, facilitated by a unified tactile memory system in the somatosensory cortex. This system encodes and stores tactile experiences across skin regions, enabling the flexible reuse and transfer of touch information. Inspired by this biological capability, we present GenForce, a framework that enables transferable force sensing across tactile sensors in robotic hands. GenForce unifies tactile signals into shared marker representations, analogous to cortical sensory encoding, allowing force prediction models trained on one sensor to be transferred to others without exhaustive force data collection. We demonstrate that GenForce generalizes across both homogeneous sensors with varying configurations and heterogeneous sensors with distinct sensing modalities and material properties. The transferred force sensing also achieves high performance in robot force control, including daily object grasping, slip detection, and slip avoidance. Our results highlight a scalable paradigm for robotic tactile learning, offering new pathways toward adaptable, tactile-memory-driven manipulation in unstructured environments.

Background

Robots that grasp objects using tactile sensors and force control mimic how humans manipulate objects with their sensory receptors. However, these bio-inspired tactile sensors cannot share force data with one another because they differ in sensing principles, structural designs, and material properties. As a result, current practice trains a separate force prediction model for each sensor through a repetitive and costly collection of force labels.

Human Tactile Memory

In humans, the tactile memory system enables the storage and retrieval of experienced tactile information, such as haptic stimuli, across skin regions of the hands. Mechanoreceptors in the skin detect deformation, which is translated into a unified sensory encoding and transmitted to the somatosensory cortex via peripheral nerves for storage and processing. This human ability to adapt, unify, and transfer tactile sensations offers valuable inspiration for developing transferable tactile sensing in robots.

Bioinspiration

Overview of the GenForce model. Tactile sensors produce diverse tactile signals under the same deformation owing to differences in sensing principles, structural designs, and material properties. GenForce unifies these signals into a shared marker representation, enables marker-to-marker translation across sensors, and achieves high-accuracy force prediction on uncalibrated sensors using data transferred from calibrated sensors.
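As a rough illustration of the first step, the sketch below converts a raw image from a marker-based optical tactile sensor (e.g., a GelSight or TacTip) into a binary marker map. This is a minimal OpenCV sketch written for this page, not the paper's implementation; all thresholds and kernel sizes are illustrative, and non-optical sensors such as uSkin would instead have their taxel readings rasterized into a comparable marker grid.

import cv2
import numpy as np

def to_marker_map(raw_image: np.ndarray) -> np.ndarray:
    """Return a binary image containing only the marker blobs."""
    gray = cv2.cvtColor(raw_image, cv2.COLOR_BGR2GRAY)
    # Adaptive threshold isolates dark markers from the gel background.
    markers = cv2.adaptiveThreshold(
        gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
        cv2.THRESH_BINARY_INV, blockSize=21, C=5)
    # A small morphological opening removes speckle noise so that
    # only marker-sized blobs remain in the unified representation.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    markers = cv2.morphologyEx(markers, cv2.MORPH_OPEN, kernel)
    return markers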

Architecture of GenForce

The marker-to-marker (M2M) translation model takes deformed images from a calibrated sensor as input, with reference images from an uncalibrated sensor as conditions, and generates deformed images that mimic how the same deformation would appear on the uncalibrated sensor. The spatiotemporal force prediction model then takes sequential contact images as input and outputs three-axis forces, with a spatiotemporal module enhancing prediction accuracy.
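To make the two components concrete, below are minimal PyTorch sketches written for this page. They are assumptions about plausible architectures, not the released implementation: the class names (M2MGenerator, SpatioTemporalForceNet), layer sizes, and hyperparameters are all invented for illustration. The first sketch conditions an encoder-decoder generator on the target sensor's reference image by channel-wise concatenation, so the decoder sees both the source deformation and the target marker layout.

import torch
import torch.nn as nn

class M2MGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        # Source deformed image and target reference image are concatenated
        # channel-wise (2 input channels); the decoder upsamples back to a
        # single-channel deformed image in the target sensor's marker layout.
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1), nn.Sigmoid())

    def forward(self, src_deformed, tgt_reference):
        # Both inputs: (batch, 1, H, W) binary marker images.
        return self.net(torch.cat([src_deformed, tgt_reference], dim=1))

The second sketch pairs a per-frame CNN encoder with a recurrent module, one simple way to realize a spatiotemporal force predictor over a contact sequence:

class SpatioTemporalForceNet(nn.Module):
    def __init__(self, hidden_dim: int = 128):
        super().__init__()
        # Spatial encoder: maps each marker image to a feature vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 4 * 4, hidden_dim))
        # Temporal module: aggregates features across the contact sequence.
        self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 3)  # (Fx, Fy, Fz)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 1, H, W) sequence of marker images.
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).view(b, t, -1)
        _, last = self.gru(feats)
        return self.head(last[-1])  # three-axis force for the sequence

For example, SpatioTemporalForceNet()(torch.randn(2, 8, 1, 64, 64)) returns a (2, 3) tensor of per-sequence three-axis forces.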

Marker-to-marker Translation Performance

Marker-to-marker translation on simulated data, visualized with t-SNE.
Marker-to-marker translation across heterogeneous sensors, shown with example images.

Force Prediction Performance

Homogeneous translation performance compared with an ATI Nano17 F/T sensor, before and after using GenForce.
Heterogeneous translation performance compared with an ATI Nano17 F/T sensor, before and after using GenForce.

Applications

Force prediction during dynamic contact events, compared with an ATI Nano17 F/T sensor.
Daily object grasping with transferable force sensing and control using GelSight (A-II) and uSkin (3-axis) sensors. All force models used on each sensor are transferred from other sensors.
Force sensing in robot slip detection and avoidance. (i) Robot arm equipped with TacTip (palm) and uSkin (3-axis) sensors. (ii) Sequence of the robot grasping an object with force control, slip detection, and avoidance, with real-time measurement of forces, slip-detection status, gripper width, and robot status during the grasping task. All force models used on each sensor are transferred from other sensors.
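For intuition, here is a hypothetical sketch of such a grasp loop built on the force model above. The sensor.read_sequence and gripper.close_by interfaces, the friction coefficient, and the thresholds are all invented for illustration; they do not reflect the paper's actual controller.

import time

MU = 0.5            # assumed gel-object friction coefficient (illustrative)
MIN_NORMAL = 0.2    # newtons; below this, contact is too light to judge slip

def control_loop(sensor, gripper, force_net, steps=500):
    for _ in range(steps):
        frames = sensor.read_sequence()          # (1, T, 1, H, W) marker images
        fx, fy, fz = force_net(frames)[0].tolist()
        tangential = (fx**2 + fy**2) ** 0.5
        # Coulomb-style slip check: tangential force nearing the friction cone.
        slipping = fz > MIN_NORMAL and tangential > MU * fz
        if slipping:
            gripper.close_by(0.5)                # tighten grip by 0.5 mm
        time.sleep(0.02)                         # ~50 Hz control rate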

Supplementary Videos

Marker-to-marker translation
Force Prediction Performance
Dynamic Force Test
Daily object grasping (1)
Daily object grasping (2)
Slip Detection & Avoidance (1)
Slip Detection & Avoidance (2)

Citation

If you find our work helpful, please consider citing it:

@article{chen2025general,
  title={General Force Sensation for Tactile Robot},
  author={Chen, Zhuo and Ou, Ni and Zhang, Xuyang and Wu, Zhiyuan
          and Zhao, Yongqiang and Wang, Yupeng and Lepora, Nathan
          and Jamone, Lorenzo and Deng, Jiankang and Luo, Shan},
  journal={arXiv preprint arXiv:2503.01058},
  year={2025}
}