While tactile sensing is widely accepted as an important sensing modality, its use pales in comparison to other modalities like vision and proprioception. AnySkin addresses the critical challenges of versatility, replaceability, and data reusability that have so far impeded the development of an effective tactile sensing solution.
Building on the simple design of ReSkin and decoupling the sensing electronics from the sensing interface, AnySkin makes integration as straightforward as putting on a phone case and connecting a charger. Furthermore, AnySkin is the first tactile sensor to demonstrate cross-instance generalizability of learned manipulation policies.
This work makes three key contributions: first, we introduce a streamlined fabrication process and a design tool for creating an adhesive-free, durable, and easily replaceable magnetic tactile sensor; second, we characterize slip detection and policy learning with an AnySkin sensor; and finally, we demonstrate the generalizability of models trained on one instance of AnySkin to new instances, and compare it with popular existing tactile solutions like DIGIT and ReSkin.
We present AnySkin, a tactile skin sensor for robotic touch that is easy to assemble, is compatible with a range of robot end-effectors, and generalizes across skin instances.