The sense of vision, long dominant in robotics, is now reaching its limits. To make robots safer, more efficient and better accepted, researchers and manufacturers are turning to the sense of touch.
Imagine having to strike a match while wearing boxing gloves, relying on vision alone. This is the permanent challenge faced by today's robots: their artificial intelligence is constrained by a body with little sensitivity. While computer vision has made progress, experts agree on one point: it hits a glass ceiling unless it is coupled with other senses.
To become reliable and autonomous, robots must learn to “feel”. Nicolas Lauzier, product manager at automation system designer Robotiq, observes this shift in the field. Just a year ago, the prevailing belief was that AI coupled with vision would suffice. Today, he concedes that it is “difficult to carry out tactile tasks just with vision”. The market reflects this: touch sensors, a segment estimated at $18.6 billion in 2025, are expected to surge to $66 billion by 2035, according to Future Market Insights.
Why sight is no longer enough
Touch is not just for manipulation; it is also essential for safety. For Abderrahmane Kheddar, research director at the CNRS, the challenge goes beyond the hand: “We must equip the whole body”. Touch matters for operating in confined environments, where a robot must be able to feel what it is leaning against in order to stabilize itself. His team at the Montpellier Laboratory of Computer Science, Robotics and Microelectronics (LIRMM) has developed an electronic skin capable of dissociating forces in three dimensions. This technology, built from a flexible magnetic film and sensors, allows a robot to handle an egg without breaking it.
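The idea of dissociating forces in three dimensions can be pictured as splitting one sensor reading into a normal (pressing) component and two shear (sliding) components. The sketch below is purely illustrative and is not LIRMM's actual implementation; the gains and the linear mapping are assumptions for the example.

```python
# Illustrative sketch only: mapping a hypothetical 3-axis magnetic flux
# reading (bx, by, bz) from a flexible skin patch to force components,
# via an assumed linear calibration. Gains are made up for the example.

NORMAL_GAIN = 2.0   # N per unit of flux along z (assumed calibration)
SHEAR_GAIN = 1.5    # N per unit of flux in the x-y plane (assumed)

def decompose_forces(bx: float, by: float, bz: float) -> dict:
    """Split one skin reading into shear (sliding) and normal (pressing) forces."""
    return {
        "shear_x": SHEAR_GAIN * bx,   # sideways force along x
        "shear_y": SHEAR_GAIN * by,   # sideways force along y
        "normal": NORMAL_GAIN * bz,   # pressing force into the skin
    }

# A light press combined with a slight sideways slide:
reading = decompose_forces(bx=0.1, by=0.0, bz=0.5)
```

Distinguishing the normal component from shear is what lets a controller cap its squeezing force on something fragile, like the egg mentioned above, while still detecting that the object is starting to slip.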
This sensitivity is also a condition of social acceptance. For Firas Abi-Farraj, chief technology officer (CTO) of Enchanted Tools, a robot must be able to tell the difference between an object and a human. “It’s very important for social acceptance. There is also the question of feeling a human hand,” he emphasizes. The fact that a Miroki robot can feel it is being “scratched on the head” changes the interaction.
The double challenge of robotic touch
Recreating this sense remains a challenge. Robotiq relies on “taxels”, tactile pixels “arranged in a grid on the fingers of the gripper to measure pressure”. But human touch is richer. Firas Abi-Farraj reminds us that we have two types of sensation: cutaneous contact (skin) and kinesthetic perception (muscles and joints). It is the latter that lets us know how much force we are exerting, even with gloves on. “If I push on something, I feel how much force I am putting on that object,” he explains.
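To make the taxel idea concrete, here is a minimal sketch of how a grid of pressure readings on a gripper finger might be aggregated: summing the taxels gives the total load, and a weighted average locates the center of pressure. The function name and grid layout are illustrative assumptions, not Robotiq's actual API.

```python
# Minimal sketch (assumed interface, not a real product API): aggregating
# a grid of "taxel" pressure readings into a total load and a center of
# pressure, so the controller knows both how hard and where it is touching.

def contact_summary(grid: list[list[float]]) -> tuple[float, tuple[float, float]]:
    """Return (total pressure, (row, col) center of pressure) for a taxel grid."""
    total = sum(p for row in grid for p in row)
    if total == 0:
        return 0.0, (0.0, 0.0)  # no contact detected
    # Pressure-weighted average of taxel coordinates:
    r = sum(i * p for i, row in enumerate(grid) for p in row)
    c = sum(j * p for row in grid for j, p in enumerate(row))
    return total, (r / total, c / total)

# One taxel pressed in the bottom-right corner of a 2x2 grid:
load, center = contact_summary([[0.0, 0.0], [0.0, 2.0]])
```

This mirrors the cutaneous side of touch described above; the kinesthetic side (how much force the whole arm is exerting) would instead come from joint torque sensing.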
In practice, touch is also a hardware constraint: these sensors remain expensive, often more than 1,000 euros per unit, and poorly standardized. “If we look at robot hands, each robot has a different one,” notes Firas Abi-Farraj, a sign that the field is not yet industrially mature.
The factory, then the living room
Enchanted Tools has therefore chosen a pragmatic approach for its Miroki and Miroka robots: they grasp objects by dedicated handles to guarantee a 99.9% success rate. “In real life, 80% success means that the robot fails one time in five. No one will accept that at home,” says the CTO. The goal is a general-public version around 2030, capable of finer manipulation.
To handle varied objects without a dedicated handle, the robot must learn, and this learning phase takes place mainly in virtual environments. With Sim2Real approaches, robots practice millions of times in simulators before transferring that knowledge to the physical machine. Another, more direct route is demonstration, where a human guides the robot’s hand. Abderrahmane Kheddar sees a near future: “Tomorrow, will anyone be able to teach their robot household chores, a bit like guiding a child by the hand?”
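The core trick behind Sim2Real transfer can be illustrated with a toy example: by randomizing the simulated physics (here, just friction) across many practice episodes, the robot learns a behavior that holds up on a real object whose properties are unknown. Everything below, from the weight to the friction range, is an assumption made for the illustration.

```python
# Toy illustration of the Sim2Real idea (didactic only, not a real
# training pipeline): practice a grasp under randomized simulated
# friction, so the chosen grip force transfers to real objects whose
# friction is unknown but falls inside the randomized range.

import random

OBJECT_WEIGHT = 1.0  # N, assumed weight of the object to hold

def min_grip_force(friction: float) -> float:
    """Grip force needed so friction holds the object: F >= weight / mu."""
    return OBJECT_WEIGHT / friction

def train_in_sim(episodes: int = 1000, seed: int = 0) -> float:
    """Return a grip force that held the object in every randomized episode."""
    rng = random.Random(seed)
    worst_case = 0.0
    for _ in range(episodes):
        mu = rng.uniform(0.3, 0.9)  # friction randomized each episode
        worst_case = max(worst_case, min_grip_force(mu))
    return worst_case

grip = train_in_sim()
# grip covers the worst friction seen in simulation, so it should also
# hold a real object whose friction lies anywhere in the sampled range
```

Real Sim2Real pipelines randomize far more (mass, geometry, sensor noise, lighting) and learn full control policies rather than a single number, but the principle is the same: robustness in simulation buys transfer to reality.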
The challenge is no longer to make machines more “human”, but simply to allow them to handle an object without breaking it, or to brush against an operator without danger. Only on this condition will robots finally be able to leave confined environments and move more widely into our factories, then our living rooms.




