Facebook researchers are building better skin and fingers for softer, more sensitive bots


According to Facebook AI Research, the next generation of robots should be much better at feeling – not emotions, of course, but the sense of touch. To advance this relatively new field of AI and robotics research, the company and its partners have built a new type of electronic skin and fingertip that is inexpensive, durable, and provides a basic, reliable sense of touch to our mechanical friends.

The question of why exactly Facebook is getting into robot skin is obvious enough that AI chief Yann LeCun addressed it preemptively during a media call showcasing the new projects.

Amusingly, he recalled, it started with Mark Zuckerberg noting that the company seemed to have no good reason to look into robotics. LeCun appears to have taken this as a challenge and started studying the question, and a clear answer emerged over time: if Facebook was going to deliver smart agents – and what self-respecting tech company isn’t? – then those agents must understand the world beyond the output of a camera or a microphone.

The sense of touch isn’t much good for telling whether something is a picture of a cat or a dog, or who is talking in a room, but if robots or AIs are going to interact with the real world, they need more than that.

“What we’ve become good at is understanding pixels and appearances,” said FAIR researcher Roberto Calandra. “But understanding the world goes beyond that. We have to move toward a physical understanding of objects to ground this in.”

While cameras and microphones are inexpensive, and there are plenty of tools for effectively processing their data, the same can’t be said for touch. Sophisticated pressure sensors simply aren’t popular consumer products, so the useful ones tend to stay in laboratories and industrial settings.

The DIGIT project is quite old, its principles going back to 2009; we wrote about the MIT project behind it, GelSight, in 2014 and again in 2020 – the company has since grown and is now the manufacturing partner for this well-documented approach to touch. Basically, magnetic particles are suspended in a soft gel surface, and a magnetometer underneath senses the movement of those particles, translating it into a precise map of the forces causing that movement.
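The principle described above – reading particle displacement through the changing magnetic field and converting it into a force estimate – can be illustrated with a rough sketch. Everything here (the linear mapping, the `gain` constant) is a simplified assumption for illustration, not the actual DIGIT or ReSkin pipeline:

```python
import numpy as np

def force_map(readings, baseline, gain=0.8):
    """Turn raw 3-axis magnetometer readings into a crude pressure
    estimate per sensor site.

    readings, baseline: arrays of shape (n_sites, 3), the magnetic
    field vector measured at each magnetometer.
    gain: hypothetical calibration constant mapping field change to
    force; a real sensor would be calibrated per unit.
    """
    delta = readings - baseline                 # field change from particle movement
    magnitude = np.linalg.norm(delta, axis=1)   # size of the field shift per site
    return gain * magnitude                     # assumed-linear force estimate

# Example: three sensor sites, the middle one pressed
baseline = np.zeros((3, 3))
pressed = np.array([[0.0,  0.0, 0.0],
                    [0.1, -0.2, 0.5],   # particles displaced above this site
                    [0.0,  0.0, 0.0]])
print(force_map(pressed, baseline))
```

In a real device the relationship between field change and force is nonlinear and learned from calibration data; the linear `gain` stands in for that model here.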

This particular implementation (you can see the fingertips themselves in the image at the top) is refined and quite responsive, as you can see from the detailed pressure maps it creates when various items are pressed against it:

Objects shown above images of the signals produced by the robotic fingertip.

Image credits: Facebook

ReSkin applies the same idea over a larger area. One advantage of the GelSight-type system is that the hard component – the chip with the magnetometer, logic, and so on – is completely separate from the flexible component, which is just a soft pad impregnated with magnetic particles. That means the surface can get dirty or scratched and be easily replaced, while the sensitive part hides safely underneath.

In the case of ReSkin, that means you can lay out a bunch of chips in any shape, put a magnetic elastomer sheet on top, then integrate the signals and get tactile feedback from the whole surface. Well… it’s not quite that simple, since you have to calibrate it and so on, but it’s far simpler than other artificial skin systems that could work at scales larger than a few square inches.
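The calibration step mentioned above can be sketched in the same spirit: record the field at each site with nothing touching the skin, then treat a sufficiently large deviation from that baseline as contact. The array shapes and the threshold value are hypothetical, chosen only to make the idea concrete:

```python
import numpy as np

def calibrate(samples):
    """Estimate each sensor site's resting magnetic field from a batch
    of no-contact samples, shape (n_samples, n_sites, 3)."""
    return samples.mean(axis=0)

def contact_mask(reading, baseline, threshold=0.05):
    """Flag sites whose field has shifted enough to suggest contact.
    The threshold is a made-up value; a real system would derive it
    from the noise floor measured during calibration."""
    shift = np.linalg.norm(reading - baseline, axis=1)
    return shift > threshold

# Example: four sites, calibrated on quiet samples, then one site touched
quiet = np.random.normal(0.0, 0.001, size=(50, 4, 3))
baseline = calibrate(quiet)
reading = baseline.copy()
reading[2] += np.array([0.0, 0.3, 0.1])  # field shift at site 2
print(contact_mask(reading, baseline))
```

Integrating many such sites into one force map is where the real calibration work lies; this sketch only shows the baseline-and-threshold skeleton.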

You can even make little dog shoes out of them, because why not?

Animated image of a dog with pressure sensing pads on his feet and readings from them.


With a pressure-sensitive surface like this, robots and other devices can more easily detect the presence of objects and obstacles without relying on, say, extra friction at a joint indicating a force in that direction. This could make assistive robots much gentler and more responsive to touch – not that there are many assistive robots out there to begin with. But part of the reason is that they can’t be trusted not to run into things or people, because they don’t have a good sense of touch!

Facebook’s job here isn’t to come up with new ideas, but to make an effective approach more accessible and affordable. The software framework will be made public, and the devices can be purchased relatively cheaply, making it easier for other researchers to enter the field.
