This artificial skin could give ‘human-like’ sensitivity to robots

Robots are getting better at seeing, hearing, and moving, but touch has always been the missing piece. At CES 2026, Ensuring Technology showcased a new kind of artificial skin that could finally give robots something close to human-like sensitivity, helping them feel the world instead of just bumping into it.

The company’s latest tactile sensing tech is designed to help robots understand pressure, texture, and contact in ways that go beyond simple touch sensors. At the center of the announcement are two products called Tacta and HexSkin, both aimed at solving a long-standing problem in robotics.

Humans rely heavily on touch to grasp objects, apply the right amount of force, and adapt instantly when something slips. Robots, on the other hand, usually operate with limited feedback. Ensuring Technology’s goal is to close that gap by recreating how human skin senses and processes touch.

Giving robots a sense of touch

Tacta is a multi-dimensional tactile sensor designed for robotic hands and fingers. Each square centimeter packs 361 sensing elements, all sampling data at 1,000Hz, which the company says delivers sensitivity on par with human touch. Despite that density, the sensor is just 4.5mm thick and combines sensing, data processing, and edge computing into a single module.
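To make those numbers concrete, here is a minimal Python sketch of what a Tacta-like data stream might look like, assuming the 361 elements per square centimeter form a 19 x 19 grid sampled at 1,000Hz. The grid layout, the contact_summary helper, and the simulated frames are illustrative assumptions, not Ensuring Technology's actual interface.

```python
import numpy as np

# Hypothetical sketch (not the vendor's API): one tactile frame modeled as a
# 19 x 19 pressure grid, since 19 * 19 = 361 sensing elements per cm^2.
GRID = 19
SAMPLE_RATE_HZ = 1_000  # frames per second, per the announcement

def contact_summary(frame: np.ndarray):
    """Reduce one 19x19 pressure frame to total force and contact centroid."""
    total = frame.sum()
    if total == 0:
        return 0.0, None
    ys, xs = np.mgrid[0:GRID, 0:GRID]
    centroid = (float((ys * frame).sum() / total),
                float((xs * frame).sum() / total))
    return float(total), centroid

# Simulate one second of frames with a contact patch drifting across the pad,
# the kind of signal a controller would watch for slip or changing grip force.
rng = np.random.default_rng(0)
for t in range(SAMPLE_RATE_HZ):
    frame = rng.random((GRID, GRID)) * 0.05        # sensor noise floor
    cy, cx = 9, 5 + t // 100                       # patch slides sideways
    frame[cy - 1:cy + 2, cx - 1:cx + 2] += 1.0     # simulated contact
    force, centroid = contact_summary(frame)

print(f"last frame: force={force:.2f}, centroid={centroid}")
```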

At CES, Ensuring demonstrated a fully covered robotic hand using Tacta, with 1,956 sensing elements spread across fingers and palm, effectively creating a complete network of tactile awareness.
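For scale, a quick back-of-the-envelope calculation (assuming 16-bit readings per element, a detail the company has not specified) hints at why building data processing and edge computing into the module matters:

```python
# Rough raw data rate for the demo hand: 1,956 elements at 1,000Hz,
# with an assumed 2 bytes per pressure sample.
elements = 1_956
frames_per_second = 1_000
bytes_per_sample = 2

raw_rate = elements * frames_per_second * bytes_per_sample
print(f"{raw_rate / 1e6:.1f} MB/s of raw tactile data")  # ~3.9 MB/s
```

Even under these modest assumptions, that is a few megabytes of tactile data every second from a single hand, which is far easier to handle if it is summarized on the sensor module rather than streamed raw to a central controller.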

HexSkin takes the idea further by scaling touch across larger surfaces. Built with a hexagonal, tile-like design, HexSkin can wrap around complex curved shapes, making it suitable for humanoid robots.

CES 2026 has been packed with robots that show just how fast the field is moving, and why better touch matters. We’ve seen LG’s CLOiD home robot pitched as a household helper for chores like laundry and breakfast, alongside humanoid robots that can play tennis with impressive coordination, and Boston Dynamics’ Atlas, which showed off advanced balance and movement.

While these machines already see and move remarkably well, most still rely heavily on vision and rigid sensors. Adding a human-like sense of touch through artificial skin could be what finally makes robots feel a little more human.


