This artificial skin could give ‘human-like’ sensitivity to robots

Robots are getting better at seeing, hearing, and moving, but touch has always been the missing piece. At CES 2026, Ensuring Technology showcased a new kind of artificial skin that could finally give robots something close to human-like sensitivity, helping them feel the world instead of just bumping into it.

The company’s latest tactile sensing tech is designed to help robots understand pressure, texture, and contact in ways that go beyond simple touch sensors. At the center of the announcement are two products called Tacta and HexSkin, both aimed at solving a long-standing problem in robotics.

Humans rely heavily on touch to grasp objects, apply the right amount of force, and adapt instantly when something slips. Robots, on the other hand, usually operate with limited feedback. Ensuring Technology’s goal is to close that gap by recreating how human skin senses and processes touch.

Giving robots a sense of touch

Tacta is a multi-dimensional tactile sensor designed for robotic hands and fingers. Each square centimeter packs 361 sensing elements, all sampling data at 1,000 Hz, which the company says delivers sensitivity on par with human touch. Despite that density, the sensor is just 4.5 mm thick and combines sensing, data processing, and edge computing into a single module.

At CES, Ensuring demonstrated a fully covered robotic hand using Tacta, with 1,956 sensing elements spread across fingers and palm, effectively creating a complete network of tactile awareness.
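Those figures imply a substantial raw data stream. A quick back-of-envelope sketch, using the element count and sample rate quoted above (the bytes-per-reading value is an assumption for illustration, not a published spec):

```python
# Rough throughput estimate for the demo hand described above.
# ELEMENTS and SAMPLE_RATE_HZ come from the article; BYTES_PER_READING
# is a hypothetical assumption (16-bit pressure value per element).

ELEMENTS = 1956          # sensing elements across fingers and palm
SAMPLE_RATE_HZ = 1000    # each element sampled at 1,000 Hz
BYTES_PER_READING = 2    # assumed 16-bit reading (not a published spec)

samples_per_sec = ELEMENTS * SAMPLE_RATE_HZ
bandwidth_mb_s = samples_per_sec * BYTES_PER_READING / 1e6

print(f"{samples_per_sec:,} samples/s")        # 1,956,000 samples/s
print(f"~{bandwidth_mb_s:.1f} MB/s raw data")  # ~3.9 MB/s
```

Roughly two million readings per second from a single hand helps explain why the module bundles on-board processing and edge computing rather than streaming everything to a host.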

HexSkin takes the idea further by scaling touch across larger surfaces. Built with a hexagonal, tile-like design, HexSkin can wrap around complex curved shapes, making it suitable for humanoid robots.

CES 2026 has been packed with robots that show just how fast the field is moving, and why better touch matters. We’ve seen LG’s CLOiD home robot pitched as a household helper for chores like laundry and breakfast, alongside humanoid robots that can play tennis with impressive coordination and Boston Dynamics’ Atlas, which displayed advanced balance and movement this time.

While these machines already see and move remarkably well, most still rely heavily on vision and rigid sensors. Adding a human-like touch through artificial skin could be what finally makes robots feel a little more human.
