Amazon’s smart glasses with AI will help its drivers deliver packages faster

Amazon has revealed that it’s currently working on smart glasses designed for delivery drivers, confirming previous reports about the project. The company said the glasses use AI-powered sensing and computer vision to interpret what their cameras see, and drivers get guidance through a heads-up display (HUD) embedded right into the lens. Based on Amazon’s announcement, it’s been working on the glasses for a while, and hundreds of delivery drivers have already tested early versions to give the company feedback.

The glasses automatically activate after the driver parks their vehicle. They then show the right packages to deliver based on the driver’s location: users see the list of packages they have to take out on the HUD, and the glasses can even confirm whether they’ve pulled the right package from the pile. Once drivers step out of the vehicle, the glasses display turn-by-turn navigation to the delivery address, flag hazards along the way, and help them navigate complex locations like apartment buildings. Simply put, the device lets them find delivery addresses and drop off packages without having to use their phones. Drivers will even be able to capture proof of delivery with the wearable.

Amazon’s glasses will be paired with a vest that’s fitted with a controller and a dedicated emergency button drivers can press to call emergency services along their routes. The device comes with a swappable battery to ensure all-day use and can be fitted with prescription and transitional lenses if the drivers need them. Amazon expects future versions of the glasses to be able to notify drivers if they’re dropping a package at the wrong address and to be able to detect and notify them about more hazardous elements, like if there’s a pet in the yard.

At the annual event where the company announced the device, Amazon transportation vice president Beryl Tomay said it “reduces the need to manage a phone and a package” and helps drivers “stay at attention, which enhances their safety.” She also said that among the testers, Amazon had seen time savings of 30 minutes for a given shift.

The company didn’t say anything about developing smart glasses for consumers, but a previous report from The Information said it’s also working on a model for the general public, slated for release in late 2026 or early 2027.


