Humane, the top-secret tech startup founded by ex-Apple vets Imran Chaudhri and Bethany Bongiorno, just showed off the first demo for its projector-based wearable at a TED talk. Axios’ Ina Fried broke the news, and Inverse has seen a recording of the full TED talk given by Chaudhri.

Journalist Zarif Ali, who had tweeted out an image of Humane’s wearable projecting a phone call function onto Chaudhri’s palm, says the full TED talk video is not slated to become available until April 22.

I’ve clipped out a demo of the AI-powered wearable in action:

After a quick summary of the rapid rise and immense potential of AI and chatbots like ChatGPT, including a shout-out to Bill Gates's prediction that AI will be as profound as the graphical user interface that ushered in personal computing, Chaudhri shares his vision for the wearable.

“What do we do with all these incredible [AI] developments? And how do we actually harness these to genuinely make our life better?” he asks. “If we get this right, AI will unlock a world of possibility. Today, I want to share with you what we think is a solution to that end. And it’s the first time we’re doing so openly. It’s a new kind of wearable device and platform that’s built entirely from the ground up for artificial intelligence. And it’s completely standalone. You don’t need a smartphone or any other device to pair with it.”

“You don’t need a smartphone or any other device to pair with it.”

How does the phone function work? Designer Michael Mofina, who says he caught the TED talk live before the link was removed, told Inverse: “In terms of the call, as soon as [Chaudhri] raised his hand the device displayed the appropriate incoming call interface, no menu to navigate through.”

In a reply to a retweet of his image, Ali said that Chaudhri demoed “a translation feature that translates to another language using your own voice model for natural conversation.” Axios reported that Chaudhri was translating his voice from English to French using the wearable. Per Mofina, “The translation came out in French but it was using an AI-generated version of his voice to speak it. No projected interface for the translation.”

Ali also described two other features. There’s “a ‘catch me up’ feature that scrapes your meetings, etc., and gives you a quick list of important things you may have missed.” Mofina added this: “The device gave him a recap of crucial info without disturbing him with notifications. ‘You got an email, and Bethany sent you some photos.’”

Another function, according to Ali: “a camera-enabled dietary feature that lets you check if you can eat a certain food based on your dietary restrictions.”

Mofina told Inverse that in another demo, the wearable “gave [Chaudhri] a specific answer about going shopping in a nearby district.” Such a feature would be useful while traveling.

And take a look at this video Mofina tweeted out. “Let’s say you’re health conscious or you have certain types of food considerations,” says Chaudhri. He takes out a candy bar, holds it in front of the device, taps on the Humane device and asks “Can I eat this?” The device responds with “A milky bar contains cocoa butter. Given your intolerance, you may want to avoid it.”

“What’s cool is my AI knows what’s best for me, but I’m in total control,” says Chaudhri. He taps on the wearable again. “I’m gonna eat it anyway.” The AI replies with some humor: “Enjoy it.”

Screenless, Seamless, Sensing

It’s been widely speculated that Humane’s “iPhone killer” would be a projector of sorts, and that appears to be the case. “AI will be the driving force behind the next leap in device design,” reads a slide in Chaudhri’s presentation.

“If we get this right, AI will unlock a world of possibility,” says Chaudhri.

In the screenshot below, you can see an image of the wearable device attached to Chaudhri’s jacket. There appears to be a camera and a pair (or more) of sensors.

Another slide reads: “It interacts with the world the way you interact with the world.” Chaudhri was presumably describing the wearable. A follow-up slide had these three words: “screenless,” “seamless,” and “sensing.”

One thing Mofina says Chaudhri shared at SXSW this year was that Humane’s wearable wouldn’t have a “wake word” like Siri or Alexa. “He was shown interacting with it by voice by tapping it to start speaking to it. It also has LED lights that indicate when it’s listening, and when a call is coming in.”

Several Twitter users have raised some important questions about how Humane’s wearable works, details of which were not shared in-depth at the TED talk.

“I wonder how much the Humane projector weighs? Will it weigh down a light shirt? Is it attached with a pin or a magnet?” tweeted MacRumors contributing writer Steve Moser. “What’s it like to accidentally shine it right into someone’s eyeball? Is there a recording light when the camera is on? How wide of an angle does it project?” We’re all wondering the same, Moser. We’re all wondering the same.

This is a developing story…
