Here’s an example: Danish company Jabra announced its latest headphones, the Elite 85h, at CES. Jabra advertises the new $299 ’phones as using “AI technology,” and they do, in the form of machine learning. They’re not “artificially intelligent” in the sense that they can read your mind and start talking to you, but the way they make use of AI is indeed smart.
Perhaps predictably, they call the feature in question SmartSound. “It listens to the environment the user is in,” says Fred Lilliehook, a senior product marketing manager at Jabra. “It automatically adapts the audio experience.”
If you’re on a bus, the headphones can recognize that sound signature and switch into their “Commute” mode, which kicks in active noise canceling. In a public space, like a sidewalk, they switch into a mode called “In Public,” which triggers a feature called “HearThrough” that uses the mics—they have eight in total—to amplify the sidewalk sounds.
But first, the headphones had to learn how to do this. For that, Jabra relied on a company it partially owns, called Audeering. “They have developed 6,000 different sound characteristics that they use to analyze sound scenes,” says Lilliehook. “That means they can identify: what does a...
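To make the idea concrete, here is a minimal sketch of how sound-scene classification can drive mode switching. The scene and mode names (“Commute,” “In Public,” active noise canceling, HearThrough) come from the article; the feature names, thresholds, and the simple rule-based classifier are invented for illustration—the real system uses machine-learned models over Audeering’s acoustic features.

```python
def classify_scene(features):
    """Map a dict of (hypothetical) acoustic features to a coarse scene label.

    The thresholds and feature names here are placeholders, not Jabra's
    actual model.
    """
    if features["low_freq_rumble"] > 0.6:   # steady engine/road noise
        return "Commute"
    if features["speech_presence"] > 0.5:   # voices, footsteps nearby
        return "In Public"
    return "Private"


# Each scene maps to a headphone configuration.
MODE_FOR_SCENE = {
    "Commute":   {"anc": True,  "hear_through": False},
    "In Public": {"anc": False, "hear_through": True},
    "Private":   {"anc": False, "hear_through": False},
}


def apply_mode(features):
    """Classify the environment and return (scene, settings)."""
    scene = classify_scene(features)
    return scene, MODE_FOR_SCENE[scene]


bus = {"low_freq_rumble": 0.8, "speech_presence": 0.2}
print(apply_mode(bus))  # ('Commute', {'anc': True, 'hear_through': False})
```

The key design point is the two-stage split: one component only labels the acoustic scene, and a separate lookup decides what the hardware does about it, so new scenes or modes can be added without retraining the classifier's output mapping.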