Q&A: Plantronics CTO on the Future of Headsets: Smart and Sensor-Laden

Joe Burton is the Senior Vice President of Technology, Development, and Strategy and CTO at Plantronics, one of the biggest and best-known headset companies in the world (and a key Avaya partner). I caught up with Joe at the Avaya Evolutions roadshow stop in San Francisco a few weeks ago, where he talked up Plantronics' cutting-edge foray into wearable computing, which in its way is just as impressive as Google Glass.

Joe Burton, CTO, Plantronics

Photo by Andres Larranaga/Avaya

One of the keynote speakers at Avaya Evolutions was astronaut Buzz Aldrin. There's an interesting tie-in between your company and our keynote: Every single word from the moon has been spoken through a Plantronics headset.

Burton: Absolutely. I mean, how cool is that to have Buzz here the same day that [Avaya SVP] Gary Barnett was talking about the great integrations between Avaya and Plantronics earlier today?

This Avaya CONNECTED Blog is also available as an MP3 Audio File

The headset is just a headset, right? It's audio: listening through a speaker and speaking into a microphone. What do you guys do with unified communications? You don't belong there.

Burton: Boy, I beg to differ, of course, Fletch, as we go through this (laughs). So certainly we have an integration between virtually every headset in the Plantronics line and all of the Avaya products, from the mobility products to unified communications, collaboration, and the Contact Center. But the really interesting thing we've been doing that makes us such a great fit with Avaya is that we've started really thinking about the changing workplace, and we've started adding intelligence to our headsets.

We know things like: Is the headset on your head or off your head? Is it near or far from your mobile phone or your PC? And we've exposed all of that, along with other information, through software programming APIs. So developers that use things like the Avaya collaboration environment can write highly productive applications, pulling in the knowledge of our wearable technology from our headsets.
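To make that concrete, here is a minimal sketch of what an application consuming those wearable-state events might look like. The HeadsetEvents class and the event names below are hypothetical stand-ins for illustration only; they are not the actual Plantronics Developer Connection API.

```python
# Illustrative only: a toy event dispatcher standing in for a real
# headset SDK. The event names ("don", "doff", "proximity") mirror the
# states Burton describes, but the API itself is hypothetical.

class HeadsetEvents:
    def __init__(self):
        self._handlers = {}

    def on(self, event, handler):
        self._handlers.setdefault(event, []).append(handler)

    def emit(self, event, **data):
        for handler in self._handlers.get(event, []):
            handler(**data)

headset = HeadsetEvents()

# "Don"/"doff" = headset put on or taken off the head.
headset.on("don", lambda **_: print("Worn: route incoming calls to the headset"))
headset.on("doff", lambda **_: print("Removed: send calls to voicemail"))

# Near/far from the paired phone or PC, e.g. for presence or auto-lock.
headset.on("proximity",
           lambda near, **_: print("Near PC" if near else "Walked away: set status to Away"))

# Simulated events, as a real SDK might deliver them:
headset.emit("don")
headset.emit("proximity", near=False)
```

A unified-communications app could hang presence, call routing, or screen-lock decisions off exactly these kinds of events.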

So you play in the headset market across multiple segments; telephony is just one piece of it. There are things like a firefighter man-down situation: Did the headset go from standing up to lying down, and how fast did it get there? That's where the accelerometer data becomes important. One of your headsets has a nine-axis accelerometer. I found that out at another show, and that just amazed me.

Burton: That's right. The one I have with me today is from PLT Labs, our laboratory division: the PLT Labs Concept One headset, which we've actually released to the developer community. As you said, it has a nine-axis accelerometer in it that will literally see which direction your head is pointing, with very low latency and very high accuracy. And as you say, if something happens like you go from vertical to horizontal very quickly while wearing the headset, because remember I know if it's on your head or not, then I know that it's indeed a potential man-down situation. I can prompt you, "Are you okay?" or "Do you want me to call the authorities?" And because I know whether it's on your head, if you've just dropped the headset rather than fallen yourself, probably we shouldn't be calling 911.
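For the curious, here is a minimal sketch of the kind of detection logic Burton describes, assuming accelerometer-derived orientation samples and a worn/not-worn flag from the headset. The function names and thresholds are illustrative assumptions, not Plantronics code.

```python
import math

# Illustrative man-down check: a fall looks like a fast transition from
# an upright to a horizontal head orientation WHILE the headset is worn.
# All thresholds below are made-up values for the sketch.

VERTICAL_MAX_DEG = 30.0     # head roughly upright
HORIZONTAL_MIN_DEG = 70.0   # head roughly horizontal
MAX_TRANSITION_S = 1.0      # how quickly the change must happen

def tilt_from_vertical(ax, ay, az):
    """Angle in degrees between gravity and the upright (z) axis,
    from a 3-axis accelerometer sample taken at rest."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

def is_man_down(samples, worn):
    """samples: list of (timestamp_s, ax, ay, az); worn: don/doff flag.
    True if an upright-to-horizontal flip happened fast while worn."""
    if not worn:
        return False  # a dropped headset, not a fallen person: no 911
    last_upright_t = None
    for t, ax, ay, az in samples:
        angle = tilt_from_vertical(ax, ay, az)
        if angle < VERTICAL_MAX_DEG:
            last_upright_t = t
        elif angle > HORIZONTAL_MIN_DEG and last_upright_t is not None:
            if t - last_upright_t <= MAX_TRANSITION_S:
                return True  # fast flip: prompt "Are you okay?"
    return False

# Example: upright at t=0, horizontal 0.4 s later, headset on the head.
print(is_man_down([(0.0, 0.0, 0.0, 9.8), (0.4, 9.8, 0.0, 0.5)], worn=True))  # True
```

In a real product, fusing all nine axes (accelerometer, gyroscope, magnetometer) would give a far more robust orientation estimate than raw accelerometer tilt, which is the point of putting them all in the headset.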

Or throwing it against the wall or something, right?

Burton: You know, you would never do that as long as you’re wearing a Plantronics headset attached to an Avaya UC system (laughs).

But you can have somebody throw YOU against a wall with your headset on.

Burton: And that would be with the headset on, so you might want to ask about a man-down situation.

That's really interesting stuff. So you're bringing in multiple axes in an accelerometer. You're bringing in, I guess you would call it biometrics, right, with being able to sense if the headset is on your head. What technology is that?

Burton: So there are many different technologies. We're really entering the world of context and sensors, and the ability to actually bring all of that into the headsets, into a piece of wearable technology, is really interesting. Doing it in a way where we're exposing all that new data, along with superior voice quality, through a set of APIs really allows us to build unprecedented business value.

Fantastic. And to be here with Buzz Aldrin, knowing that when he was on the moon with Neil Armstrong, those famous words, "Tranquility Base, the Eagle has landed," went out over your product: you have to be proud to sit and think about that.

Burton: You know it’s just an amazing, amazing company to be representing with that kind of history. We still put that kind of quality and care into every product we build. It’s just terrific.

So where can somebody go to find out about being part of the developer community that works with the APIs?

Burton: There are really two different places they can go. Once again, our APIs are at plantronics.com/pdc, which is the Plantronics Developer Connection. Or you can go to pltlabs.com, and it will redirect you over there as well. Lots of sample code, lots of community interest, and myself and all my engineers will be happy to answer your questions and collaborate with you.

I met the guys from PLT Labs actually right here at this facility for a WebRTC event, and that's where they showed me some of the really cool stuff they were doing with the multi-axis sensors.

Burton: Well, it's pretty interesting: with all of the interest in WebRTC, Plantronics has actually won the most-innovative award at the WebRTC Conference two years in a row. So clearly we're on the right track with some of these biometrics.
