Q&A: Plantronics CTO on the Future of Headsets: Smart and Sensor-Laden

Joe Burton is Senior Vice President of Technology, Development and Strategy, and CTO at Plantronics, one of the biggest and best-known headset companies in the world (and a key Avaya partner). I caught up with Joe at the Avaya Evolutions roadshow stop in San Francisco a few weeks ago, where he talked up Plantronics’ cutting-edge foray into wearable computing, which in its way is just as impressive as Google Glass.

Joe Burton, CTO, Plantronics

Photo by Andres Larranaga/Avaya

One of the keynote speakers at Avaya Evolutions was astronaut Buzz Aldrin. There’s an interesting tie-in between your company and our keynote: Every single word from the moon has been spoken through a Plantronics headset.

Burton: Absolutely. I mean, how cool is that to have Buzz here the same day that [Avaya SVP] Gary Barnett was talking about the great integrations between Avaya and Plantronics earlier today?

This Avaya CONNECTED Blog is also available as an MP3 Audio File

The headset is just a headset, right? It’s audio: listening in an earpiece and speaking into a microphone. What do you guys have to do with unified communications? You don’t belong there.

Burton: Boy, I beg to differ, of course, Fletch, as we go through this (laughs). So certainly we have an integration between virtually every headset in the Plantronics line and all of the Avaya products, from the mobility products to unified communications, collaboration, and the Contact Center. But the really interesting thing we’ve been doing that makes us such a great fit with Avaya is that we’ve started really thinking about the changing workplace, and we’ve started adding intelligence to our headsets.

We know things like: Is the headset on your head or off of your head? Near or far from your mobile phone or your PC? And other information. And we’ve exposed all that information, as well as other things, through software APIs. So developers who use things like the Avaya collaboration environment can write highly productive applications, pulling in the knowledge of our wearable technology from our headsets.
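
To make that concrete, here is a minimal sketch of the kind of application Burton describes, written against an imagined event interface. The Plantronics Developer Connection APIs are not reproduced here; every name below is illustrative.

    // All names below are illustrative stand-ins, not the actual Plantronics
    // Developer Connection API (documented at plantronics.com/pdc).
    interface HeadsetEvents {
        void onDonned();                      // headset put on the head
        void onDoffed();                      // headset taken off the head
        void onProximity(boolean nearHost);   // near or far from the paired PC/phone
    }

    // Example consumer: drive unified-communications presence from wearer state.
    public class PresenceBridge implements HeadsetEvents {
        @Override public void onDonned() { setPresence("AVAILABLE"); }
        @Override public void onDoffed() { setPresence("AWAY"); }
        @Override public void onProximity(boolean nearHost) {
            if (!nearHost) setPresence("MOBILE");
        }

        private void setPresence(String state) {
            System.out.println("UC presence -> " + state); // stand-in for a real UC update
        }
    }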

So you play in the headset market across multiple segments; telephony is just one piece of it. There are things like a firefighter man-down situation: Did the headset go from standing up to lying down, and how fast did it get there? That’s where the accelerometer data becomes important. One of your headsets has a nine-axis accelerometer. I found that out at another show, and it just amazed me.

Burton: That’s right. The one I have with me today is from PLT Labs, our laboratory division: the PLT Labs Concept One headset, which we’ve actually released to the developer community. As you said, it has a nine-axis accelerometer in it that will literally see which direction your head is pointing, with very low latency and very high accuracy. And as you say, if something happens like you go from vertical to horizontal very quickly while wearing the headset, because remember, I know whether it’s on your head or not, then I know that it’s indeed a potential man-down situation. I can prompt you, “Are you okay?” or “Do you want me to call the authorities?” And because I know whether it’s on your head or not, if you just dropped it, I know we probably shouldn’t be calling 911.

Or throwing it against the wall or something, right?

Burton: You know, you would never do that as long as you’re wearing a Plantronics headset attached to an Avaya UC system (laughs).

But you can have somebody throw YOU against a wall with your headset on.

Burton: And in that case the headset would be on, and you might want to ask about a man-down situation.
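
Burton’s man-down scenario reduces to a simple rule: the headset is donned, head orientation goes from upright to roughly horizontal, and the transition happens fast. Here is a minimal sketch of that rule; the thresholds are invented for illustration, and the Concept One’s real APIs and tuning are not shown.

    // Illustrative man-down rule: donned + upright-to-horizontal + fast.
    // Thresholds are made up for the sketch, not taken from PLT Labs.
    public class ManDownDetector {
        private static final double UPRIGHT_PITCH_DEG = 20.0; // head roughly vertical
        private static final double FALLEN_PITCH_DEG  = 70.0; // head roughly horizontal
        private static final long   MAX_FALL_MILLIS   = 800;  // faster than this = a fall

        private long lastUprightMillis = -1;

        /** Called for every orientation sample reported by the headset. */
        public boolean isPotentialManDown(boolean donned, double pitchDeg, long nowMillis) {
            if (!donned) return false; // off the head: a dropped headset, not a fallen wearer
            if (Math.abs(pitchDeg) < UPRIGHT_PITCH_DEG) {
                lastUprightMillis = nowMillis; // remember the last time the wearer was upright
                return false;
            }
            // Horizontal now, and upright very recently: prompt "Are you okay?"
            return lastUprightMillis >= 0
                    && Math.abs(pitchDeg) > FALLEN_PITCH_DEG
                    && (nowMillis - lastUprightMillis) < MAX_FALL_MILLIS;
        }
    }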

That’s really interesting stuff. So you’re bringing in multiple axes with an accelerometer. You’re bringing in, I guess you would call it biometrics, right, with being able to sense whether it’s on your head. What technology is that?

Burton: So there are many different technologies. We’re really entering the world of context and sensors, and the ability to bring all of that into the headset, into a piece of wearable technology, is really interesting. Doing it in a way where we’re exposing all that new data, along with superior voice quality, through a set of APIs really allows us to build unprecedented business value.

Fantastic. And to be here with Buzz Aldrin: when he was on the moon with Neil Armstrong and they uttered those famous words, “Tranquility Base, the Eagle has landed,” over your product, you have to be proud to sit and think about that.

Burton: You know it’s just an amazing, amazing company to be representing with that kind of history. We still put that kind of quality and care into every product we build. It’s just terrific.

So where can a developer go to find out about being part of the community that works with the API?

Burton: There are really two different places they can go. Once again, our APIs are at plantronics.com/pdc, which is the Plantronics Developer Connection. Or you can go to pltlabs.com, and it will redirect you over there as well. Lots of sample code, lots of community interest, and my engineers and I will be happy to answer your questions and collaborate with you.

I actually met the guys from PLT Labs right here at this facility for the WebRTC conference, and that’s where they showed me some of the really cool stuff they were doing with the multi-axis sensors.

Burton: Well, it’s pretty interesting: with all of the interest in WebRTC, Plantronics has actually won the most innovative WebRTC award two years in a row at the WebRTC conference. So clearly we’re on the right track with some of these biometrics.

Related Articles:

Avaya Demos Wireless Location Based Services at Avaya ENGAGE Dubai

Wireless Location Based Services (WLBS) are usually discussed in the areas of customer or guest engagement. However, there are also valuable use cases in the areas of employee engagement and facility safety. The WLBS demo at #AvayaENGAGE in Dubai highlights the employee engagement use case. Further, it demonstrates the power of the Avaya Breeze™ Platform and Unified Communications.

As a real world example … think about a public area, a store, a hotel, school, etc. A window is broken. A call reporting the incident comes to the control center. The controller needs to identify which resources are closest to the event. The closest member of the security team needs to respond to cordon off the area and determine if anyone was injured. A member of the janitorial team needs to be dispatched to clean up the glass and a member of the engineering team needs to respond to temporarily cover the opening and have the glass company implement a replacement.

The WLBS display shows the location of all devices probing the WLAN. The user interface allows the controller to sort displayed devices by role, for instance, eliminating all guest devices from the display or simply displaying the security team members. Further, the device indicators can be color coded based on the role to simplify identification. Once the correct person is identified, they can be selected on the screen, and either sent an SMS or called on their mobile device. This allows the controller to quickly identify the appropriate resource based on their location and contact them to respond to the situation.

For the #AvayaENGAGE Dubai demonstration, Avaya employees are being tracked in the common areas of the pavilion. Information about each employee has been captured in a database, including MAC address, device phone number, name and skill or role at the event. For instance, subject matter experts (SMEs) in Networking, Contact Center, and Unified Communications have been identified. If a guest has a question requiring an SME, the closest SME can be identified and contacted to see if they’re available to answer questions.

The following diagram shows all devices being tracked by the 23 WAPs participating in the WLBS demo. There were 352 guests at the time the screenshot was taken, so most of the circles are light blue. However, if you look closely, you can see a few other colors, such as the dark blue Executive and the tan Network SME. Solid dots indicate that the device is connected to the Avaya WLAN; hollow dots indicate that the device is probing the network but not connected to the WLAN.

[Screenshot 1: unfiltered WLBS display showing all tracked devices]

As you can see, an unfiltered display, while providing crowd-level information, isn’t very helpful for finding specific people or skills. The controls on the right of the screen provide filtering functions: displayed devices can be limited to one or more skills, or filtered by name.

The next screenshot shows filtering enabled for executives. The dot for Jean Turgeon (JT) was selected. At this point, the operator could choose to send an SMS message to JT or call his mobile device.

[Screenshot 2: WLBS display filtered for executives, with Jean Turgeon selected]

The WLBS solution consists of three Avaya components: the WLAN 9100 Series WAPs, Avaya Breeze Snap-ins, and the Zang communication platform.

The WLAN at #AvayaENGAGE Dubai is implemented with Avaya 9144 WAPs. Each 802.11 wireless client device probes the network every few seconds to determine which WAPs are available to provide service. Every WAP within broadcast range of the device will detect and respond to the probe message. The probe and response messages enable better network service, particularly when the device is moving and needs to change WAPs to get better service. Probing happens at the MAC level, so each WAP in the broadcast area receives a message from every MAC address in range every few seconds.
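
In other words, the raw input to the location service is a stream of (WAP, client MAC, estimated distance) observations arriving every few seconds. A minimal model of one such report, with field names that are mine rather than Avaya’s:

    // Minimal model of one WAP's probe observation; field names are
    // illustrative, not the 9100's actual reporting format.
    public record ProbeReport(
            String wapId,           // which WAP heard the probe
            String clientMac,       // MAC address of the probing device
            double distanceMeters,  // distance estimated from signal strength
            long   timestampMillis  // when the probe was heard
    ) {}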

When location services are enabled in the 9100 WAPs (a simple, non-disruptive change via the web interface or a profile update in the Avaya WLAN Orchestration System), each WAP sends the MAC address and distance information to a network address. In this demo, the information is sent to an Avaya Snap-in that collects the data from all of the WAPs, sorts it by MAC address, and runs it through a triangulation algorithm to calculate the location of the client device based on the known locations of the WAPs.
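
The triangulation step is classic trilateration: each (WAP, distance) pair defines a circle around a WAP at a known position, and the device sits where the circles intersect. The sketch below is the textbook two-dimensional closed form for three WAPs; it shows the kind of calculation the Snap-in performs, not Avaya’s actual algorithm, which would use more WAPs plus filtering and smoothing.

    // Textbook 2-D trilateration from three WAPs at known (x, y) positions.
    // Subtracting the circle equations pairwise leaves two linear equations
    // in the unknown device position, solved here by Cramer's rule.
    public final class Trilateration {
        /** Returns {x, y} of a device at distances d1, d2, d3 from three WAPs. */
        public static double[] locate(double x1, double y1, double d1,
                                      double x2, double y2, double d2,
                                      double x3, double y3, double d3) {
            double a1 = 2 * (x2 - x1), b1 = 2 * (y2 - y1);
            double c1 = d1 * d1 - d2 * d2 - x1 * x1 + x2 * x2 - y1 * y1 + y2 * y2;
            double a2 = 2 * (x3 - x2), b2 = 2 * (y3 - y2);
            double c2 = d2 * d2 - d3 * d3 - x2 * x2 + x3 * x3 - y2 * y2 + y3 * y3;
            double det = a1 * b2 - a2 * b1; // zero when the three WAPs are collinear
            return new double[] { (c1 * b2 - c2 * b1) / det,
                                  (a1 * c2 - a2 * c1) / det };
        }
    }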

A second Avaya Snap-in handles device identity. This Snap-in could work with something like Avaya Identity Engines to provide user information for the MAC addresses detected by the WAPs. Since the #AvayaENGAGE Dubai demo is a temporary environment, the Snap-in simply provides the ability to load a CSV (comma-separated values) file with the Avaya employee information, mapping Avaya employee identities to the MAC addresses of their mobile devices.
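
Here is a hedged sketch of that CSV load, assuming a simple column layout of MAC address, phone number, name, and role; the demo’s actual schema may differ.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.HashMap;
    import java.util.Map;

    // Loads the employee CSV into a map keyed by MAC address. The column
    // order (MAC, phone, name, role) is an assumption for this sketch.
    public class IdentityStore {
        public record Employee(String mac, String phone, String name, String role) {}

        public static Map<String, Employee> load(Path csv) throws IOException {
            Map<String, Employee> byMac = new HashMap<>();
            for (String line : Files.readAllLines(csv)) {
                String[] f = line.split(",");
                if (f.length < 4) continue; // skip malformed rows
                byMac.put(f[0].trim().toLowerCase(), // normalize the MAC as the key
                        new Employee(f[0].trim(), f[1].trim(), f[2].trim(), f[3].trim()));
            }
            return byMac;
        }
    }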

The user interface Snap-in provides the display shown above. It takes the output from the triangulation Snap-in and displays it on a map in a web browser window. It also uses information from the identity Snap-in to sort devices owned by Avaya employees from those of Engage guests, hotel employees, and other hotel guests. The skill classification captured in the CSV file enables finer-grained filtering and skill-based color coding on the screen.
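
That role-based filtering amounts to a simple predicate over the identity data. An illustrative version, not the Snap-in’s actual code:

    import java.util.Map;
    import java.util.Set;
    import java.util.stream.Collectors;

    // Illustrative skill filter: given each device's role from the identity
    // data, keep only the devices whose role is among the selected skills.
    public class RoleFilter {
        public static Map<String, String> bySkill(Map<String, String> roleByMac,
                                                  Set<String> selectedSkills) {
            return roleByMac.entrySet().stream()
                    .filter(e -> selectedSkills.contains(e.getValue()))
                    .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
        }
    }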

When the icon for an employee on the map is selected, the pop-up frame shown above appears. Communication with the Avaya employee is performed via the Zang cloud-based communication platform. When the user selects the SMS button shown above, a screen appears to enter the message, which is sent to the Zang service, which then delivers it to the employee’s device. If the Call button is selected, the Zang service initiates a phone call between the number shown in the Call-me-at field above and the Avaya employee’s phone number listed in the CSV import.
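
The notification step is a REST call to the Zang service. The sketch below shows the general shape of such a call from Java; the endpoint path and form-field names are assumptions modeled on Zang’s Twilio-style API, so consult Zang’s documentation for the real ones.

    import java.net.URI;
    import java.net.URLEncoder;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    // Sketch of an SMS send via a Zang-style REST API. The endpoint and
    // field names are assumptions, not verified Zang documentation.
    public class ZangSms {
        public static void send(String accountSid, String authToken,
                                String from, String to, String body) throws Exception {
            String form = "From=" + URLEncoder.encode(from, StandardCharsets.UTF_8)
                    + "&To=" + URLEncoder.encode(to, StandardCharsets.UTF_8)
                    + "&Body=" + URLEncoder.encode(body, StandardCharsets.UTF_8);
            String auth = Base64.getEncoder()
                    .encodeToString((accountSid + ":" + authToken).getBytes(StandardCharsets.UTF_8));
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://api.zang.io/v2/Accounts/" + accountSid
                            + "/SMS/Messages")) // assumed endpoint path
                    .header("Authorization", "Basic " + auth)
                    .header("Content-Type", "application/x-www-form-urlencoded")
                    .POST(HttpRequest.BodyPublishers.ofString(form))
                    .build();
            HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
        }
    }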

I’d like to say this is rocket science, but the Avaya infrastructure components and Avaya Breeze make it a straightforward architecture. Avaya believes a key to scalability is putting power in the edge devices, both to minimize backhaul data and to simplify management. The intelligence of the AOS software running in the 9100s makes it simple to collect device location information. The Breeze Platform provides a full Java-based programming environment with object classes for Avaya communication product functionality. Finally, Zang was designed so business people can programmatically integrate communication functionality into business processes without a major investment in infrastructure or expertise.

Keep watching this space. We’re already planning the WLBS demo for Avaya Engage 2017 in Las Vegas, February 12-15.

Verbio Brings Voice Biometrics to Avaya Breeze™

If you’ve been following the Avaya Connected Blog in recent weeks, hopefully you’ve read about the changes Avaya expects to see in Customer Engagement as we roll out the Avaya Oceana™ Solution, a contact center suite for the digital age.

And perhaps you’ve read how Avaya Oceana is built upon the flexible platform of Avaya Breeze™, which offers extensibility through a Snap-in architecture, creating new opportunities to extend and customize customer and team engagement interactions further.

I’ve previously highlighted how some of our DevConnect Technology Partners are leveraging the Avaya Breeze Platform to do just that, and I’m happy to add Verbio to the growing list of value-added Snap-in vendors.

I had the opportunity recently to speak with Piergiorgio Vittori, who heads up Americas Sales and Global Partnership opportunities for Verbio, as they recently completed DevConnect Compliance Testing of their Verbio Voice Authentication Snap-in for Avaya Breeze. Piergiorgio indicated that it took “about two months, end to end” to bring this voice biometric solution to market, “including design and requirements, programming, testing, demos, tuning, and documentation.”

I daresay that there aren’t many ways to bring out a flexible, biometric-based capability set in such a short timeframe, which I offer up as a tremendous proof point for how Avaya Breeze really simplifies key aspects of application and communication services integration.

Verbio’s solution, which couples a Breeze-based Snap-in with their core SaaS-based biometrics capabilities, extends the speech search and ASR/TTS capabilities inherent in Avaya Breeze to a new level, while maintaining consistent and familiar request- and error-handling methods that can be leveraged by other application developers. The Snap-in itself simplifies many of the tasks associated with passing data to the Verbio engine, acting as a sort of Verbio proxy for application developers already working in an Avaya Breeze environment.
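
From an application developer’s point of view, a proxy like that reduces voice authentication to an enroll/verify pair. The interface below is purely illustrative, invented to show the shape of such a Snap-in; it is not Verbio’s actual API.

    // Purely illustrative enroll/verify surface for a voice-biometrics proxy;
    // all names are invented for this sketch, not Verbio's actual API.
    public interface VoiceAuthProxy {
        enum Result { MATCH, NO_MATCH, INSUFFICIENT_AUDIO }

        /** Enroll a caller's voiceprint under the given identity. */
        void enroll(String userId, byte[] pcmAudio);

        /** Verify live audio against the stored voiceprint for that identity. */
        Result verify(String userId, byte[] pcmAudio);
    }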

Voice biometrics has a number of potential use cases, especially when it comes to automated events and actions. From a security perspective, the use of voice biometrics can help ward off social engineering hacks, while its application in contact center domains can increase agent utilization and reduce overall call time by eliminating the need to verify a specific user’s identity through numerous Q&A interactions. In this latter case, a user’s voiceprint can act as conclusive identification.

Enterprises and contact centers (or even public safety organizations) can further leverage voice biometric analytics as an emotion detector, to determine whether a user’s request is being influenced by stress or emotional state.

All of which makes a great proof point for the power of Avaya Breeze in helping to transform how our customers conduct business in this digital age.

Finally… A Contact Center for the Digital Era

Imagine interacting with a company—any company—via your preferred method whenever you want, whether making a phone call, using online chat, sending an SMS, or messaging via Facebook. And you end up having exactly the experience you were expecting. No, not another bad experience. Rather, an exceptionally pleasant experience that, surprisingly, takes less time than you originally anticipated. Sounds too good to be true, doesn’t it?

The reality is that traditional business communications have failed to keep pace with consumer-focused technological devices. As a result, customers’ expectations—while very high—are rarely met when interacting with a business. Customers know what a good technology experience looks like, sounds like. The simplicity, built-in intelligence and sophistication of today’s devices and apps have taught customers that it’s not difficult to have a simple and tailored experience. It’s not difficult to teach a computer to know who you are, what you prefer, what you like to listen to, watch, read, how you like to interact. If it’s not difficult, then why are customer experiences with companies so predictably bad?

It’s time to break the mold. It’s time to start a new customer experience wave that makes customers happy about doing business with a company, excited that a company values their time and loyalty. Let’s give companies the freedom to be innovative, proactive, independent and capable of operating in real time to meet the demands and needs of their customers and agents without fail.

Unlike their traditional predecessors, today’s technological systems and capabilities have finally caught up with digital customer expectations. We have arrived at a point in time where the digital brain of a machine and the reasoning mind of a human are more closely aligned than ever. Look at IBM Watson or Amazon Alexa.

Now, all that’s needed is similar thinking and innovation applied to business communications. What’s needed is the re-invention of the contact center for the digital era.

Enter Avaya Oceana™

Oceana is a departure from traditional business communications, just as the smart device was a departure from the basic, voice-only flip phone. This is a contact center for the digital era. Companies today don’t want to risk losing customers as a result of a bad experience. They can’t afford it. Companies want a single solution with best-in-class flexibility that gives them the ability to:

  • Drive adoption of self-service channels by seamlessly linking these into the contact center to deliver an omnichannel multi-touch experience.
  • Make agents more efficient and more effective by enabling them to handle multiple parallel interactions using an integrated multi-media desktop.
  • Reduce call times through utilizing contextual knowledge of prior and in-progress interactions to streamline customer interactions.
  • Drive higher customer satisfaction / NPS by tailoring the engagement experience to address their business’s unique customer needs.
  • Rapidly optimize and continuously improve how they engage with their customers by leveraging their system’s flexibility, openness and integration capabilities.

In turn, customers get the experiences they know modern-day technology is capable of providing: the sophisticated yet simple, intelligent experiences they have grown accustomed to having with their smart devices, tablets, laptops, digital televisions, and other smart appliances. This is Oceana.

Avaya Is the Innovative Leader for the Digital Era

When we made the decision and accepted the challenge to lead the industry in re-inventing the contact center, we did not enter into this without careful thought and consideration … of everything. Leaving no stone unturned, we diligently looked at what our customers and our competitors’ customers are working with today. We uncovered more cobbled together, Frankenstein-esque systems than I ever thought existed. The complexity of processes and user experiences that companies unintentionally created by not keeping up with technology upgrades was amazing. Holding on to these older technologies today when all of these digital capabilities are available is similar to keeping a shelf full of CDs for your music. Eventually you realize that making the choice to go digital can transform your world for the better.

What we also learned was that if we were going to re-invent the contact center for the digital era, we had to think differently because digital technology and digital customers require different thinking. We challenged each other daily to think differently. That’s been the biggest challenge all along for companies struggling through digital transformations. But as Avaya learned with our own transformation, once you make the hard decisions, start thinking differently and get to the other side of the transformation, a whole new world of opportunity becomes available to you.

As Oscar Goldman used to say during a time when human-to-machine technology was something only Hollywood could dream up, “Gentlemen, we can rebuild him. We have the technology.” The truth is we finally do have the technology. We can rebuild the contact center. This is no longer a futuristic endeavor, this is now. In fact, let me rephrase that: we have rebuilt the contact center.

This is Oceana. This is the contact center for the digital era. This is the start of something new, the start of something big.