Ready for the M2 Generation?

This Avaya CONNECTED Blog is also available as an MP3 audio file.


This past week at the NENA 2013 Conference and Expo in Charlotte, North Carolina, the big topic was text-to-911 becoming a reality, following the agreement signed by the four major wireless carriers, NENA, APCO, and the FCC.

Although pictures, video, and other forms of multimedia will eventually find their way into the 911 center, it is commonly agreed that the "low hanging fruit" is allowing text messages to reach 911 centers directly by addressing the digits 911 as the destination, and the commitment is to deliver this by 2014.

In the past there was some hesitancy, since text messaging is a store-and-forward technology that provides minimal service guarantees. While that is technically true, during natural disasters such as Superstorm Sandy the switched voice telecommunications infrastructure was out of service, yet text messages still managed to get through much of the time.

This has caused a change in thinking within the industry: although text messaging has its deficiencies, when all other means of communication become unavailable, text messaging may be one of the last working ways to reach public safety.

The flip side of that story, of course, is text messaging being used by people who are deaf, deaf-blind, or hard of hearing, as well as individuals with speech disabilities. For this group, text messaging is a primary means of communication. TTY machines are far too bulky to carry around, are clumsy to operate with acoustic couplers or require a direct connection to a telephone line, and even then they are often met with communications or technology failures at the PSAP, even though each center must maintain a working TTY.

Regardless, let's assume that the industry will right itself, solve many of these problems, and move forward with multimedia Next Generation 911 communications.

For the bulk of us in the latter part of our careers, we've seen Generation X and Generation Y, but we are now faced with Generation M2, or M-squared. This generation is characterized by the lives of today's 8- to 18-year-olds, their immersion in multimedia (M2), and their use of communication technologies never before available.

There will come a time when "calling someone" becomes as antiquated as "dialing a phone" (remember those rotary phones with dials?), and a new phrase will crop up: "Hey! When you get to where you're going, make sure you 'COMM' me and let me know that you got there okay."

Once that becomes the average day-to-day communications method, emergency communications is going to have to adapt rapidly. The amount of additional data flooding an emergency center with each and every communication event will be overwhelming for an individual, and just as Run Recommendations are provided to call takers and dispatchers today based on unit availability and automatic vehicle location (AVL) positioning, I envision a futuristic heads-up display for the call taker.

In addition to the 260 or so characters available in ANI and ALI, personal medical record information, building information, and data from smart building monitors such as temperature sensors will all be correlated for the call taker using Computer Assisted Automation.

Think of it as a "Run Recommendation" on steroids. It's not an autonomous system taking control of the situation; it's simply computer automation analyzing and modeling current and projected variables and matching them against a pre-established database of potential outcomes: based on these facts and past history, the recommended action is this. What "this" is, is entirely up to your imagination, and it will be a prime category for the patent trolls.
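To make the idea concrete, here is a purely illustrative sketch of that kind of computer-assisted recommendation: correlate the data arriving with the communication event against a pre-established table of outcomes and surface the best match to the call taker. Every name, field, and playbook entry below is hypothetical.

```python
# Purely illustrative sketch of a "Run Recommendation" on steroids:
# match correlated incident data against a pre-established table of
# potential outcomes. All names and data here are hypothetical.

INCIDENT_PLAYBOOK = [
    {"match": {"type": "fire", "smoke_detected": True},
     "recommend": "Dispatch engine and ladder company"},
    {"match": {"type": "medical", "cardiac_history": True},
     "recommend": "Dispatch ALS unit with AED"},
    {"match": {"type": "medical"},
     "recommend": "Dispatch nearest BLS unit"},
]

def recommend_action(incident: dict) -> str:
    """Return the first playbook entry whose criteria all match the incident."""
    for entry in INCIDENT_PLAYBOOK:
        if all(incident.get(key) == value for key, value in entry["match"].items()):
            return entry["recommend"]
    return "No automated recommendation; defer to call taker judgment"

# Example: data correlated from ALI, medical records, and building sensors
incident = {"type": "medical", "cardiac_history": True, "temperature_f": 72}
print(recommend_action(incident))
```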

So the moral of this week's blog is: "NG911 is NOT coming; it's already here."


Want more on E9-1-1? Subscribe to my weekly E9-1-1 Talk Podcast here.

Thanks for stopping by and reading the Avaya CONNECTED Blog on E9-1-1. I value your opinions, so please feel free to comment below or, if you prefer, email me privately.

Public comments, suggestions, corrections and loose change are all graciously accepted 😉
Until next week. . . dial carefully.

Be sure to follow me on Twitter @Fletch911




Related Articles:

Connected Health: The Digital Transformation of Care Innovation

All around the world, across the spectrum of disease, IT is changing our approach to chronic conditions and how we approach connected health. Text messages remind people living with HIV to take their medication and keep their medical appointments. Smartphone apps diagnose post-traumatic stress disorder by analyzing a user’s voice. Online forums enable breast cancer patients and survivors to trade information related to every stage of their care.

Collectively known as “connected health,” these recent, IT-driven innovations represent the intersection of digital technology and care. They’re transforming not only the way people manage their own health, but also the way they interact with their healthcare providers.

Unintended, but welcomed, consequences

By and large, connected health is an adaptation of technologies that were originally developed for other purposes. Mobile technology started out as a voice communication tool. Instant messaging was an outgrowth of online chat rooms. Social media became a means for making new friends.

Now these technologies have evolved and converged in a way that is overcoming formerly intractable barriers to care. By managing the agenda of day-to-day care, for instance, they help people adhere to their treatments even where clinical visits are impractical due to cost, distance, or availability. And by helping patients preserve their privacy, make sense of their conditions, and learn from others with similar experiences, health IT can lift the stifling veil of stigma from disease.

The implications don’t stop with the individual. Connected health also helps people manage their own disease state so they don’t spread it to others. Across whole populations, it can allow interventions aimed at preventing chronic diseases, such as behavioral modifications that reduce the incidence of obesity.

Changing care innovation paradigms

In all these respects, connectivity is bringing to medicine a level of accountability and democratization that seemed unimaginable not so long ago. But it’s also dialing up the urgency of some unanswered questions. Among them:

  • What information is appropriate to gather? Not all information has value in a healthcare setting.
  • Will information remain proprietary? It’s unclear to what extent stakeholders are willing to advance the interests of the community ahead of the interests of a company.
  • What would a sharing paradigm look like? If companies were to share information, they would need a seamless, cohesive way to do it.
  • How will privacy and security be preserved? Artificial intelligence and machine learning are critical pieces of this equation.
  • How will healthcare use technologies to create new models of care? Today’s applications are largely geared toward improving quality and outcomes of existing care models.

There’s no one-size fits all solution to these questions. Neither is care innovation strictly a technology issue. Technologists must collaborate with clinicians, patients, and patient advocates to take care coordination and operational efficiency to the next level in helping people cope with long-term diseases. A new, technology-powered paradigm—one that transcends existing constraints of time and resources—can bring a welcome transformation in the ongoing management of care coordination and the patient experience.

Avaya Equinox, Now with Team Collaboration, Just Got More “Go-To”

 

I recently read that the Apple App Store now contains about 2.2 million apps. It’s an amazing number and a testament to the creativity of developers and the variety of our human interests and needs. But it made me wonder: how many apps can we really use on a regular basis, and for what? Are they for fun? Are they informative? Do they increase team collaboration? If your smartphone is like mine, you’ve got a number of go-to apps that you use regularly, let’s say weekly, and probably a few you use daily or almost constantly. Then there are the Tier 2 apps, hiding in your folders, that seldom see the light of day. It’s fun to delve into these folders every few months and rediscover the apps that looked so interesting at the time but now languish for months on end.

What’s fun for personal apps, however, can often become a nightmare in the work world. We all have someone in the office who has that need to be first with the latest hot app, to provide their take on what’s cool and what’s not, and to make everyone else feel a little short of the mark for not using it first. Of course, most of these apps get frenzied activity for about 3 ½ days and then slip into oblivion. The issue for most of us is that we simply have too much on the go to be constantly changing the way we work and coercing others to adopt our favorite app of the week.

What my work day really needs is a true go-to app. One that makes me more productive, more reachable, more on track and that lets me get to my tasks and meetings with a single touch. If you’ve read my previous blogs, you know where I’m going with this: my go-to app is Avaya Equinox®. With its “mobile-first” Top of Mind screen, it provides me with at-a-glance visibility to meetings, instant messages and my call history giving me a single place to keep up to date and productive regardless of where my day may take me.

I’m happy to say that my go-to app just got more, well, “go-to”. The Avaya UC experience that I rely on every day is now being extended with the integration of a cloud-based team collaboration capability.  It gives me the full benefits of a team work environment that integrates voice, video, persistent team chat and messaging, along with file and screen sharing, all from within the Avaya Equinox experience.

Let me give you an example of these new Equinox team collaboration capabilities in action. I’m currently working with an external vendor on a major project. Our work will carry on for several quarters with new materials being created that need review, discussion, and likely several rounds of back and forth. To get the project kicked off and a vendor selected, we needed the full gamut of collaboration capabilities from simple voice calls to several all-day video conferences with participants joining from around the world – something easily managed with Avaya Equinox. 

The next step was to establish a core team and shift into a regular cadence of interaction. Adding the participants to the team collaboration space from both inside and outside Avaya was a snap, and we were instantly able to communicate with one another – I use one-to-one instant messaging for small items or questions and chat when I want to involve the entire team for broader issues. Tasks get assigned within Avaya Equinox to keep our review cycles on track, and we use the file sharing capability to avoid clogging up our email. If I’m offline at some point, due to travel or other activity, a quick glance at Avaya Equinox gets me back up to speed with the team’s progress.

On a weekly basis, we usually need some face time, and Avaya Equinox provides complete meeting capabilities including audio / video conferencing with screen sharing so we all gain the advantages of personal interaction. No matter where we are or what we are doing, we can all collaborate on content in real-time – it’s more productive and prevents misunderstandings across a widely distributed team. 

In many ways, our team collaboration space has become a virtual “war room.” Information is clearly visible and easily shared, I can see who’s available at any time, and formal and informal discussions can be initiated with ease.

There’s no shortage of apps available to anyone with a mobile device and the time to spend browsing around an app store. The real challenge is finding those few go-to apps that you’ll use every day. If you aren’t using Avaya Equinox yet, I’d encourage you to give it a try. I think it will make your short list of “go-to” apps and in a month or two, you might wonder how you got through your day without it!

Building SMS Text Bots is a Breeze

As a nerdy guy, I love movies about other nerdy guys. Give me movies like “A Beautiful Mind,” “The Theory of Everything,” or “Einstein and Eddington” (two nerdy scientists for the price of one), and I am in geek heaven. Recently, I was thrilled by “The Imitation Game”—the story of Alan Turing and his quest to break Germany’s WWII secret code. While I would never dare to compare myself to Mr. Turing, I like to think that we would have a few things in common. One area would be our shared interest in natural language processing and intelligent behavior.

Way back in 1950, Turing crystallized his research into these studies in what has become known as The Turing Test. Simply put, The Turing Test is a test of a machine’s ability to impersonate a human being. For a machine to pass The Turing Test, it must be able to participate in a conversation with a human being to the point where the human doesn’t realize that he or she is interacting with a machine. I can only imagine what Turing would think of today’s technology such as Siri, Alexa, and Google Home. Better yet, imagine Alan conversing with the robot, Sophia. Would he be excited or frightened? Personally, I am a little of both.

Real or Not

If you have been reading my articles on No Jitter and here on the Avaya blog, you know how enamored I am of the Breeze and Zang workflow designers. Although I have spent the bulk of my professional life writing software in programming languages such as C++ and Java, I have fallen in love with how quickly I can use the Breeze/Zang tools to go from idea, to prototype, to a production-quality application. I like to say that if you can draw it on a whiteboard, you can “code” it with Breeze.

So, the day I decided to build a text bot, I knew exactly how I was going to do it. Starting with a list of things I wanted my text bot to do, I was soon drawing out message flows and decision points (if this, do that). Once I was happy I had captured all the salient points, I turned to my computer and began typing. Early on, I realized that there was no way on earth I could capture all the different text messages my application would need to process. For instance, how many different ways can you ask for the location of a store? “Where are you located?” “What is your address?” “What city are you in?” “How can I find you?” The variations are nearly endless.

To solve this problem, I turned to natural language processing (NLP) and artificial intelligence (AI). That, of course, led me to the 500-pound gorilla in the room—IBM Watson. With Watson, I can build “Conversations” that allow me to create intents, entities, and dialogs. Intents are used to classify a request. You can think of entities as modifiers to those intents. Dialogs are the words you want to “speak” after determining the intent.

For example, consider the phrase “Are you open on Sunday?” Here, the intent could be classified as “hours.” The entity is “Sunday.” A proper dialog could be, “We are open on Sunday from 12:00 to 5:00.” To keep things simple, I created three intents for my bot: Directions, Holidays, Hours. Those intents resulted in three dialogs. I left off entities for now.
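To make that concrete, here is a minimal sketch of sending one utterance to a Watson Conversation workspace over its REST interface and reading back the detected intent and dialog response. The endpoint shape, version date, and basic-auth credentials reflect the Watson Conversation v1 API as it existed around this writing and may differ today; the workspace ID and credentials are placeholders.

```python
# Minimal sketch (assumptions noted in the text above): classify a text
# message with a Watson Conversation workspace that defines the
# Directions/Holidays/Hours intents. WORKSPACE_ID and AUTH are placeholders.
import requests

WATSON_URL = "https://gateway.watsonplatform.net/conversation/api"
WORKSPACE_ID = "your-workspace-id"          # placeholder
AUTH = ("your-username", "your-password")   # placeholder service credentials

def ask_watson(text, context=None):
    """Send one user utterance; return (reply_text, updated_context)."""
    resp = requests.post(
        f"{WATSON_URL}/v1/workspaces/{WORKSPACE_ID}/message",
        params={"version": "2017-05-26"},
        auth=AUTH,
        json={"input": {"text": text}, "context": context or {}},
    )
    resp.raise_for_status()
    body = resp.json()
    intent = body["intents"][0]["intent"] if body["intents"] else "unknown"
    reply = " ".join(body["output"]["text"]) or f"(no dialog for intent '{intent}')"
    return reply, body["context"]

reply, ctx = ask_watson("Are you open on Sunday?")
print(reply)   # e.g., "We are open on Sunday from 12:00 to 5:00."
```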


My next decision point had to do with maintaining a conversation over many text messages. For that I chose Avaya’s Context Store, which allows me to temporarily store information about a text conversation. That information can then be accessed over the life of the chat.
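The sketch below is only an illustrative stand-in for that idea, not the Context Store API itself: it stashes each conversation’s Watson context under the sender’s phone number, with a hypothetical idle timeout to mirror the “temporary” nature of the store.

```python
# Illustrative stand-in for Context Store: per-conversation state keyed by
# the sender's phone number. In the real snap-in this would be calls to
# Context Store rather than an in-memory Python dict.
import time

_TTL_SECONDS = 15 * 60   # hypothetical idle timeout for a text conversation
_contexts = {}           # phone number -> (saved_at, watson_context)

def store_context(phone_number, context):
    """Persist the Watson context for this sender so the chat can continue."""
    _contexts[phone_number] = (time.time(), context)

def fetch_context(phone_number):
    """Return the saved context, or None if it is missing or has expired."""
    entry = _contexts.get(phone_number)
    if entry is None or time.time() - entry[0] > _TTL_SECONDS:
        _contexts.pop(phone_number, None)
        return None
    return entry[1]
```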


Now that I had an engine to process incoming text messages (Watson) and a method of maintaining a chat’s context (Context Store), it was time to launch the Avaya Breeze Engagement Designer. I will admit that I still had a few logic problems to work through, but I would not be stretching the truth if I said that I had a rough draft of my text bot up and running in less than an hour. Working through the remaining issues consumed another couple of hours, but in a fraction of the time it would have taken me to write the application in Java, my bot was accepting text messages, building contexts, and texting back replies.


I should also say that my bot is fully multi-user. It didn’t matter if one or one hundred people were all texting in at the same time. My bot kept track of each individual conversation and no one received a text meant for someone else.
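Restated as code, the overall logic the workflow expresses looks roughly like the sketch below. It leans on the ask_watson and context helpers sketched earlier, and send_sms is a placeholder for whatever outbound SMS step the workflow uses; keying everything on the sender’s phone number is what keeps concurrent conversations from bleeding into one another.

```python
# Rough restatement of the bot's flow in code (the actual bot is a Breeze
# Engagement Designer workflow, not Python). ask_watson, fetch_context, and
# store_context are the sketches shown earlier; send_sms is a placeholder
# for the outbound SMS step.
def handle_inbound_sms(sender_number, text):
    context = fetch_context(sender_number)       # None on a brand-new conversation
    reply, context = ask_watson(text, context)   # classify intent, get dialog text
    store_context(sender_number, context)        # keep the thread alive per sender
    send_sms(sender_number, reply)               # answer the texter

def send_sms(to_number, body):
    """Placeholder: hand the reply to your SMS provider here."""
    print(f"SMS to {to_number}: {body}")
```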

 

While my example bot is fairly simple in terms of what it can handle, the framework is extendable to just about any SMS conversation you might want to support. Future plans have me using Context Store to save the entire conversation between human and machine. Not only could this be useful for determining how accurately my bot responds to incoming requests, but it could also be used to help better serve customers. A recorded chat session could be presented to a human agent in cases where the user moves from text to a phone call.

Next, I would love to incorporate some of the other features that Watson provides. For example, by detecting the tone and sentiment of the conversation, my bot could sense if the human was becoming frustrated with the answers he or she was receiving. The bot could then either escalate the chat to a live agent, or have an agent follow up afterward to smooth over what might have been an unpleasant experience – or both.
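As a sketch of that idea, the snippet below sends the accumulated chat text to the Watson Tone Analyzer REST API and flags the conversation for escalation when the anger score crosses a threshold. The endpoint, version date, tone name, and 0.6 threshold are my assumptions based on the Tone Analyzer API of this era, and the credentials are placeholders.

```python
# Hedged sketch: flag a chat for escalation when Watson Tone Analyzer reports
# strong anger in the conversation so far. URL, version date, tone id, and
# the 0.6 threshold are assumptions; credentials are placeholders.
import requests

TONE_URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"
AUTH = ("your-username", "your-password")   # placeholder service credentials

def should_escalate(conversation_text, threshold=0.6):
    """Return True if the document-level anger score exceeds the threshold."""
    resp = requests.post(
        TONE_URL,
        params={"version": "2017-09-21"},
        auth=AUTH,
        json={"text": conversation_text},
    )
    resp.raise_for_status()
    tones = resp.json().get("document_tone", {}).get("tones", [])
    anger = next((t["score"] for t in tones if t["tone_id"] == "anger"), 0.0)
    return anger >= threshold

if should_escalate("That is NOT what I asked. This bot is useless!"):
    print("Escalating chat to a live agent")
```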

Mischief Managed

Human to human conversations aren’t going away anytime soon, but more and more machines are going to step in to handle the easy to moderately hard stuff. The point is not to trick people into thinking they are talking to a human being. The point is that machines can handle tedious jobs without coming across as machines.

While I highly doubt that anyone will ever make a movie about Andrew and his fabulous text bots, it isn’t all about fame and glory, right? This is exciting technology and the fact that I can use Breeze to create sophisticated bots by easily combining powerful, but disparate technologies, is red-carpet stuff.