Team Engagement: The Value is in the Business Processes

For unified communications vendors like Avaya, developing technologies, features, and capabilities represents the table stakes; delivering business value on top of that foundation is what leads to market success.

Developing and using great new technology is thrilling for technology-centric organizations: just say WebRTC, H.265, directory federation, zero-trust security, or even single sign-on in a technically competent circle and watch the conversation blossom.

But technology isn’t enough. Knowing and implementing the APIs and SDKs, protocols and standards, features and capabilities, even graphical and voice-activated user interfaces: these represent the important “table stakes” of developing and delivering unified communications to market. They are not, in my opinion, the route to market success or customer satisfaction.

Many market leaders today are trying to shift the conversation from technologies and capabilities to user experiences. Avaya has embraced this shift in part by referring to our traditional unified communications portfolio as Team Engagement Solutions–focusing on the value we provide to our users, and not the features we deliver.

This is a great first step toward actually articulating the business value of unified communications: concentrating on how information workers use the solution to get their jobs done.

User experience should not be confused with user interface. The user interface design team concerns itself with menu tree optimization, presentation of options, and time/motion studies to access the capabilities of the system; in short, what a person does to interact with a system. User experience starts from that foundation and considers the user’s relationship with the solution and how they integrate that into their daily processes–in short, how they feel when they use the system.

It is possible for a solution to have a world-class user interface and a terrible user experience.

Think of the way Mark Zuckerberg organizes the Facebook offices: Web designers who create the look and feel of Facebook are within arm’s reach of his desk, while the architects and engineers are a layer away. He knows where they are when he needs them, but they are not in his face.

Mirroring his users’ experience, the engineers who create the architecture that makes the user experience possible are hidden from view, just a few steps away from the Web designers who deliver Facebook’s public persona and value.

This brings us to the central issue. As unified communications solutions providers have learned the hard way over the past decade, simply replacing a PBX, interfacing with an email solution, connecting to a corporate directory, and delivering video connections is not enough. Framing your UC solutions as a replacement for existing and developing solutions sets vendors up to compete in a price war, robbing their company of the financial resources required to rapidly deliver the future innovation that enables customers to use communication and collaboration as a competitive advantage.

Knowing the business value of communication and collaboration–the processes that are both critical to achieving competitive success and are most likely to be bogged down by delayed human interaction–is key to selling the value of unified communications or Team Engagement.

Even a large company like Avaya cannot have deep relationships with every buyer, user and decision-maker in the market to drive every sale, particularly when we address the midmarket. This is why Avaya has committed to the channel and supports distributors and resellers that innovate in the delivery of solutions.

As an evangelist for cloud-based solutions, I see that the technical components and capabilities engagement solutions need to succeed in the market are out of the lab and ready to be implemented rapidly across our ecosystem.

With core technology from Avaya Labs delivering world-class capabilities, a cloud-based architecture to deliver upgrades and new services and to manage configurations, and an ‘as a Service’ business model that flexes capabilities and volumes with business needs, OnAvaya solutions allow our business partners to work closely with their customers to solve business problems instead of resolving technology issues.

Avaya has a wide variety of service providers, systems integrators, and cloud-focused companies proving the value of OnAvaya cloud solutions by winning deals. Danish service provider Cirque was serving 5,000 users nine months after going live with an Avaya-powered hosted communications solution.

When HP and Avaya agreed to a partnership to deliver cloud-based engagement solutions, Mike Nefkens, EVP and General Manager of HP Enterprise Services, said, “The partnership with Avaya supports HP’s larger vision for the New Style of IT.”

Avaya is also working with Google to deliver solutions that combine the engagement expertise of Avaya with the cloud expertise of Google.

Service providers and systems integrators are proving this value by winning deals at marquee clients: Verizon is selling OnAvaya cloud-based engagement solutions to federal and state government agencies; HP signed five new contracts with enterprise customers to deploy OnAvaya cloud solutions in the first quarter of our joint cloud initiative; and we are working with Google and its partners to sell OnAvaya solutions to clients like Vegas.com.

At the end of the day, this makes the business partner a trusted advisor who is able to engage in conversations about how to evolve business processes–and that is the great promise of cloud and engagement solutions.

Related Articles:

Continuous Learning: Propelling Forward in a Rapidly and Inevitably Changing World

Whether we realize it or not, advanced technologies like artificial intelligence (AI), augmented reality, and the Internet of Things (IoT) have transformed the way we think about the world around us. From how we protect our schools to the way we navigate our streets to how we shop for groceries, such technology now lies at the heart of practically everything we do today.

Just as these technologies have changed the way we live, they have changed the way we work. Today’s rapid pace of innovation has transformed nearly every business task, process, and workflow imaginable—so much so that industry analysts estimate that up to 45% of activities that employees are paid to perform can now be automated.

This digital disruption, or what many are calling the Fourth Industrial Revolution, without question redefines traditional roles and responsibilities. In fact, research shows that within five years, more than one third of the skills considered important in today’s workforce will have changed. What’s more, analysts estimate that 65% of children today will grow up to work in roles that don’t yet exist.

While we still see employees who specialize in one skill or area of expertise, we’ve mostly moved away from the days of hiring an employee for just one job. As technology evolves, so too do the skills required to innovate and propel forward. Looking ahead, employees must have a propensity for continuous learning and for adopting new skills if they are to recognize and respond to today’s speed of digital change.

Consider how technology has changed the marketing paradigm. As recently as 10 years ago, marketing platforms like Marketo and HubSpot had only just been founded, Facebook was still in its infancy, and the first iPhone had newly hit the market. As technologies like cloud, social, mobile and big data evolved, however, we suddenly began seeing new tools specifically designed to enhance digital media, social media marketing, and mobile marketing. As a result, companies began hiring for roles like social media coordinator, digital campaign manager and integrated marketing planner: jobs that were unfathomable 15 to 20 years earlier.

Fast forward to today and we’re seeing the emergence of new technology for marketing, such as augmented reality, geofencing, and emotion detection. The continual emergence of new technology perpetually creates skills gaps that must be filled by employees who are passionate, motivated, and invested in their own learning. These kinds of team members are committed to developing new skills and leveraging their strengths to outperform.

But not all employees can easily identify their strengths or develop new skills. This is likely why nearly half of employees today feel unengaged at work, with nearly 20% feeling “actively disengaged.” At the same time, companies are struggling to align employee strengths with organizational priorities. Employees may have certain strengths, but employers may find those skills don’t directly increase operational efficiency or performance. This is why nearly 80% of businesses are more worried about a talent shortage today than they were two years ago.

So, what’s the answer? Employees and employers must work together to identify which roles are currently filled, what skills are still needed, and who best exemplifies those skills. For employees, this means taking control of how they grow their careers. For employers, it means displaying an unwavering commitment to employee reinvestment by understanding key areas of interest in order to effectively fill skills gaps.

At Avaya, for example, we’re leading an employee enablement program under our Marketing 3.0 strategy. The initiative is designed to strengthen our marketing organization by equipping employees with the right competencies, ones that reflect our culture, strategy, expectations and market dynamics. By doing so, we can ensure we’re recruiting and managing talent in the most strategic way, putting the right people in the right jobs with the abilities to perform at maximum potential every day. By having each marketing function participate in a simple knowledge profile exercise, we can begin objectively determining the development opportunities that best meet employees’ needs and the needs of our business.

As technology continuously evolves, it’s crucial that employees have a propensity for continuous learning and that organizations foster an environment for this learning. In the words of former GE CEO Jack Welch, “An organization’s ability to learn, and translate that learning into action rapidly, is the ultimate competitive advantage.”

We live in a world that is rapidly and inevitably changing. Employees should embrace this change to thrive, and they must if they wish to propel business forward. As employers, we are responsible for strategically leveraging our resources to align employee strengths with organizational needs so we can succeed in this environment of constant change.

Next-Generation IT: What Does It Really Look Like?

From mainframes to virtualization to the IoT, we’ve come a long way in a very short time in networking, operating systems and applications. All this progress has led us to an inflection point of digital business innovation: a critical time in history where, as Gartner puts it, enterprises must “recognize, prioritize and respond at the speed of digital change.” Despite this, many businesses still rely on legacy systems that prevent them from growing and thriving. So, what’s the deal?

I attempted to answer this in a previous blog, where I laid out as completely as I could the evolution of interconnectivity leading up to today. That blog ultimately concluded that we have reached a point where we can finally eliminate dependency on legacy hardware and hierarchical architecture by using one single, next-generation software platform. The call for organizations across all industries to migrate off legacy hardware has never been stronger, and the good news is that technology has evolved to a point where they can now effectively do so.

This concept of a “next-generation platform,” however, isn’t as simple as it sounds. Just consider its many variations among industry analysts. McKinsey & Company, for example, refers to this kind of platform as “next-generation infrastructure” (NGI). Gartner, meanwhile, describes it as the “New Digital Platform.” We’re seeing market leaders emphasizing the importance of investing in a next-generation platform, yet many businesses still wonder what the technology actually looks like.

To help make it clearer, Avaya took a comparative look at top analyst definitions and broke them down into five key areas of focus for businesses industry-wide: 

  1. Next-generation IT
  2. The Internet of Things (IoT)
  3. Artificial intelligence (AI)/automation
  4. Open ecosystem
  5. The customer/citizen experience

In a series of upcoming blogs, I’ll be walking through these five pillars of a next-generation platform, outlining what they mean and how they affect businesses across every sector. So, let’s get started with the first of these: next-generation IT.

Simplifying Next-Gen IT

As IT leaders face unrelenting pressure to elevate their infrastructure, next-generation IT has emerged as a way to enable advanced new capabilities and support ever-growing business needs. But what does it consist of? Well, many things. The way we see it, however, next-generation IT is defined by four core elements: secure mobility, any-cloud deployment (more software), omnichannel and big data analytics—all of which are supported by a next-generation platform built on open communications architecture.

Secure mobility: Most digital growth today stems from mobile usage. Just consider that mobile now represents 65% of all digital media time, with the majority of traffic for over 75% of digital content (health information, news, retail, sports) coming from mobile devices. Without question, the ability to deliver a secure mobile customer/citizen experience must be part of every organization’s DNA. This means enabling customers to securely consume mobile services anytime, anywhere and however desired, with no physical connectivity limitations. Whether they’re on a corporate campus connected to a dedicated WLAN, at Starbucks connected to a Wi-Fi hotspot, or on the road paired to a Bluetooth device through cellular connectivity, the connection must always be seamless and secure. Businesses must start intelligently combining carrier wireless technology with next-generation Wi-Fi infrastructure to make service consumption more secure and mobile-minded, with seamless hand-off between the two technologies.

Any-cloud deployment: Organizations should be able to seamlessly deploy any application or service under any cloud deployment model (hybrid, public or private). To enable this, businesses must sufficiently meet today’s requirements for any-to-any communication. As I discussed in my previous blog, the days of node-by-node configuration of virtualized services are a thing of the past; any-to-any communication has won the battle. A next-generation platform built on open communications architecture is integrated, agile, and future-proof enough to effectively and securely support a services-based ecosystem. The transition toward software services is highly desirable, but remember that not all hardware will disappear, although eliminating it should certainly be considered wherever possible. This services-based design is the underlying force behind many of today’s greatest digital developments (smart cars, smart cities). It’s what allows organizations across every sector to deliver the most value possible to end users.

Omnichannel: All communication and/or collaboration platforms must be omnichannel enabled. This is not to be confused with multi-channel: whereas the latter represents a siloed, metric-driven approach to service, the former is inherently designed to provide a 360-degree customer view, supporting the foundation of true engagement. An omnichannel approach also gives businesses the contextual and situational awareness needed to drive anticipatory engagement at the individual account level. This means knowing that a customer has been on your website for the last 15 minutes looking at a specific product, one they inquired about during a live chat session with an agent two weeks ago. This kind of contextual data needs to be brought into the picture to add value and enhance the experience of those you serve, regardless of where the interaction first started.
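
To make this concrete, here is a minimal sketch in Python of what merging channel events into a single customer view might look like. Everything in it (class names, fields, the sample events) is a hypothetical illustration, not an Avaya API:

```python
from dataclasses import dataclass, field

@dataclass
class Interaction:
    channel: str      # "web", "chat", "voice", "email", ...
    timestamp: str    # ISO 8601 strings sort chronologically
    detail: str

@dataclass
class CustomerContext:
    """One 360-degree view per customer, merged across channels."""
    customer_id: str
    interactions: list[Interaction] = field(default_factory=list)

    def record(self, interaction: Interaction) -> None:
        self.interactions.append(interaction)
        self.interactions.sort(key=lambda i: i.timestamp)

    def recent_history(self, n: int = 5) -> list[Interaction]:
        return self.interactions[-n:]

# An agent answering today's chat sees the earlier interactions too,
# regardless of which channel they came from.
ctx = CustomerContext("cust-42")
ctx.record(Interaction("chat", "2017-05-01T10:00:00", "asked about product X"))
ctx.record(Interaction("web", "2017-05-15T09:45:00", "viewed product X for 15 minutes"))
for i in ctx.recent_history():
    print(i.channel, i.timestamp, i.detail)
```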

Big data analytics: It’s imperative that you strategically use the contextual data within your organization to compete on the customer experience (CX). A huge part of next-generation IT involves seamlessly leveraging multiple databases and analytics capabilities to transform business outcomes (and ultimately, customers’ lives). This means finally breaking silos to tap into the enormous amount of data, structured and unstructured, historical and real-time, at your disposal. Just as importantly, it means employees being able to openly share, track, and collect data across various teams, processes, and customer touchpoints. This level of data visibility means a hotel being able to see that a guest’s flight got delayed, enabling the on-duty manager to let that customer know that his or her reservation will be held. It means a bank being able to push out money management tips to a customer after seeing that the individual’s last five interactions were related to account spending.
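
The hotel example can be sketched the same way: a toy publish/subscribe bus in Python, where one silo’s event (an airline feed) triggers an action in another (the reservation system). The event names and the handler are hypothetical stand-ins, not any real product’s API:

```python
from collections import defaultdict
from typing import Callable

# A tiny in-process pub/sub bus standing in for cross-silo data sharing.
handlers: defaultdict[str, list[Callable[[dict], None]]] = defaultdict(list)

def subscribe(event_type: str, handler: Callable[[dict], None]) -> None:
    handlers[event_type].append(handler)

def publish(event_type: str, event: dict) -> None:
    for handler in handlers[event_type]:
        handler(event)

# The hotel's reservation system reacts to an event from the airline feed.
def hold_reservation(event: dict) -> None:
    print(f"Holding reservation for guest {event['guest']} "
          f"(flight {event['flight']} delayed)")

subscribe("flight.delayed", hold_reservation)
publish("flight.delayed", {"guest": "J. Doe", "flight": "AA123"})
```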

These four components are critical to next-generation IT as part of a next-generation digital platform. Organizations must start looking at each of them if they wish to compete on CX and respond at the speed of digital change. Stay tuned: next we’ll be talking about the ever-growing Internet of Things!

How to (Finally) Break the Longstanding Hold of Legacy Technology

Without question, we’ve seen more technological innovation in the past 30 years than in the century before. We now live in a reality of seemingly limitless possibilities and outcomes. Today, virtually any object can be considered part of an advanced, interconnected ecosystem. Companies across every sector are competing to reimagine customer engagement. The user experience is fundamentally changing as people, processes and services become more dynamically connected. Today’s smart, digital era represents unmatched opportunity for forward-thinking business leaders everywhere.

At the same time, however, it poses some challenges. Specifically, this rapid pace of innovation means businesses must find a way to quickly and efficiently modernize in order to competitively differentiate. In a time when digital disruptors are building custom IT environments on the fly, companies can no longer let legacy architecture dampen innovation and agility.

Businesses know this all too well, with 90% of IT decision makers believing that legacy systems prevent them from harnessing the digital technologies they need to grow and thrive. This is especially true in industries like government and finance, where there’s still a heavy dependency on legacy technology. For example, 71% of federal IT decision makers still use old operating systems to run important applications. Meanwhile, 30% of senior investment managers say they’re concerned about the ability of their current legacy systems to meet future regulatory requirements. The list goes on.

It’s clear that something needs to be done here, and fast. So, how exactly did we get to this point of digital disruption, and what can be done about legacy systems today? Let’s take a walk through recent history, and then discuss how companies can begin moving towards digital, next-generation IT.

Data Centralization to Decentralization

Let’s start with how applications were first consumed. About 30 to 40 years ago, all application intelligence was centralized (I’m sure some of you remember the good old mainframe days of using dumb terminals or emulators to access applications and store data centrally). There were some notable benefits to centralizing data this way. There weren’t many issues with storage distribution, for instance, and disaster recovery procedures were clearly documented. Security challenges were also practically nonexistent because there wasn’t any local storage on the terminal (hence, dumb).

Soon, however, we saw the rise of the personal computer, which completely changed this model. Computing and storage could now be distributed, allowing local applications to run without any centralized dependency. This was a game-changer that sparked a desktop war between key market players like Microsoft (Windows), IBM (OS/2), and Apple (Mac OS).

This transition to decentralization, however, wasn’t without its challenges. Employees may have gained mobility, but IT began facing new challenges in security and distributed storage. Companies were left wondering how best to control their data storage, especially since confidential information could easily be stored on floppy disks, local hard drives and, later, USB drives. This remains a challenge to this day: no one wants to give up their mobility, so companies must instead find a way to regain control.

One thing to note: at this point, commercial off-the-shelf (COTS) servers could now be used. These systems were far less proprietary than previous host systems like mainframes and VAXes. However, they were still hardware-dependent, as each platform was usually tailored to the applications it had to run. As a result, a good amount of compute, memory and storage resources was not being fully utilized; in fact, some servers were running at only 10-20% capacity. While there were benefits to COTS servers, the industry needed a better way to maximize the use of all resources.
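
To see how much capacity sat idle, consider a back-of-the-envelope calculation in Python (the fleet size here is hypothetical; the utilization figure is drawn from the 10-20% range above):

```python
import math

# A hypothetical fleet: 20 dedicated application servers, each averaging
# 15% utilization (the middle of the 10-20% range cited above).
dedicated_servers = 20
avg_utilization = 0.15
target_utilization = 0.75  # a comfortable ceiling for a consolidated host

total_load = dedicated_servers * avg_utilization           # 3.0 servers' worth of work
hosts_needed = math.ceil(total_load / target_utilization)  # 4 hosts suffice

print(f"{dedicated_servers} dedicated servers collapse onto {hosts_needed} hosts")
```

In other words, roughly four well-utilized machines could carry the load of twenty, which is exactly the opportunity virtualization went on to capture.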

The Rise of Virtualization

The only viable solution to these problems was to eliminate the dependency on dedicated hardware in favor of one single software layer. But how? The market experienced profound change as companies strove to answer this question, eventually leading to the emergence of virtualization.

During this time, market leaders like VMware began transforming the industry by allowing multiple virtualized operating systems (virtual machines) to run simultaneously on the same hardware. In this way, applications ran as if they had their own dedicated compute, memory and storage; in reality, it was all shared. Simply put, the hardware server had become virtualized. Brilliant!

This allowed companies to create virtual representations of resources such as compute, memory and storage devices. Companies could now run multiple applications over the same physical hardware, in a way that appeared to the applications as though they were running over their own dedicated hardware. More importantly, companies could now fully leverage every single resource at their disposal. Nothing would be left dormant or unused in this virtualized model, unlike what we saw in the past with a dedicated appliance/server per application.

At this point, it was a no brainer to move into the virtualized application world. However, the ugly truth remained: we were still using a legacy networking framework. Many continue to refer to this as client-server, but the bottom line is that it was a hierarchical model that required each node and link to be configured to carry or simulate end-to-end virtualization. Even though the application environment was virtualized, the infrastructure on which it ran was not built with that in mind. It didn’t matter if you were using VLANs, VRFs or even MPLS—it was a complex way of providing end-to-end virtualized services.

Who would finally be able to solve this issue? It seemed the Institute of Electrical and Electronics Engineers (IEEE) and the Internet Engineering Task Force (IETF) were on the right track with the standardization of an Ethernet protocol that allows end-to-end services virtualization, which finally took place in May 2012. This is known as Shortest Path Bridging, or SPB (IEEE 802.1aq and IETF RFC 6329, for those interested). And there you have it: servers, applications and networks are now finally virtualized! Are we done? Well, not quite: even desktops are now being virtualized through VDI (Virtual Desktop Infrastructure) to re-centralize control.
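
SPB itself is a link-state Ethernet technology, but the principle at its heart, namely every node computing shortest paths through a meshed fabric, can be illustrated with a classic Dijkstra sketch in Python. The four-switch topology is hypothetical; this shows the idea, not the protocol:

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra's algorithm: the same shortest-path computation that
    SPB's IS-IS control plane performs across an Ethernet fabric."""
    queue = [(0, src, [src])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in seen:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []

# A small meshed fabric: four switches with equal-cost links.
fabric = {
    "sw1": {"sw2": 1, "sw3": 1},
    "sw2": {"sw1": 1, "sw4": 1},
    "sw3": {"sw1": 1, "sw4": 1},
    "sw4": {"sw2": 1, "sw3": 1},
}
print(shortest_path(fabric, "sw1", "sw4"))  # (2, ['sw1', 'sw2', 'sw4'])
```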

Overall, virtualization became the de facto model that allowed businesses to run applications on what we know as the cloud. With private and public models, customers could now choose which assets they wanted to own (that is, manage on premises) and which to have hosted in the public cloud. Soon, however, the challenge became how to run apps in these clouds. Companies quickly discovered the need to keep some workloads (like regulatory and compliance data) in an onsite private cloud, while other data was best suited to the public cloud. This is how the hybrid cloud deployment model was born.

Cloud Elasticity

Hybrid cloud allowed companies to operate in an environment that strategically utilized the best of both worlds, on-premises private cloud and third-party public cloud services, to meet their core objectives. In this new world of cloud orchestration, we saw the rise of digital giants like Amazon, Google and Facebook. With a high level of cloud elasticity, providers could now spin up a series of virtual applications or services in less than an hour and run them in the public cloud. This threw open the doors of opportunity for companies everywhere: organizations could create new instances on the fly and shut them down just as quickly. This elasticity is used, for example, to soft-launch new products or test-drive business in new marketplaces.
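
As a minimal sketch of that elasticity, assuming an AWS account with the boto3 SDK installed and credentials configured (the AMI ID below is a placeholder), launching and tearing down an instance takes only a few calls:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Spin up a short-lived instance, e.g., to soft-launch or load-test a service.
# The AMI ID is a placeholder; substitute an image from your own account.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")

# ... run the soft launch or market test ...

# Shut it down just as quickly once the experiment is done.
ec2.terminate_instances(InstanceIds=[instance_id])
```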

But let’s not forget the issue that remains to this day: we have yet to completely move away from all aging hardware. In today’s world of any-to-any communication, driven by technologies like the IoT, artificial intelligence, and machine learning, legacy hardware and hierarchical networking architecture are not just an inconvenience. They can break your business if you don’t have a strategy to reduce that dependency.

Finally Breaking Free of Hardware

The bottom line is that any-to-any communication has won the battle (unlike 15 years ago, when IT largely resisted and essentially shut down the peer-to-peer model). As a result, what many refer to as “meshed communication architecture” has emerged as the newest and strongest-yet approach to network design.

This kind of architecture is integrated, agile and future-proof enough to effectively and securely support a services-based ecosystem. The days of node-by-node configuration of virtualized services are a thing of the past. It’s vital that companies move to this services-based architecture to support the future of the customer experience. Consider how essential it is for supporting smart cars that can autonomously park and change lanes while being redirected to alternate routes because of traffic congestion. It’s critical for supporting smart home solutions that enable homeowners to remotely manage utility usage. And it’s crucial for delivering the most value possible to those who matter most: end users.

For decades, we’ve been trying to eliminate a deep-rooted dependency on hardware. To finally break the silos associated with hardware, companies must begin setting themselves up to support any-to-any communication. In this environment, all services can run virtually anywhere, across multiple, geographically dispersed sources of hardware.

Now that we know what can be done about legacy systems (transition to an open, software-enabled, meshed architecture), let’s discuss how companies can successfully integrate digital into their existing environment to transform business. Stay tuned for more.