How to (Finally) Break the Longstanding Hold of Legacy Technology

Without question, we’ve seen more technological innovation in the past 30 years than in the entire century before it. We now live in a reality of seemingly limitless possibilities and outcomes. Today, virtually any object can be considered part of an advanced, interconnected ecosystem. Companies across every sector are competing to reimagine customer engagement. The user experience is fundamentally changing as people, processes and services become more dynamically connected. Today’s smart, digital era represents unmatched opportunity for forward-thinking business leaders everywhere.

At the same time, however, it poses some challenges. Specifically, this rapid pace of innovation means businesses must find a way to modernize quickly and efficiently in order to differentiate themselves competitively. At a time when digital disruptors are building custom IT environments on the fly, companies can no longer let legacy architecture dampen innovation and agility.

Businesses know this all too well, with 90% of IT decision makers believing that legacy systems prevent them from harnessing the digital technologies they need to grow and thrive. This is especially true in industries like government and finance, where there’s still a heavy dependency on legacy technology. For example, 71% of federal IT decision makers still use old operating systems to run important applications. Meanwhile, 30% of senior investment managers say they’re concerned about the ability of their current legacy systems to meet future regulatory requirements. The list goes on.

It’s clear that something needs to be done here, and fast. So, how exactly did we get to this point of digital disruption, and what can be done about legacy systems today? Let’s take a walk through recent history, and then discuss how companies can begin moving towards digital, next-generation IT.

Data Centralization to Decentralization

Let’s start where applications were first consumed. About 30 to 40 years ago, all application intelligence was centralized (I’m sure some of you remember the good old mainframe days of using dumb terminals or emulators to access applications and store data centrally). There were some notable benefits to centralizing data in this fashion. There weren’t many issues with storage distribution, for instance, and disaster recovery procedures were clearly documented. Security challenges were also practically nonexistent because there wasn’t any local storage on the terminal (hence, “dumb”).

Soon, however, we saw the rise of the personal computer, which completely changed this model. Computing and storage could now be distributed, allowing local applications to run without any centralized dependency. This was a game-changer that sparked a desktop war between key market players like Microsoft (Windows), IBM (OS/2), and Apple (Mac OS).

This transition to decentralization, however, wasn’t without its challenges. Employees may have gained mobility, but IT began facing new challenges in security and distributed storage. Companies were left wondering how best to control their data, especially since confidential information could easily be stored on floppy disks, local hard drives and, later, USB drives. This remains a challenge to this day—no one wants to give up their mobility, so companies must find a way to instead regain control.

One thing to note: at this point, COTS (commercial off-the-shelf) servers could now be used. These systems were far less proprietary than previous host systems like mainframes, VAX, etc. However, they were still hardware-dependent, as each platform was usually tailored to the applications it had to run. As a result, a good amount of compute, memory and storage resources were not being fully utilized. In fact, some servers were running at only 10-20% capacity. While COTS servers had their benefits, they highlighted the need for a better way to maximize the use of all resources.
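A quick back-of-the-envelope calculation shows how much capacity that utilization figure leaves on the table (the server counts and targets below are hypothetical, chosen only to illustrate the arithmetic):

```python
# Hypothetical example: ten dedicated servers, each averaging 15% CPU
# utilization (the middle of the 10-20% range cited above), versus
# consolidating the same workloads onto fewer, busier hosts.
servers = 10
avg_utilization = 0.15

# Total useful work, expressed in "fully busy server" units.
useful_capacity = servers * avg_utilization  # 1.5 servers' worth of work

# If we instead target a safer 60% average utilization per host,
# the same workloads need far fewer machines.
target_utilization = 0.60
hosts_needed = useful_capacity / target_utilization

print(f"Useful work: {useful_capacity:.1f} server-equivalents")
print(f"Hosts needed at 60% utilization: {hosts_needed:.1f}")
```

In this sketch, ten underused machines collapse into roughly three hosts, which is exactly the consolidation opportunity virtualization later exploited.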

The Rise of Virtualization

The only viable solution to these problems was to decouple applications from the hardware beneath them through a single software abstraction layer. But how? The market experienced profound change as companies strove to answer this question, eventually leading to the emergence of virtualization.

During this time, market leaders like VMware began transforming the industry by allowing multiple virtualized operating systems (virtual machines) to run simultaneously on the same hardware. In this way, applications ran as if they had their own dedicated compute, memory and storage. In reality, it was all being shared. Simply put, the hardware server had become virtualized. Brilliant!

This allowed companies to create virtual representations of resources such as compute, memory and storage devices. Companies could now run multiple applications over the same physical hardware, in a way that appeared to the applications as though they were running over their own dedicated hardware. More importantly, companies could now fully leverage every single resource at their disposal. Nothing would be left dormant or unused in this virtualized model, unlike what we saw in the past with a dedicated appliance/server per application.
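The consolidation idea described above can be sketched in a few lines of code. This is an illustrative toy model only (the host sizes and VM names are invented, and real hypervisors schedule resources far more dynamically), but it captures the core bin-packing intuition: several VMs with modest demands share one physical host instead of each occupying a dedicated server.

```python
from dataclasses import dataclass

@dataclass
class VM:
    name: str
    vcpus: int
    memory_gb: int

class Host:
    """A physical server whose compute and memory are shared by VMs."""
    def __init__(self, cpus: int, memory_gb: int):
        self.cpus = cpus
        self.memory_gb = memory_gb
        self.vms: list[VM] = []

    def can_place(self, vm: VM) -> bool:
        used_cpu = sum(v.vcpus for v in self.vms)
        used_mem = sum(v.memory_gb for v in self.vms)
        return (used_cpu + vm.vcpus <= self.cpus
                and used_mem + vm.memory_gb <= self.memory_gb)

    def place(self, vm: VM) -> bool:
        if self.can_place(vm):
            self.vms.append(vm)
            return True
        return False

# Three workloads that once needed three dedicated servers
# now share a single 16-CPU, 64 GB host.
host = Host(cpus=16, memory_gb=64)
for vm in [VM("web", 4, 8), VM("db", 8, 32), VM("cache", 2, 8)]:
    host.place(vm)

print(f"{len(host.vms)} VMs consolidated on one host")
```

Nothing is left dormant: the host’s capacity check is what lets every resource be allocated up to its limit, rather than sitting idle behind a one-application-per-server wall.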

At this point, it was a no-brainer to move into the virtualized application world. However, the ugly truth remained: we were still using a legacy networking framework. Many continue to refer to this as client-server, but the bottom line is that it was a hierarchical model that required each node and link to be configured to carry or simulate end-to-end virtualization. Even though the application environment was virtualized, the infrastructure on which it ran was not built with that in mind. It didn’t matter if you were using VLANs, VRFs or even MPLS—it was a complex way of providing end-to-end virtualized services.

Who would finally be able to solve this issue? It seemed the Institute of Electrical and Electronics Engineers (IEEE) and Internet Engineering Task Force (IETF) were on the right track with the standardization of an Ethernet protocol that allows end-to-end service virtualization, which was finalized in May 2012. This is known as SPB, or Shortest Path Bridging (IEEE 802.1aq and IETF RFC 6329, for those interested). And there you have it: servers, applications and networks are now finally virtualized! Are we done? Well, not quite … even desktops are being virtualized through VDI (Virtual Desktop Infrastructure) to re-centralize control.
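As the name suggests, the computation at the heart of SPB is a shortest-path calculation over the bridge topology (SPB itself runs IS-IS link-state routing at Layer 2; this is not the protocol, just a toy illustration of the shortest-path idea on a hypothetical four-bridge fabric):

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra's algorithm: graph is {node: {neighbor: link_cost}}."""
    dist = {src: 0}
    prev = {}
    pq = [(0, src)]
    visited = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        if node == dst:
            break
        for nbr, cost in graph[node].items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    # Walk the predecessor chain back from dst to recover the path.
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]

# Hypothetical bridge fabric with per-link costs.
fabric = {
    "A": {"B": 1, "C": 1},
    "B": {"A": 1, "D": 1},
    "C": {"A": 1, "D": 2},
    "D": {"B": 1, "C": 2},
}
path, cost = shortest_path(fabric, "A", "D")
print(path, cost)  # ['A', 'B', 'D'] 2
```

Because every bridge computes the same shortest paths from the same link-state view, services can be provisioned at the edge only, with no node-by-node configuration of the core, which is precisely the operational pain SPB was designed to remove.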

Overall, virtualization became the de facto model that allowed businesses to run applications on what we know as the Cloud. With private and public models, customers could now choose what assets they wanted to own (that is, manage on premises) or have hosted through the public cloud. Soon, however, the challenge became how to run apps in these clouds. Companies quickly discovered the need to store some applications (like regulatory and compliance data) in an onsite private cloud. Meanwhile, other data was best suited for the public cloud. This is how the hybrid cloud deployment model was born.

Cloud Elasticity

Hybrid cloud allowed companies to operate in an environment that strategically utilized the best of both worlds—both on-premises private cloud and third-party public cloud services—to meet their core objectives. In this new world of cloud orchestration, we saw the rise of digital giants like Amazon, Google and Facebook. With a high level of cloud elasticity, providers could now spin up a series of virtual applications or services in less than an hour and run them in the public cloud. This opened the doors of opportunity for companies everywhere. These providers allowed organizations to create new instances on the fly and shut them down just as quickly. This elasticity is used, for example, to soft-launch new products or test-drive business in new marketplaces.
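The scale-up-fast, shut-down-just-as-quickly behavior described above boils down to a simple feedback loop. Here is a minimal sketch of that autoscaling decision (the thresholds and load figures are invented for illustration; this is not any provider's actual API):

```python
def desired_instances(current: int, cpu_load: float,
                      scale_up_at: float = 0.75,
                      scale_down_at: float = 0.25,
                      min_n: int = 1, max_n: int = 20) -> int:
    """Decide how many instances to run given current average load."""
    if cpu_load > scale_up_at:
        current += 1   # spin up a new instance on the fly
    elif cpu_load < scale_down_at and current > min_n:
        current -= 1   # shut one down just as quickly
    return max(min_n, min(max_n, current))

# A load spike during a hypothetical soft launch, then a lull.
n = 2
for load in [0.50, 0.80, 0.90, 0.60, 0.20, 0.10]:
    n = desired_instances(n, load)
print(n)  # capacity grows for the spike, then settles back to 2
```

The elasticity the cloud providers sell is essentially this loop run continuously against live metrics, so that capacity (and cost) tracks demand instead of being provisioned for the worst case.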

But let’s not forget the issue that remains to this day: we have yet to completely move away from all aging hardware. In today’s world of any-to-any communication, driven by technologies like the IoT, artificial intelligence, and machine learning, legacy hardware and hierarchical networking architecture are not just an inconvenience. They can break your business if you don’t have a strategy to reduce that dependency.

Finally Breaking Free of Hardware

The bottom line is that any-to-any communications have won the battle (unlike 15 years ago, when IT largely resisted and essentially shut down the peer-to-peer model). As a result, what many refer to as “meshed communication architecture” emerged as the newest and strongest approach yet to network design.

This kind of architecture is integrated, agile and future-proof enough to effectively and securely support a services-based ecosystem. The days of node-by-node configuration of virtualization are a thing of the past. It’s vital that companies move to this services-based architecture to be able to support the future of the customer experience. Consider how it’s essential for supporting smart cars that can autonomously park and change lanes, while being redirected to alternate routes because of traffic congestion. It’s critical for supporting smart home solutions that enable homeowners to remotely manage utility usage. It’s crucial for delivering the most value possible to those who matter most: end-users.

For decades, we’ve been trying to eliminate a fundamental dependency on hardware. To finally break the silos associated with hardware, companies must begin setting themselves up to support any-to-any communication. In this environment, all services can run virtually anywhere, across multiple sources of hardware that can be geographically dispersed.

Now that we know what can be done about legacy systems (transition to an open, software-enabled, meshed architecture), let’s discuss how companies can successfully integrate digital into their existing environment to transform business. Stay tuned for more.

Related Articles:

3 CX Stats That May Change How You Think About Digital Transformation

Technologies like Artificial Intelligence, automation, big data, and the Internet of Things have made digital transformation an absolute necessity for organizations. With people, processes, services and things more dynamically connected than ever, companies are feeling relentless pressure to digitize, simplify, and integrate their organizational structures to remain competitive.

But there’s a big hole in the fabric of most digital transformation (DX) plans: the customer experience (CX). The problem isn’t that companies fail to understand the importance of the CX in relation to digital transformation. Rather, most fail to understand their customers well enough to envision a truly customer-centric, digitally-transformed environment. Just consider that 55% of companies cite “evolving customer behaviors and preferences” as their primary driver of digital change. Yet, the number one challenge facing executives today is understanding customer behavior and impact.

A massive part of digital transformation involves building a CX strategy, and yet customer centricity remains a top challenge for most. In fact, I encourage you to be your own customer within your organization. Walk in your customers’ shoes, contact your organization as your customers would. What was your web experience? Was the expert knowledgeable during a chat conversation? How well did the mobile app work for you? Did you have a connected experience? Given your experience, how brand-loyal would you be to your organization?

Here are three statistics that will get you rethinking your CX strategy in relation to digital transformation:

  1. 52% of companies don’t share customer intelligence outside of the contact center. In other words, over half of companies are limiting the customer journey to the contact center even though it naturally takes place across multiple key areas of business (i.e., sales, marketing, HR, billing). Businesses must ensure customers are placed with the right resource at the right time, whether it’s in a contact center or non-contact center environment. The key is being able to openly share customer data across all teams, processes and customer touchpoints.
  2. 60% of digital analytics investments will be spent on customer journey analytics by 2018. Customer journey analytics—the process of measuring the end-to-end customer journey across the entire organization—is critical in today’s smart, digital world. Companies are rapidly investing in this area to identify opportunities for process improvement, digitization, automation and, ultimately, competitive differentiation.
  3. 60% of customers change their contact channel depending on where they are and what they’re doing. This means organizations must focus less on service and more on contextual and situational awareness. Businesses must work to create a seamless experience—regardless of device, channel, location or time—supported by customer, business and situational context captured across all touchpoints.
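The data-sharing idea behind the first two statistics above can be sketched concretely: customer journey analytics amounts to stitching events from every touchpoint into one per-customer timeline that any team can query (all names, channels and events below are hypothetical):

```python
from collections import defaultdict

# Events captured by different, normally siloed systems.
events = [
    {"customer": "c1", "channel": "web",            "ts": 1, "action": "viewed pricing"},
    {"customer": "c1", "channel": "chat",           "ts": 2, "action": "asked question"},
    {"customer": "c2", "channel": "mobile",         "ts": 1, "action": "opened app"},
    {"customer": "c1", "channel": "contact_center", "ts": 3, "action": "placed order"},
]

# Stitch events into one chronological journey per customer.
journeys = defaultdict(list)
for e in sorted(events, key=lambda e: e["ts"]):
    journeys[e["customer"]].append((e["channel"], e["action"]))

# c1's end-to-end journey spans three channels, not just the
# contact center -- which is the point of sharing the data.
print(journeys["c1"])
```

Once the journey is assembled this way, questions like “where do customers switch channels?” or “which touchpoint precedes churn?” become simple queries rather than cross-departmental archaeology.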

The CX should influence every company’s digital transformation story. For more tips, insights, and impactful statistics check out our eBook, Fundamentals of Digital Transformation. Let me know what you think. We look forward to hearing from you.

What It Takes to Be a Technology Leader in an Evolving Digital World

The definition of a leader varies greatly, especially in business. From my perspective, a leader is defined by their ability to pivot and adapt to the evolution of a market. Like many companies today, Avaya, its customers and partners are riding the often daunting—yet consistently exciting—wave of digital transformation of the enterprise. As a technology leader, Avaya is not only pivoting and adapting to this new environment for itself, but pivoting and adapting our services and solutions to enable its customers and partners to thrive during their own transformations.

Unlike many of our competitors, digital transformation is something we saw coming years ago. We recognized right away that it wasn’t a passing fad but something that could truly transform how business gets done, with communications playing the most important role. We knew that for us to be successful, we would need to focus on transforming ourselves first so that our customers and partners could learn and benefit from our experiences, our lessons learned. During our own transformation, we gained that extra insight that we were able to leverage in the development of truly transformational solutions and services.

As we drove our own multi-year transformation, we also maintained our global market share leadership position in Contact Center. According to Canalys research, we hold more than 34% of the market, nearly as much as the No. 2 and No. 3 competitors combined. No technology leader gets to claim this size of market share without making its customers a priority.

Last month for example, we hosted a private event in New York as part of our Future of Communication Experience series. The purpose was to update and inform specially invited customers about our portfolio roadmap and vision. We encouraged them to come with questions and to be prepared to have real, in-depth conversations about the challenges they’re facing during their own transformations. As always, it was a great experience for the customers and Avayans in attendance. Overall, customers from world-leading payment brands, to high-end retail chains, to players in the automotive industry said that they are very optimistic, confident and excited about what we have to offer today and what we have planned for the future. And next month we will be in Mexico City for our twelfth consecutive year with 3,000 Avaya customers and partners from all over the Americas. This is the largest customer and partner event we do all year.

In particular, two of the solutions our customers are most excited about are Avaya Oceana™ omnichannel contact center and Avaya Breeze™ development platform. These same solutions were recently touted as visionary by a global analyst firm as part of its latest Magic Quadrant ranking.

Avaya Oceana, which was launched last year, adds advanced multi-channel functionality to our own contact center solutions, such as Avaya Aura® Call Center Elite voice platform and Avaya Aura® Contact Center. It also integrates with third-party automatic call distribution solutions, as well as offering advanced reporting and customer journey mapping capabilities through Avaya Oceanalytics™ insights. Specifically, we have been told by analysts that Oceana’s new approach to routing—which is attribute matching so that it includes data consideration and customer journey mapping—is something our competitors simply can’t offer.

The Avaya Breeze platform, which Avaya Oceana was built upon, enables users to be flexible when responding to the ever-evolving digital marketplace. It has garnered industry recognition for its ability to enable developers to quickly create unique communications-enabled contact center applications and workflows for within and beyond the enterprise—with little or no development required and nearly instant deployment. We are seeing customers use Avaya Breeze to create unique applications tailored to their specific business and communications needs.

According to Irwin Lazar, Vice President and Service Director at Nemertes Research, “More than half of the companies are using or planning to use APIs to embed communications capabilities into their apps, while another 25% are looking at using them to build custom apps. Platforms like Avaya Breeze offer organizations the ability to deliver enhanced business value and execute on their digital strategies by integrating communications and collaboration into workflows, business processes and existing applications.”

These solutions are just the tip of the iceberg for Avaya. We are a long-standing industry standard with a significant global footprint. We are focused on continuing to expand and develop our solutions to meet the needs of our growing global customer base, with more than 5,400 patents awarded and pending, including foreign counterparts.

Our strong service provider and system integration partnerships around the world enable us to meet the needs of a wide variety of organizations, both large and midsize. We’ve received industry recognition for our strong Contact Center integration solutions.

Our continued strength in the industry is evident by our 300,000 customers worldwide. In fact, the top 10 largest banks worldwide are running Avaya solutions and 90% of Fortune 100 companies are Avaya customers.

At Avaya, we are re-imagining the industry, bringing visionary products and solutions to market, and enabling our customers to digitally transform their businesses with ease. I am excited and proud of our ability to continue to evolve, pivot and adapt to the changing business communications world. After all, that is the responsibility of a leader.

Let’s Talk about the Modern Business Ecosystem: Why We Need to Open Up

Forty years ago, technology vendors had it all figured out. They would differentiate themselves by continually bringing new proprietary solutions to market—a recipe for success in an age of closed, hardware-dependent architecture. By exclusively building their own product portfolio under patent or trade-secret protection, companies could easily secure long-term revenue. This proprietary race fueled business for decades, and it still does today. Consider proprietary software solutions from Apple, which have licensing terms that limit usage to only Apple hardware (for example, Mac OS X).

A proprietary model offers several perks, yet not enough in today’s era of digital transformation. Intelligent, connected technologies like IoT, AI and machine learning have ushered enterprises into a new era of any-to-any communication, one filled with seemingly limitless collaboration and CX possibilities. As companies worked to keep up with the rapid pace of innovation, they came to realize that proprietary solutions stifled their efforts to grow and evolve; they could no longer rely on one or more vendors, or on those vendors’ life cycle timelines, to develop the next-gen CX and vertical-specific services they needed.

A Big Change in a Small Amount of Time

Over the course of just a few short years, we saw a massive paradigm shift in which companies began seeking niche vendors to drive revenue and competitiveness. They turned to cloud-based businesses that were born in the digital era. They looked to startups that specialized in vertical-specific strategies. It wasn’t long before the average organization had created a unique, multi-vendor ecosystem in which various solutions were integrated to meet specific customer and vertical requirements. Case in point: the average business now leverages up to six different cloud solutions.

As every market filled with competing vendors, it seemed the most influential players were those that offered engaged, open ecosystems. These vendors allowed customers to freely modify original source code for virtually any purpose, rather than locking it down under restrictive copyrights. With so many companies operating complex, multi-vendor ecosystems, open architecture that enabled collaborative app development became ideal for driving desired customer outcomes. We even see customers acquiring their own technology to accelerate the digitization of their business. You can’t do that in a proprietary, rigid architecture.

Multi-vendor Ecosystem vs. Open Ecosystem

This rise of niche vendors isn’t expected to slow down anytime soon. In fact, Gartner predicts that startups will overtake leaders like Amazon, Google, IBM and Microsoft in markets like AI by 2019. If not properly supported, however, a multi-vendor environment can create infinitely more harm than good.

For starters, companies must secure their multi-vendor ecosystems. Research shows that the average organization’s network is accessed by 89 different vendors and partners per week, a number that should send chills down your spine from a security perspective. If that’s not shocking enough, one-third of companies admit they don’t know how many vendors access their systems at any given time. Despite this, over 70% believe their number of third-party vendors will increase by 2018.

In addition to this is the inherent challenge of seamlessly leveraging multiple different vendor solutions. You see, if these solutions aren’t properly integrated, they don’t represent a truly open ecosystem. To build targeted solutions that continually improve outcomes, companies must be able to seamlessly collect, track, share and use the data that exists across all vendor platforms and knowledge bases. None of these systems can be siloed from one another.

Consider the benefits of an open ecosystem within the transportation industry. Picture this scenario: administrators have taken notice that the 7:45 a.m. train fills up every morning to the point where passengers must wait for the next train. In a truly open ecosystem, management can leverage data collected across various integrated solutions (i.e., ticketing platforms, video surveillance systems, Wi-Fi/carrier grade services, mobile app systems, movement sensors, etc.) to identify the root cause of the issue and begin driving better customer outcomes. Data from the ticketing platform, for instance, may show that tickets purchased for 7:45 a.m. exceed the train’s maximum capacity by 15%.

At this point, management can leverage data in various ways to determine the best solution to the problem. For example, they may want to build a sophisticated level of automation to dynamically change the train schedule, monitoring it for continual improvement. They may choose to send automated SMS messages informing customers of anticipated congestion times and suggested alternatives for work travel, while displaying updated information in real time on their digital signage systems. They could incentivize daily commuters by offering 15% off monthly passes if used for an earlier or later train time. Regardless of how the experience is enhanced, the entire technology ecosystem should be actively working together to make it happen. As I often say, dealing with highway congestion by constantly rebuilding the roads with more lanes is not exactly the smartest approach. Maximizing and optimizing usage through smart traffic distribution and management can prove far more effective, while also improving the citizen experience.
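The first step in that scenario, detecting that the 7:45 a.m. departure is oversold, is a straightforward join of ticketing data against capacity once the systems share their data. A minimal sketch, with hypothetical figures matching the 15% example above:

```python
# Illustrative only: capacity and sales figures are hypothetical,
# standing in for data pulled from integrated ticketing systems.
TRAIN_CAPACITY = 400

ticket_sales = {"07:15": 310, "07:45": 460, "08:15": 280}

def congestion_report(sales: dict, capacity: int) -> dict:
    """Flag departures where tickets sold exceed train capacity."""
    report = {}
    for departure, sold in sales.items():
        overflow = sold - capacity
        if overflow > 0:
            pct = 100 * overflow / capacity
            report[departure] = f"over capacity by {pct:.0f}%"
    return report

report = congestion_report(ticket_sales, TRAIN_CAPACITY)
print(report)  # {'07:45': 'over capacity by 15%'}
```

In an open ecosystem, this report wouldn’t sit in the ticketing silo; it would feed the SMS, signage and scheduling systems described above so that each can act on the same finding.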

The Future of the Customer Experience Relies on Open, Extensible Architecture

The more open a business ecosystem, the more seamlessly data can be leveraged to drive desired customer and citizen outcomes. The ability to track, collect and share data across dispersed systems is what allows companies to create custom solutions that target exact customer requirements. This open, extensible nature is vital within a next-generation platform.

Differentiating oneself is no longer as simple as rolling out a new proprietary solution. To drive desired outcomes and deliver true value, organizations must be open, agile, integrated and future-proof. As the world continues transitioning to an open ecosystem, we move that much closer to eliminating a longstanding dependency on legacy hardware and hierarchical architecture.

So far, I’ve discussed four of five critical components that organizations must start looking at within a next-generation platform: next-gen IT, IoT, AI and the open ecosystem. Up next, we’ll take a deep dive into the final and most significant of these: the customer (or citizen) experience. Stay tuned.