Keep It Real, Open, and in Real Time!

As the Cloud has gained momentum in 2014, so has crowdsourcing, one of the seven 2014 emerging trends we identified ten months ago. Thanks to gamification, a term aptly coined by two Wired writers back in 2005, crowdsourcing has continued to surge in 2014. Over the last year, for example, nearly 90% of all Avaya customer questions started on the Web, where customers have numerous ways to resolve issues – voice over the web, chat, video conference, and, increasingly, support forums.

This week, we wanted to share insights from a leader of a talented team dedicated to helping guide crowdsourcing in the fast-growing Avaya Support Forums community, now more than 25,000 strong!

Pat Patterson

Let the Game Begin! The 5 Principles of Moderating, Monitoring and Medaling

By Russell Brookes, Avaya Global Support Services


Are you facing a vexing problem and need a quick response? Are you looking for other users with similar solution configurations? These are just a few of the endless opportunities to crowdsource answers to your questions by collaborating with other skilled communications and IT professionals around the world through the recently upgraded Avaya Support Forums. It’s a way for talented and experienced users to build their professional networks and their reputations in the industry.

Avaya Support Forums, available in English, Chinese, Japanese, Portuguese, and Spanish, and publicly indexed by Google, have gained tremendous traffic as end users weigh in on topics spanning 16 forums, including:

  • IP Telephony and Convergence
  • Unified Communications
  • Contact Center Applications
  • Messaging

The forums have worked because of five guiding principles:

  • Keep it real, open, and in real time! Sanitizing too much will dry up forums. Negative and positive views should be posted in real time. One thing that damages a community is posturing that everything is rosy. A heavy-handed approach to sanitizing posts can jeopardize the credibility of discussions and the integrity of all who contribute.
  • By the members, for the members! An active forum is one where the end-user community quickly engages and supports fellow users they have never met but are bonded with through common challenges and opportunities.
  • Easy does it! A moderator team, with a light editing touch, is committed to ensuring that discussions move forward to the benefit of all. “Appropriateness” is the key word.
  • With status comes privilege! As contributors add more value, they should be able to “level up” and gain new privileges, for example:
      • Templates for autosignatures
      • Calendar event posting
      • Uploading animated profile pictures
      • Pictures in signatures
      • Managing their visitor messages
      • Posting polls
      • Emailing members and using friends lists
      • Creating and managing their own social groups and discussion groups
      • Uploading social group icons
  • “Let the light shine on!” The best and brightest come out to be part of the forums, providing help for those in need. As these forums continue to experience double-digit growth, they remain a place for people to shine and be sought out.

These five guiding principles have been the mantra fueling the growth of an Avaya community where customers, partners and employees alike tackle questions from near and far.

Using these principles, leaders have emerged in the Avaya forums. The community doles out “reputation points” as its form of applause, and shields are awarded to users based on those points. Users can move through eight forum shield ranks:

  • Aspiring Member
  • Member
  • Hot Shot
  • Whiz
  • Brainiac
  • Guru
  • Genius
  • Legend

Several customers have risen through the ranks to achieve Guru status with one reaching Genius and Legend statuses.
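For readers curious how such a progression might work under the hood, here is a minimal sketch in Python. The point thresholds are purely hypothetical (the forum’s actual scoring rules aren’t spelled out here), so treat it as an illustration of the idea, not the real system.

```python
# Hypothetical mapping of reputation points to Avaya Support Forums shield ranks.
# The thresholds below are illustrative only; the real forum rules may differ.
SHIELD_RANKS = [
    (0, "Aspiring Member"),
    (50, "Member"),
    (200, "Hot Shot"),
    (500, "Whiz"),
    (1000, "Brainiac"),
    (2500, "Guru"),
    (5000, "Genius"),
    (10000, "Legend"),
]

def shield_rank(reputation_points: int) -> str:
    """Return the highest shield rank whose threshold the user has reached."""
    rank = SHIELD_RANKS[0][1]
    for threshold, name in SHIELD_RANKS:
        if reputation_points >= threshold:
            rank = name
    return rank

if __name__ == "__main__":
    for points in (10, 750, 12000):
        print(points, "->", shield_rank(points))
```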

Using the five principles above, Avaya has grown the Forum to more than 25,000 passionate contributors. Just as new applications and technologies will drive adoption and boundless curiosity, so will user interest continue to grow, along with a surging community just waiting to provide answers!

When was the last time that you got help online?

When was the last time that you “paid it forward” with an answer to an online forum?

Related Articles:

Continuous Learning: Propelling Forward in a Rapidly and Inevitably Changing World

Whether we realize it or not, advanced technologies like artificial intelligence (AI), augmented reality, and the Internet of Things (IoT) have transformed the way we think about the world around us. From how we protect our schools to the way we navigate our streets to how we shop for groceries, such technology now lies at the heart of practically everything we do today.

Just as these technologies have changed the way we live, they have changed the way we work. Today’s rapid pace of innovation has transformed nearly every business task, process, and workflow imaginable—so much so that industry analysts estimate that up to 45% of activities that employees are paid to perform can now be automated.

This digital disruption—or what many are calling the Fourth Industrial Revolution—without question redefines traditional roles and responsibilities. In fact, research shows that within five years, more than one-third of the skills considered important in today’s workforce will have changed. What’s more, analysts estimate that 65% of children today will grow up to work in roles that don’t yet exist.

While we do still see employees that specialize in one skill or expertise, we’ve mostly moved away from the days of hiring an employee for just one job. As technology evolves, so too do the skills required to innovate and propel forward. Looking ahead, employees must have a propensity for continuous learning and adopting new skills to be able to recognize and respond to today’s speed of digital change.

Consider how technology has changed the marketing paradigm. As recently as 10 years ago, marketing platforms like Marketo and HubSpot had only just been founded, Facebook was still in its infancy, and the first iPhone had newly hit the market. As technologies like cloud, social, mobile and big data evolved, however, we suddenly began seeing new tools specifically designed to enhance digital media, social media marketing, and mobile marketing. As a result, companies began searching to fill roles for social media coordinators, digital campaign managers and integrated marketing planners—jobs that were unfathomable 15 to 20 years prior.

Fast forward to today and we’re seeing the emergence of new technology for marketing, such as augmented reality, geofencing, and emotion detection. The continual emergence of new technology perpetually creates skills gaps that must be filled by employees who are passionate, motivated, and invested in their own learning. These kinds of team members are committed to developing new skills and leveraging their strengths to outperform.

But not all employees can easily identify their strengths or develop new skills. This is likely why nearly half of employees today feel unengaged at work, with nearly 20% feeling “actively disengaged.” At the same time, companies are struggling to align employee strengths with organizational priorities. Employees may have certain strengths, but employers may find those skills don’t directly increase operational efficiency or performance. This is why nearly 80% of businesses are more worried about a talent shortage today than they were two years ago.

So, what’s the answer? Employees and employers must work together to identify what roles are currently filled, what skills are still needed, and who best exemplifies those skills. For employees, this means taking control of how they grow their careers and improving for the better. For employers, this means displaying an unwavering commitment to employee reinvestment by understanding key areas of interest to effectively fill skills gaps.

At Avaya, for example, we’re leading an employee enablement program under our Marketing 3.0 strategy. The initiative is designed to help strengthen our marketing organization by equipping employees with the right competencies that reflect our culture, strategy, expectations and market dynamics. By doing so, we can ensure we’re recruiting and managing talent in the most strategic way, putting the right people in the right jobs with the abilities to perform at maximum potential every day. By having each marketing function participate in a simple knowledge profile exercise, we can begin objectively determining development opportunities that best meet their needs and the needs of our business.

As technology continuously evolves, it’s crucial that employees have a propensity for continuous learning and that organizations foster an environment for this learning. In the words of former GE CEO Jack Welch, “An organization’s ability to learn, and translate that learning into action rapidly, is the ultimate competitive advantage.”

We live in a world that is rapidly and inevitably changing. Employees should embrace this change to thrive, and must if they wish to propel business forward. As employers, we are responsible for strategically leveraging our resources to align employee strengths with organizational needs to succeed in this environment of constant change.

Next-Generation IT: What Does It Really Look Like?

From mainframes to virtualization to the IoT, we’ve come a long way in a very short amount of time in terms of networking, operating systems and applications. All this progress has led us to an inflection point of digital business innovation: a critical time in history where, as Gartner puts it best, enterprises must “recognize, prioritize and respond at the speed of digital change.” Despite this, however, many businesses still rely on legacy systems that prevent them from growing and thriving. So, what’s the deal?

I attempted to answer this in a previous blog, where I laid out as completely as I could the evolution of interconnectivity leading up to today. That blog ultimately concluded that we have reached a point where we can finally eliminate dependency on legacy hardware and hierarchical architecture with the use of one single, next-generation software platform. The call for organizations across all industries to migrate from legacy hardware has never been stronger, and the good news is that technology has evolved to a point where they can now effectively do so.

This concept of a “next-generation platform,” however, isn’t as simple as it sounds. Just consider its many variations among industry analysts. McKinsey & Company, for example, refers to this kind of platform as “next-generation infrastructure” (NGI). Gartner, meanwhile, describes it as the “New Digital Platform.” We’re seeing market leaders emphasizing the importance of investing in a next-generation platform, yet many businesses still wonder what the technology actually looks like.

To help make it clearer, Avaya took a comparative look at top analyst definitions and broke them down into five key areas of focus for businesses industry-wide: 

  1. Next-generation IT
  2. The Internet of Things (IoT)
  3. Artificial intelligence (AI)/automation
  4. Open ecosystem
  5. The customer/citizen experience

In a series of upcoming blogs, I’ll be walking through these five pillars of a next-generation platform, outlining what they mean and how they affect businesses across every sector. So, let’s get started with the first of these: next-generation IT.

Simplifying Next-Gen IT

As IT leaders face unrelenting pressure to elevate their infrastructure, next-generation IT has emerged as a way to enable advanced new capabilities and support ever-growing business needs. But what does it consist of? Well, many things. The way we see it, however, next-generation IT is defined by four core elements: secure mobility, any-cloud deployment (more software), omnichannel and big data analytics—all of which are supported by a next-generation platform built on open communications architecture.

Secure mobility: Most digital growth today stems from mobile usage. Just consider that mobile now represents 65% of all digital media time, with the majority of traffic for over 75% of digital content—health information, news, retail, sports—coming from mobile devices. Without question, the ability to deliver a secure mobile customer/citizen experience must be part of every organization’s DNA. This means enabling customers to securely consume mobile services anytime, anywhere and however desired, with no physical connectivity limitations. Whether they’re on a corporate campus connected to a dedicated WLAN, at Starbucks connected to a Wi-Fi hotspot, or on the road paired to a Bluetooth device through cellular connectivity, the connection must always be seamless and secure. Businesses must start intelligently combining carrier wireless technology with next-generation Wi-Fi infrastructure to make service consumption more secure and mobile-minded, with seamless hand-off between the two technologies.

Any-cloud deployment: Consumers should be able to seamlessly deploy any application or service as part of any cloud deployment model (hybrid, public or private). To enable this, businesses must sufficiently meet today’s requirements for any-to-any communication. As I discussed in my previous blog, the days of nodal configuration and virtualization are a thing of the past; any-to-any communications have won the battle. A next-generation platform built on open communications architecture is integrated, agile, and future-proof enough to effectively and securely support a services-based ecosystem. Of course, the transition toward software services is highly desirable, but remember that not all hardware will disappear—though reducing hardware dependency should be considered wherever possible. This services-based design is the underlying force behind many of today’s greatest digital developments (smart cars, smart cities). It’s what allows organizations across every sector to deliver the most value possible to end users.

Omnichannel: All communication and/or collaboration platforms must be omnichannel enabled. This is not to be confused with multi-channel. Whereas the latter represents a siloed, metric-driven approach to service, the former is inherently designed to provide a 360-degree customer view, supporting the foundation of true engagement. An omnichannel approach also supports businesses with the contextual and situational awareness needed to drive anticipatory engagement at the individual account level. This means knowing that a customer has been on your website for the last 15 minutes looking at a specific product of yours, which they inquired about during a live chat session with an agent two weeks ago. This kind of contextual data needs to be brought into the picture to add value and enhance the experience of those you serve, regardless of where the interaction first started.
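To make the 360-degree view a bit more concrete, here is a minimal sketch (with a hypothetical event schema, not any actual Avaya API) of merging interactions from different channels into a single, time-ordered customer context:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Interaction:
    customer_id: str
    channel: str      # "web", "chat", "voice", "email", ...
    timestamp: datetime
    summary: str

def customer_timeline(events: List[Interaction], customer_id: str) -> List[Interaction]:
    """Merge every channel's events for one customer into a single, ordered timeline."""
    return sorted(
        (e for e in events if e.customer_id == customer_id),
        key=lambda e: e.timestamp,
    )

# Example: the same customer browses the site today and chatted two weeks ago.
events = [
    Interaction("cust-42", "chat", datetime(2018, 5, 1, 14, 0), "Asked about product X pricing"),
    Interaction("cust-42", "web", datetime(2018, 5, 15, 9, 45), "Viewing product X page for 15 minutes"),
    Interaction("cust-77", "voice", datetime(2018, 5, 14, 11, 0), "Billing question"),
]

for e in customer_timeline(events, "cust-42"):
    print(e.timestamp, e.channel, "-", e.summary)
```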

Big data analytics: It’s imperative that you strategically use the contextual data within your organization to compete on the customer experience (CX). A huge part of next-generation IT involves seamlessly leveraging multiple databases and analytics capabilities to transform business outcomes (and ultimately, customers’ lives). This means finally breaking down silos to tap into the explosive amount of data—structured and unstructured, historical and real-time—at your disposal. Just as importantly, this means employees being able to openly share, track, and collect data across various teams, processes, and customer touch points. This level of data visibility means a hotel being able to see that a guest’s flight got delayed, enabling the on-duty manager to let that customer know that his or her reservation will be held. It means a bank being able to push out money management tips to a customer after seeing that the individual’s last five interactions were related to account spending.
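As a sketch of the hotel example, assuming hypothetical flight-status and reservation feeds, the correlation itself is just a join on shared context followed by a triggered action:

```python
# Illustrative only: joins a (hypothetical) flight-status feed with a reservation
# system so a delay automatically triggers a "we'll hold your room" notification.
flight_status = {"UA123": "delayed", "DL456": "on_time"}

reservations = [
    {"guest": "J. Smith", "room": "1204", "inbound_flight": "UA123"},
    {"guest": "A. Lee",   "room": "0809", "inbound_flight": "DL456"},
]

def notify(guest: str, message: str) -> None:
    # Stand-in for an SMS/email/push integration.
    print(f"To {guest}: {message}")

for booking in reservations:
    if flight_status.get(booking["inbound_flight"]) == "delayed":
        notify(booking["guest"],
               f"We see your flight is delayed; room {booking['room']} will be held for late arrival.")
```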

These four components are critical to next-generation IT as part of a next-generation digital platform. Organizations must start looking at each of these components if they wish to compete on the CX and respond at the speed of digital change. Stay tuned: next, we’ll be talking about the ever-growing Internet of Things!

How to (Finally) Break the Longstanding Hold of Legacy Technology

Without question, we’ve seen more technological innovation in the last 30 years than in the entire century before. We now live in a reality of seemingly limitless possibilities and outcomes. Today, virtually any object can be considered part of an advanced, interconnected ecosystem. Companies across every sector are competing to reimagine customer engagement. The user experience is fundamentally changing as people, processes and services become more dynamically connected. Today’s smart, digital era represents unmatched opportunity for forward-thinking business leaders everywhere.

At the same time, however, it poses some challenges. Specifically, this rapid pace of innovation means businesses must find a way to quickly and efficiently modernize to competitively differentiate. In a time when digital disruptors are building custom IT environments on the fly, companies can no longer let legacy architecture dampen innovation and agility.

Businesses know this all too well, with 90% of IT decision makers believing that legacy systems prevent them from harnessing the digital technologies they need to grow and thrive. This is especially true in industries like government and finance, where there’s still a heavy dependency on legacy technology. For example, 71% of federal IT decision makers still use old operating systems to run important applications. Meanwhile, 30% of senior investment managers say they’re concerned about the ability of their current legacy systems to meet future regulatory requirements. The list goes on.

It’s clear that something needs to be done here, and fast. So, how exactly did we get to this point of digital disruption, and what can be done about legacy systems today? Let’s take a walk through recent history, and then discuss how companies can begin moving towards digital, next-generation IT.

Data Centralization to Decentralization

Let’s start where applications were first consumed. About 30 to 40 years ago, all application intelligence was centralized (I’m sure some of you remember the good old mainframe days of using dumb terminals or emulators to access applications and store data centrally). There were some notable benefits to centralizing data in this fashion. There weren’t many issues with storage distribution, for instance, and disaster recovery procedures were clearly documented. Security challenges were also practically nonexistent because there wasn’t any local storage on the terminal (hence, dumb).

Soon, however, we saw the rise of the personal computer, which completely changed this model. Computing and storage could now be distributed, allowing local applications to run without any centralized dependency. This was a game-changer that sparked a desktop war between key market players like Microsoft (Windows), IBM (OS/2), and Apple (Mac OS).

This transition to decentralization, however, wasn’t without its challenges. Employees may have gained mobility, but IT began facing new challenges in security and distributed storage. Companies were left wondering how best to control their data storage, especially when confidential information could easily be copied to a floppy disk, a local hard drive and, later, USB drives. This remains a challenge to this day—no one wants to give up their mobility, so companies must find a way to instead regain control.

One thing to note: at this point, commercial off-the-shelf (COTS) servers could now be used. These systems were far less proprietary than previous host systems like mainframes, VAX, etc. However, they were still hardware-dependent, as each platform was usually tailored to the applications it had to run. As a result, a good amount of compute, memory and storage resources went unused; in fact, some services were running at as little as 10-20% of capacity. While COTS servers had their benefits, this underutilization called for a better way to maximize the use of all resources.

The Rise of Virtualization

The only viable solution to these problems was to eliminate the dependency on dedicated hardware in favor of one single software layer. But how? The market experienced profound change as companies strove to answer this question, eventually leading to the emergence of virtualization.

During this time, market leaders like VMware began transforming the industry by allowing multiple virtualized operating systems (virtual machines) to run simultaneously on the same hardware. In this way, applications ran as if they had their own dedicated compute, memory and storage when, in fact, it was all being shared. Simply put, the hardware server had become virtualized. Brilliant!

This allowed companies to create virtual representations of resources such as compute, memory and storage devices. Companies could now run multiple applications over the same physical hardware, in a way that appeared to the applications as though they were running over their own dedicated hardware. More importantly, companies could now fully leverage every single resource at their disposal. Nothing would be left dormant or unused in this virtualized model, unlike what we saw in the past with a dedicated appliance/server per application.
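A rough, back-of-the-envelope way to see the gain: if each dedicated server idles at 10-20% utilization, packing those workloads onto shared hosts reclaims most of the spare capacity. The first-fit placement below is purely illustrative (real hypervisors and capacity planners are far more sophisticated):

```python
# Illustrative first-fit consolidation: pack per-application workloads
# (sized by their actual utilization) onto shared virtualization hosts.
workloads = [15, 10, 20, 12, 18, 8, 14, 16]   # % of one server each app really uses
host_capacity = 80                             # target: keep each host below 80% utilization

hosts = []  # each entry is the summed utilization on one physical host
for load in workloads:
    for i, used in enumerate(hosts):
        if used + load <= host_capacity:
            hosts[i] += load
            break
    else:
        hosts.append(load)  # no host has room; bring another one online

print(f"{len(workloads)} dedicated servers -> {len(hosts)} virtualized hosts: {hosts}")
# In this example, 8 dedicated servers collapse onto 2 shared hosts.
```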

At this point, it was a no-brainer to move into the virtualized application world. However, the ugly truth remained: we were still using a legacy networking framework. Many continue to refer to this as client-server, but the bottom line is that it was a hierarchical model that required each node and link to be configured to carry or simulate end-to-end virtualization. Even though the application environment was virtualized, the infrastructure on which it ran was not built with that in mind. It didn’t matter if you were using VLANs, VRFs or even MPLS—it was a complex way of providing end-to-end virtualized services.

Who would finally be able to solve this issue? It seemed the Institute of Electrical and Electronics Engineers (IEEE) and the Internet Engineering Task Force (IETF) were on the right track with the standardization of an Ethernet protocol that allows end-to-end services virtualization, which finally took place in May 2012. This is known as Shortest Path Bridging, or SPB (IEEE 802.1aq and IETF RFC 6329, for those interested). And there you have it: servers, applications and networks are now finally virtualized! Are we done? Well, not quite: even desktops are being virtualized, as Virtual Desktop Infrastructure (VDI), to re-centralize control.
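A simple way to appreciate the operational difference: in the legacy hop-by-hop model, every switch on the path between two sites must be touched to extend a virtualized service, whereas with SPB only the edge switches where the service attaches need an I-SID mapping, and the fabric’s IS-IS control plane computes the paths. The sketch below just counts those provisioning touch points under assumed path lengths; it is not vendor CLI:

```python
# Compare provisioning "touch points" for extending one virtualized service
# between two edge switches, under hypothetical topology sizes.
def legacy_touch_points(switches_on_path: int) -> int:
    # VLAN/VRF-style provisioning: every node (and often every link) along
    # the path needs per-service configuration.
    return switches_on_path

def spb_touch_points() -> int:
    # SPB (IEEE 802.1aq / RFC 6329): map the service to an I-SID at the two
    # attachment edges only; the fabric's IS-IS control plane does the rest.
    return 2

for path_len in (4, 8, 12):
    print(f"path of {path_len} switches: legacy={legacy_touch_points(path_len)}, "
          f"SPB={spb_touch_points()}")
```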

Overall, virtualization became the de facto model that allowed businesses to run applications on what we know as the Cloud. With private and public models, customers could now choose which assets they wanted to own (that is, manage on premises) and which to have hosted in the public cloud. Soon, however, the challenge became how to run apps in these clouds. Companies quickly discovered the need to keep some workloads (like regulatory and compliance data) in an onsite private cloud, while other data was best suited for the public cloud. This is how the hybrid cloud deployment model was born.

Cloud Elasticity

Hybrid cloud allowed companies to operate in an environment that strategically utilized the best of both worlds—both on-premises private cloud and third-party public cloud services—to meet their core objectives. In this new world of cloud orchestration, we saw the rise of digital giants like Amazon, Google and Facebook. With a high level of cloud elasticity, providers could now spin up a series of virtual applications or services in less than an hour and run them in the public cloud. This blew the doors of opportunity wide open for companies everywhere. These providers allowed organizations to create new instances on the fly and shut them down just as quickly, for example to soft-launch new products or to test-drive business in new marketplaces.
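To illustrate how quickly such instances can come and go today, here is a minimal sketch using the AWS boto3 SDK. The region, AMI ID and instance type are placeholders, and credentials and error handling are omitted:

```python
import boto3

# Placeholders: substitute a real region, AMI ID, and instance type for your account.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Spin up a throwaway instance (e.g., to soft-launch or load-test a new service)...
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]

# ...wait until it is running...
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
print(f"{instance_id} is running")

# ...and tear it down just as quickly when the experiment is over.
ec2.terminate_instances(InstanceIds=[instance_id])
```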

But let’s not forget the issue that remains to this day: we have yet to completely move away from all aging hardware. In today’s world of any-to-any communication, driven by technologies like the IoT, artificial intelligence, and machine learning, legacy hardware and hierarchical networking architecture are not just an inconvenience. They can break your business if you don’t have a strategy to reduce that dependency.

Finally Breaking Free of Hardware

The bottom line is that any-to-any communications have won the battle (unlike 15 years ago, when IT largely resisted and essentially shut down the peer-to-peer model). As a result, what many refer to as “meshed communication architecture” has emerged as the newest and strongest-yet approach to network design.

This kind of architecture is integrated, agile and future-proof enough to effectively and securely support a services-based ecosystem. The days of nodal configuration and virtualization are a thing of the past. It’s vital that companies move to this services-based architecture to be able to support the future of the customer experience. Consider how it’s essential for supporting smart cars that can autonomously park and change lanes, while being redirected to alternate routes because of traffic congestion. It’s critical for supporting smart home solutions that enable homeowners to remotely manage utility usage. It’s crucial for delivering the most value possible to those who matter most: end-users.

For decades, we’ve been trying to eliminate a primal dependency on hardware. To finally break the silos associated with hardware, companies must begin setting themselves up to support any-to-any communication. In this environment, all services can virtually run anywhere across multiple sources of hardware that can be geographically dispersed.

Now that we know what can be done about legacy systems (transition to an open, software-enabled, meshed architecture), let’s discuss how companies can successfully integrate digital into their existing environment to transform business. Stay tuned for more.