How to Explain Cloud Projects to a CFO

As tensions in cloud-related discussions at the executive level continue to rise, so does the importance of effective communication. Much of the debate on cloud investments revolves around one topic: OpEx. It’s understandable that many financial leaders are wary of shifting spend to OpEx, but the value of investing in cloud services extends well beyond that line item.

An effective way to bridge this gap is to build a strategic plan, so that you are prepared to let the facts speak for themselves. This approach lets the business value stand on its own while giving equal consideration to the weaknesses and challenges involved. Common ground is also easier to establish when both parties enter the discussion with a clear understanding of the advantages and disadvantages. It’s too easy to let tensions and emotions direct the conversation; instead, present a case grounded in research and thoughtful consideration. The following five tips will help you establish a tested, well-developed plan for cloud implementation.

  1. Gather Research and Data (Know Your Numbers)

    Start by researching case studies that cover TCO (total cost of ownership) and the production costs of comparable applications. Also consider watching demonstrations to learn how the functionality works and how workflows can be implemented; this is practical evidence your company can replicate or build upon.

    To further pique the interest of your CFO, share data that quantifies the ROI your company stands to gain; this will have the greatest impact on the direction of your conversation.
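
    As a simple illustration of “knowing your numbers,” the sketch below compares a multi-year TCO and the resulting ROI. It is a minimal, hypothetical model; every figure and the straight-line cost assumption are placeholders to be replaced with data from your own case studies.

    ```python
    # Hypothetical, simplified TCO/ROI comparison for a single workload.
    # All figures are placeholders; substitute numbers from your own research.

    def total_cost_of_ownership(upfront, annual_run_cost, years):
        """Upfront investment plus recurring run cost over the planning horizon."""
        return upfront + annual_run_cost * years

    def roi_percent(benefit, cost):
        """Return on investment expressed as a percentage of cost."""
        return (benefit - cost) / cost * 100

    YEARS = 3
    on_prem = total_cost_of_ownership(upfront=250_000, annual_run_cost=60_000, years=YEARS)
    cloud = total_cost_of_ownership(upfront=20_000, annual_run_cost=95_000, years=YEARS)

    # Estimated business benefit over the same horizon (faster releases, avoided outages, etc.)
    estimated_benefit = 520_000

    print(f"{YEARS}-year TCO on-premises: ${on_prem:,}")
    print(f"{YEARS}-year TCO cloud:       ${cloud:,}")
    print(f"Cloud ROI over {YEARS} years: {roi_percent(estimated_benefit, cloud):.0f}%")
    ```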

  2. Consider Feasibility

    Gauge the necessity of the cloud products and services under consideration by analyzing the scale of the project. Develop your own internal criteria based on delivery timeframes, budget, global accessibility, and so on. Then compare how your research matches specific project requirements and identify any challenges upfront. Standard guidelines also help you objectively compare applications and ultimately identify the greatest potential benefits. An additional area of consideration is security: controls for access, encryption, and legal compliance, both global and domestic, must be addressed. Although this may seem like a no-brainer, it is often forgotten in the complicated world of cloud considerations.
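
    One lightweight way to keep that comparison objective is a weighted scoring sheet, sketched below. The criteria, weights, and 1–5 scores are purely illustrative assumptions; replace them with your own internal criteria.

    ```python
    # Hypothetical weighted scoring of candidate cloud services against internal criteria.
    # Criteria, weights, and the 1-5 scores are illustrative only.

    criteria_weights = {
        "delivery_timeframe": 0.25,
        "budget_fit": 0.30,
        "global_accessibility": 0.15,
        "security_and_compliance": 0.30,
    }

    candidate_scores = {
        "Provider A": {"delivery_timeframe": 4, "budget_fit": 3,
                       "global_accessibility": 5, "security_and_compliance": 4},
        "Provider B": {"delivery_timeframe": 5, "budget_fit": 4,
                       "global_accessibility": 3, "security_and_compliance": 3},
    }

    def weighted_score(scores, weights):
        """Weighted average on the same 1-5 scale as the raw scores."""
        return sum(scores[criterion] * weight for criterion, weight in weights.items())

    for name, scores in candidate_scores.items():
        print(f"{name}: {weighted_score(scores, criteria_weights):.2f} / 5")
    ```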

    In everyday life it’s easier to see the folly of taking on a big endeavor without a coordinated plan. Imagine preparing for a dinner party without knowing how many guests will attend, when they are coming, or whether they have any dietary restrictions or allergies, and then attempting to cook the meal without a recipe; failure and chaos are likely, if not unavoidable. Luckily, careful preparation makes these mistakes easy to avoid, and the same is true for cloud implementation.

  3. Adopt Standards
    Creating standards is an absolute prerequisite for implementing cloud services, especially when using an agile process; you won’t get the full benefit of cloud without them. Self-service capabilities can be dramatically expanded through the use of standards at every tier of the infrastructure and application development landscape.

    Examples of these standards include operating systems, middleware, communication protocols, storage access, development tools, development processes, coding standards, monitoring, alert plans, scaling practices, and even server hardening practices. Security controls and your corporate business model are standards worth considering as well. If you are planning a private cloud, ideally you would already have standards in place for the server infrastructure, storage, and networking, in addition to the items listed above. The goal of standardization across an environment is to create simplicity and consistency, which drives automation: the foundation of cloud in a service provider (SP)-based or private cloud environment.
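
    To make that concrete, standards are easiest to enforce once they are written down in a machine-checkable form. The baseline below is a hypothetical example, not a recommended set of choices.

    ```python
    # Hypothetical standards baseline expressed as data so deviations can be detected
    # automatically. The specific values are examples only.

    STANDARD_BASELINE = {
        "operating_system": "Ubuntu 22.04 LTS",
        "middleware": "nginx",
        "storage_access": "NFSv4",
        "monitoring_agent_installed": True,
        "server_hardened": True,
    }

    def compliance_gaps(server):
        """Return the baseline items where a server deviates from the standard."""
        return [key for key, expected in STANDARD_BASELINE.items()
                if server.get(key) != expected]

    example_server = {
        "operating_system": "Ubuntu 22.04 LTS",
        "middleware": "apache",              # deviates from the standard
        "storage_access": "NFSv4",
        "monitoring_agent_installed": True,
        "server_hardened": False,            # deviates from the standard
    }

    print("Standards gaps:", compliance_gaps(example_server))
    ```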

  4. Create a Prototype Environment
    This experimental approach provides the opportunity to try before you buy and is certain to impress your CFO. A prototype environment serves as a proof of concept, testing whether the service is technically and operationally feasible. There are two main considerations here.

    First is your ability to create and leverage basic infrastructure as a service (IaaS), whether in your own cloud or that of a service provider. It’s the best way to obtain computing infrastructure without the capital investment. You will be paying for usage on a monthly basis, so ensure it is properly managed and budgets are not exceeded. Again, preparation is key: get ready to tackle this concern head on and create a plan for how you will manage any issues. IaaS can be a great way to start a development process or even set up a production application deployment.

    Next, determine how the prototype will impact your development process. Two important metrics to track are development speed and the overall development-and-test cycle time. These improvements come from leveraging the standards you have adopted and deployed in your cloud environment, and they can be further enhanced by adopting a DevOps model within your development teams and process.
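
    A minimal way to track the second metric is to measure the same cycle time before and after the prototype. The work items and dates below are invented for illustration.

    ```python
    # Minimal sketch: average development-and-test cycle time before vs. after the prototype.
    # Work items and timestamps are made up for illustration.
    from datetime import datetime

    def average_cycle_days(work_items):
        """Average days from work started to change verified in test."""
        durations = [(finished - started).days for started, finished in work_items]
        return sum(durations) / len(durations)

    before_prototype = [
        (datetime(2017, 1, 3), datetime(2017, 1, 20)),
        (datetime(2017, 1, 9), datetime(2017, 2, 1)),
    ]
    after_prototype = [
        (datetime(2017, 5, 2), datetime(2017, 5, 9)),
        (datetime(2017, 5, 8), datetime(2017, 5, 16)),
    ]

    print(f"Cycle time before: {average_cycle_days(before_prototype):.1f} days")
    print(f"Cycle time after:  {average_cycle_days(after_prototype):.1f} days")
    ```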

  5. Think Scalable
    Managing cloud operations is different from rolling out a large, capital-intensive project. Cloud services and features can be added and removed dynamically, and with proper configuration and standards this can be truly elastic. However, you need to manage within an allocation to ensure you do not overconsume resources and create a negative budget impact. The benefit is that you spend only at the level you actually consume, but you need to monitor usage on an ongoing basis so that growing the allocation is a deliberate decision made with proper budget consideration. Cloud cannot be a set-and-forget environment.
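
    One way to keep that monitoring honest is to project month-end spend against the approved allocation and flag overruns early. The figures and the straight-line projection below are illustrative assumptions only.

    ```python
    # Hypothetical guard for elastic cloud spend: flag when projected monthly usage
    # would exceed the approved allocation, so growing it stays a deliberate decision.

    MONTHLY_ALLOCATION = 12_000.00  # approved monthly budget in dollars (example figure)

    def projected_month_end_spend(spend_to_date, day_of_month, days_in_month):
        """Naive straight-line projection from spend so far this month."""
        return spend_to_date / day_of_month * days_in_month

    spend_so_far = 6_800.00
    projection = projected_month_end_spend(spend_so_far, day_of_month=14, days_in_month=30)

    if projection > MONTHLY_ALLOCATION:
        print(f"Review needed: projected ${projection:,.0f} exceeds the ${MONTHLY_ALLOCATION:,.0f} allocation")
    else:
        print(f"On track: projected ${projection:,.0f} is within the ${MONTHLY_ALLOCATION:,.0f} allocation")
    ```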

    Over time, the benefits of cloud investments compound as infrastructure and labor cost savings are realized through automation, workflow, self-service, and more. So it’s important to fully seize the opportunity to communicate this tremendous value by directing the conversation to the facts. If you have given thoughtful consideration to the strengths and weaknesses of these topics, you are in a better position to objectively analyze the full potential of cloud implementation. This knowledge will let you minimize the emotion in the conversation and develop a strong, well-informed position. With these tips in mind, you are fully prepared to put nebulous cloud conversations in the past.

Related Articles:

Wrangling the IoT: The Next-Gen Architecture We’ve All Been Waiting For

Technologies like AI, the IoT, virtual reality, and data analytics are no longer enterprise luxuries but means of survival in an era of rapid digital disruption. They’re transforming traditional processes, redefining roles and responsibilities, and reimagining the customer/brand relationship. Consider that five years from now, more than one-third of the skills needed in today’s workforce will look different because of technological advancement. Three years from now, 100 million consumers are expected to be shopping in virtual reality. Data algorithms are now being used to positively alter the behavior of workers.

These technologies are no longer the basis for science-fiction movies like “The Terminator” or “The Matrix.” They’re here and now. Today, millions of people can watch chatbots argue with each other for entertainment. People are spending days in virtual reality, essentially living in an alternate universe.

Who’s to say that far-reaching movie plots like “Her” and “I, Robot” won’t become reality 30 years from now? We can’t say for sure. One thing we do know, however, is that businesses must transition from legacy, hierarchical architecture to a next-generation platform so they can respond flexibly at today’s speed of digital change.

In a recent blog, I explored five key areas of this next-generation platform that every business must consider: next-gen IT, the IoT, AI/automation, an open ecosystem, and the customer/citizen experience. I tackled the first of these five areas: next-gen IT. Now, let’s explore what businesses should know about a next-generation platform in terms of the IoT.

The Only Way to Bring Legacy into Today’s Next-Gen World of IoT

Capitalizing on the IoT is an exponential challenge when core systems and applications are still running in a legacy-dependent environment. To succeed, companies must bring legacy into today’s next-generation world of IoT—a process with its own set of unique challenges.

For starters, the IoT is a vast and loosely defined concept. Some define the IoT simply as sensor technology; others, as the interworking of various embedded devices that can collect and exchange data. The way I see it, anything that can either connect to a network or provide any sort of service (not just data collection and exchange) should be considered part of the IoT. Because virtually anything can be part of the IoT, it becomes difficult to implement a single solution designed to meet all IoT requirements. This is why we see so many IoT connectivity technologies on the market today (e.g., Bluetooth, Wi-Fi, ZigBee, LPWAN), each supporting a different range of requirements.

Many of these technologies also don’t use IP protocols natively, making them impractical in today’s world of any-to-any communication. With billions of connected devices in use today, companies must migrate away from non-IP technologies toward a converged architecture so they can begin building process workflow automation based on IoT analytics. For example, consider a utility company that can automatically notify customers of the impact of an impending storm based on predictive analytics from sensors deployed throughout its power lines. The provider can then increase the reliability of its services while keeping customers informed of the storm’s severity using real-time data. As you can see, breaking down the silos between various data sets (big data) is the key to building workflows that matter to customers and citizens.
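
To make the utility example a little more concrete, the workflow amounts to: combine sensor readings with a prediction, find the affected customers, and notify them automatically. The sketch below is a deliberately simplified illustration; the threshold, data shapes, and notify function are all invented.

```python
# Hypothetical sketch of an IoT-analytics workflow: predicted line risk from
# power-line sensors triggers proactive customer notifications ahead of a storm.

RISK_THRESHOLD = 0.7  # predicted probability of line damage (illustrative value)

line_forecasts = [
    {"line_id": "L-104", "predicted_outage_risk": 0.82,
     "customers": ["alice@example.com", "bob@example.com"]},
    {"line_id": "L-207", "predicted_outage_risk": 0.35,
     "customers": ["carol@example.com"]},
]

def notify(customer, message):
    # Placeholder for an email/SMS integration.
    print(f"-> {customer}: {message}")

for line in line_forecasts:
    if line["predicted_outage_risk"] >= RISK_THRESHOLD:
        for customer in line["customers"]:
            notify(customer,
                   f"The coming storm may affect power line {line['line_id']} in your area. "
                   "Crews are on standby and we will keep you updated in real time.")
```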

The end goal of the IoT is to create automated (and in many cases data-driven) processes that generate the exact business or customer/citizen outcome you’re looking for. The right technology foundation is essential for turning this goal into a practical reality.

So, what’s the answer? An open, software-enabled, meshed architecture platform. This next-generation platform makes it easy to migrate from legacy architecture to begin securely deploying IoT devices that drive higher levels of efficiency:

  • Open, SDN architecture supports unmatched levels of IoT intelligence. The platform continuously learns and adapts to changing conditions via constantly updated traffic flows. Consider, for example, asset utilization reports that detail up-to-the-minute operational activity, enabling decision makers to change course as needed for continual improvement and cost savings. Meanwhile, an open-source ecosystem offers programmable APIs that allow companies to customize their IoT services and applications to meet their exact needs.
  • End-to-end network segmentation delivers built-in, point-to-point security for up to 168,000 devices that can run on any vendor’s network. This is achieved through three core components—hyper-segmentation, native stealth and automated elasticity—that work in unison to effectively isolate and filter traffic from IoT device to destination. End-to-end network segmentation is inherently designed to secure the IoT ecosystem, and yet only 23% of companies currently have such a solution deployed.
  • An SDN-based IoT controller seamlessly manages the integrated IoT environment. Based on a multi-protocol controller that manages all service modules within the framework, the IoT controller can assign service profiles to open networking adaptors, manage interfaces into SDN program environments, expose north and southbound APIs, and more.
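
To picture how such a controller is driven in practice, the sketch below shows what assigning a service profile through a northbound REST API might look like. The endpoint, payload fields, and token are entirely hypothetical and do not reflect any specific vendor’s API.

```python
# Generic, hypothetical example of a northbound REST call to an SDN/IoT controller.
# The endpoint, payload fields, and token are invented for illustration.
import requests

CONTROLLER_URL = "https://controller.example.local/api/v1/service-profiles/assign"

payload = {
    "device_id": "sensor-00-17-42",
    "service_profile": "building-hvac-telemetry",  # keeps the device in its own segment
    "adaptor": "open-networking-adaptor-3",
}

response = requests.post(
    CONTROLLER_URL,
    json=payload,
    headers={"Authorization": "Bearer <token>"},
    timeout=10,
)
response.raise_for_status()
print("Profile assigned:", response.json())
```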

The fact is this: the IoT is a reality that’s only going to substantially accelerate. Three years from now, it’s expected that companies will be spending up to $2 trillion on IoT devices. Five years from now, analysts predict that the IoT will save consumers and businesses $1 trillion per year. In this same period, though, it’s expected that more than 25% of identified enterprise attacks will involve the IoT. During this time, many businesses will continue to struggle with IoT security and management.

We’re only seeing the beginning of what can be achieved with the IoT, but these possibilities are limited without the right technology foundation. For the last three decades, humans have manually provided the input to generate desired outcomes; digital enterprises now use sensors as the input mechanism, combined with sophisticated automated workflows. Scary, one may say, but nonetheless our reality.

Think about it: does a self-driving car need any input from humans? Not if the vehicle knows your calendar, your destination, and the location of anyone you need to pick up. It will automatically take the preferred route to keep you on time, find the closest parking space (smart parking), and even, if required, let people know you’ve arrived. At this point, humans are simply along for the ride! This is exactly why the right IoT foundation is so critical to digital transformation. It’s imperative that businesses invest in a next-generation platform that can deliver the simplicity needed to connect, secure, and manage the ever-growing number of IoT devices. At the end of the day, a meshed architecture platform represents the best, and arguably the only, way to effectively reduce IoT breaches, rapidly innovate, and improve IT staff efficiency. For businesses with this foundation, the possibilities of the IoT are seemingly endless.

Up next, we’ll be tackling the third key area of a next-generation platform: artificial intelligence/automation. Be sure to check back soon!

The 2020 Network Is Here: Stop Visualizing and Start Deploying

At this point, it’s safe to say you’ve heard of digital transformation and the radical changes it’s driving within the enterprise as we approach the 2020 network. For example, up to 45% of activities that employees are paid to perform can now be automated. Companies are working overtime to identify security solutions that defend against vulnerabilities found in 70% of IoT devices today. The average business now offers customers up to nine engagement channels to be used across a vast array of devices.

Organizational boundaries are blurring. The speed of change has become relentless. Networking as we know it has been redefined. All of this, of course, has significantly changed the role of IT within the modern-day enterprise.

The days of troubleshooting computers and running phone lines are dead and gone. Today, IT represents the foundation for numerous key areas of the business, many of which go far beyond its traditional scope. CIOs are emerging as leaders of customer-facing functions, responsible for driving digital user experiences organization-wide. Business owners are strategically using IT to accelerate their core revenue-generating activities; half of them now collect ideas through business-unit workshops facilitated by IT. Driven by digital transformation, IT has changed to the point of no return.

Digital transformation, however, is far from over. Research makes it clear we’re only getting started. Consider the vast changes expected to occur over the next three years alone. Gartner predicts that by 2020:

  • 100 million consumers will shop in virtual reality
  • 30% of web browsing sessions will be done without a screen
  • Algorithms will positively alter the behavior of billions of global workers

By 2021, 20% of all activities will involve at least one of seven digital giants like Amazon, Facebook, or Apple. By 2022, a blockchain-based business will be worth an estimated $10 billion. It’s clear the potential of trends like the IoT, cloud, big data analytics, and robotics is far from fulfilled and will only accelerate substantially as we move forward.

This all leads to one very important question: what will the network of 2020 look like? This massive, continued change will surely place unprecedented demand on IT infrastructure looking ahead.

Almost a year ago, Zeus Kerravala, principal analyst at ZK Research, aimed to answer this question in an article published on Network World. In it, he outlined key challenges that lie ahead for companies looking to capitalize on digital transformation (e.g., lack of automation, nodal configuration, multicast deployments), as well as what the network of 2020 will look like. Terms like simplified, mobile-centric, enhanced for contextualized customer experiences, and hyper-converged rounded out a comprehensive list of components, all of which are just as valid today as they were this time last year.

Over the last year, however, Avaya has worked to streamline the 2020 network by condensing the technology into five key areas that businesses across every industry must consider: deep and wide automation, improved scalability, built-in security, mesh architecture, and an open network ecosystem.

Five Key Areas of the 2020 Network Every Business Must Know

  1. Deep and wide automation:

    As enterprises start aligning IT around their core business priorities, they must work to support two levels of automation: the first for automating the network architecture itself, and the second for automating various business workflows. The first involves eliminating complex, nodal configurations (traditionally required for service deployment) in order to easily add capacity and scaling capabilities. The second involves adopting a powerful, open workflow engine to increase productivity and overall efficiency. Network and workflow automation are essential for achieving the utmost business success.

  2. Scalability:

    Traditionally, scaling your architecture required replacing your existing nodes with faster ones. In today’s smart, digital world, however, companies must evolve from legacy hierarchical architecture to fabric-based architecture. This move will enable them to add capacity at will and move beyond per-node configuration, much like VMware did for servers with the introduction of server virtualization. The industry needs an end-to-end, simplified, virtualized network.

  3. Built-in security:

    The static configuration of legacy architecture will never offer the level of network security needed today, nor will it support the future of the CX. As such, companies must work to eliminate legacy pitfalls when deploying next-generation architecture. This means sharpening the blurry, gray areas of network security, such as employees’ devices fading in and out of Wi-Fi while roaming in the parking lot. With more connected devices and more ways than ever to compromise them, it’s imperative that the 2020 network deliver any-to-any, end-to-end, built-in security: end-to-end network segmentation complemented by sophisticated authentication, encryption where needed, and real-time threat protection.

  4. Meshed architecture:

    The 2020 network epitomizes freedom of deployment. It means companies can move away from traditional hierarchical deployment and finally mesh their architecture. No more linearly connecting parts. No more limitations of Ethernet loops. A natively meshed architecture will empower organizations with unparalleled resiliency and scalability end to end (not just within the data center). At a university, for example, this means hyper-segmented, end-to-end connectivity across multiple campuses. At a bank, the same connectivity can be deployed between branch sites while taking full advantage of cloud-based services. The user possibilities and business outcomes are seemingly endless.

  5. Open ecosystem:

    We live in a world of software-defined everything: SD-WAN, SD-storage, SD-data center. The fact is that we’re rapidly and inevitably moving toward an open-source ecosystem. To prepare for this reality, businesses must ensure the vendors they invest in offer open APIs. This enables them to truly customize solution features and capabilities to meet their exact business needs. The 2020 network will no longer endorse proprietary systems alone, but businesses need to remain cautious about how to take full advantage of open-source code without increasing business risk through vulnerabilities.

It’s imperative that organizations educate themselves on the 2020 network, not only visualizing it but taking the necessary steps for deployment. The future of networking is here, and it’s going to influence and shape your business. To learn more about these five key areas of the 2020 network, read IDC’s all-new Networks 2020 preparedness report, sponsored by Avaya.

Personalizing the CX Requires Blood, Sweat, Time and Passion

Research undeniably proves that personalization is key for delivering amazing customer experiences. (After all, companies can’t provide just one customer experience—rather, they need to provide ongoing experiences that adapt and evolve as technology and customer needs change.) For example, a recent study found that nearly one third of customers desire higher levels of personalization when shopping. At the same time, 96% of businesses believe that personalization is what influences key purchasing decisions and inspires and strengthens customer loyalty. Personalization done right means customers are with you for the long haul.

Customers are hungry for more personalized experiences, and businesses understand the benefits in providing them. So why is it that 20% of companies have no plans to improve their personalization efforts?

As a consumer, I find this sort of inaction unacceptable. As a business leader, I’m perplexed why any company wouldn’t immediately begin to make the shift. The experiences a company offers its customers are its best chance at substantial differentiation. Differentiation means growth. More importantly, differentiation means survival. Organizations need to make customer experiences more personalized, and they have no time to waste. But this isn’t a simple undertaking. Personalization is more than just a buzzword. It’s a mentality, a company culture, a lifetime commitment. Above all, it’s something that’s expected by consumers today and generations to come.

What is a Personalized Customer Experience?

To deliver the personalization that customers desire, businesses must first understand what this really means. Personalization can be summed up in two words: contextual and predictive. Customers must be served in a way that shows the company already knows who it’s dealing with and how they want to be treated.

Let me give you a personal example to illustrate this. Anyone who knows me knows I love fashion, and I have a favorite retailer. Based on my shopping history and engagement with that brand, the company knows what size I am, what my color palettes are, and what styles most appeal to me. They have every piece of relevant information about me to ensure my experiences are contextual and meaningful. So much so that the company can anticipate which products I’d like and dislike. For instance, they know never to suggest products from St. John (Vince, on the other hand, I’ll go all out for!).

By having this relevant information at the right time and leveraging it the right way, companies can quickly create a contextual experience that’s tailored to their customers’ personalities. At the same time, they’ll be able to increase the revenue they generate. In fact, according to the abovementioned study, nearly 60% of customers who have experienced personalization say it has a notable impact on their purchasing. In my case, this is great for that favorite retailer (and perhaps not so great for my husband!).

The Only Way to Deliver True Personalization: Are You Ready?

The key to delivering this level of personalization is to find the most relevant information about each customer and use it to serve them in a way that resonates with them.

How can businesses find this relevant information? Think of all of the data that exists about you on the web. Every action and transaction you’ve ever made lives online somewhere as part of your digital footprint. The information is out there. Companies need to be able to mine this information in such a way that it makes the customer feel special and attended to. But this can lead to a big problem: having too much information.

This is where the blood, sweat and tears happen. I wish there was a simple way to resolve this issue, but there isn’t. The only way to effectively work through this is to identify how large your customers’ digital footprints are and sift through that data to find what’s most relevant to them. The goal is to build customer profiles that reflect each individual’s preferences, behaviors and habits. After all, what every customer considers relevant is unique to them as an individual.
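
As a minimal sketch of that sifting step, the example below keeps only the interaction types deemed relevant and rolls them into a simple per-customer profile. The event types, fields, and relevance list are invented for illustration.

```python
# Hypothetical sketch: distill a customer's digital footprint into a simple profile
# by keeping only the events the business has decided are relevant.
from collections import Counter

RELEVANT_EVENTS = {"purchase", "wishlist_add", "size_update", "style_quiz"}

footprint = [
    {"customer_id": "c-123", "event": "purchase", "detail": "Vince sweater, size S"},
    {"customer_id": "c-123", "event": "page_view", "detail": "homepage"},  # ignored as noise
    {"customer_id": "c-123", "event": "wishlist_add", "detail": "navy blazer"},
    {"customer_id": "c-123", "event": "style_quiz", "detail": "prefers a neutral palette"},
]

def build_profile(events):
    """Summarize the relevant events into a lightweight customer profile."""
    relevant = [e for e in events if e["event"] in RELEVANT_EVENTS]
    return {
        "customer_id": events[0]["customer_id"],
        "event_counts": Counter(e["event"] for e in relevant),
        "signals": [e["detail"] for e in relevant],
    }

print(build_profile(footprint))
```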

The good news is that there are technologies available to help minimize this grueling process. For example:

A customer engagement solution: But not just any solution. You need a platform that is truly multi-touch, enabling you to easily create, innovate, optimize and future-proof customer experiences. You must find a top-shelf platform with a proven ability to generate customer loyalty, retention, and repeat spending at the individual consumer level. Here are a few tips for finding your best solution—invest in a software-based platform that:

  • Supports easy drag-and-drop visual workflow capabilities
  • Supports multiple customer devices and operating systems
  • Identifies and preserves contextual data from every customer touch point to enrich all future interactions
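
The third capability, carrying contextual data forward, can be pictured as a running context record attached to each customer that every new interaction starts from. This is a minimal, generic sketch; the channels and fields are invented and do not describe any particular platform.

```python
# Hypothetical sketch of preserving context across touch points so each new
# interaction starts with what is already known. Channels and fields are illustrative.

customer_context = {}  # customer_id -> list of prior touch points

def record_touchpoint(customer_id, channel, summary):
    """Append what happened on this channel to the customer's running context."""
    customer_context.setdefault(customer_id, []).append(
        {"channel": channel, "summary": summary}
    )

def context_for_next_interaction(customer_id):
    """Everything an agent or bot should see before the next conversation starts."""
    return customer_context.get(customer_id, [])

record_touchpoint("c-123", "web chat", "Asked about delivery options for order 8841")
record_touchpoint("c-123", "phone", "Changed delivery address; promised a follow-up email")

print(context_for_next_interaction("c-123"))
```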

Analytics: Again, not just any solution will do. You need a platform that will provide a powerful, contextual visualization of the customer journey across all touch points, enabling employees to make real-time decisions that will drive positive business outcomes. My tip for finding your best analytics solution: make sure the platform is truly integrated and that there are no silos. This integration enables businesses to flexibly collect, process, and analyze data across all real-time and historical systems to provide rich data visualization. To learn more about the power of a leading analytics solution, I encourage you to read this blog recently written by Avaya’s David Chavez. In it, he brilliantly breaks down how Avaya’s cloud-based analytics software platform, Fanalytics, transforms fan experiences at smart stadiums.

The goal is to know your customers so well that you can anticipate what they’ll want. Even when customers don’t know what they want, the contextual view you have of them will point to the right suggestions to make. As Steve Jobs once said, “A lot of times, people don’t know what they want until you show it to them.”

Two Things That Must Go Hand in Hand

Leaders in personalization understand the critical role that both technology and personal commitment play in driving success. On one hand, advanced technology helps break down silos, streamline the user experience, and personalize the customer’s journey across every touch point.

At the same time, the way that companies actually use this information is just as important for coming out on top. We must care about our customers. We must be passionate about helping them. We must be their biggest advocates in order for them to become ours.

At the end of the day, customer experiences will always be human experiences. Personalization isn’t something that can be bought. It’s a belief that’s promoted and enacted organization-wide. Companies that have the right technology, supported by this belief, will go far.