Secure Data Center at the Speed of Business
Every organization needs visionaries who consider the art of the possible: What will the future look like?
One role every company needs is a Digital Strategist. Self-service has become the norm, from ordering pizza via an app on your phone to depositing a check from home. Companies need a Digital Strategist to determine how they will provide that same self-service to their external customers.
A second role every company needs is a Cloud Strategist, who determines how the IT department will provide self-service to internal customers. The cloud strategy may include private cloud, public cloud, hybrid cloud and multicloud. As more companies hire developers to advance their digital strategy, the lack of a cloud strategy creates a host of problems, both today and in the future.
The first question I ask customers when meeting with them about data center modernization is: What is your cloud strategy? Only one customer has ever told me he had one. He replied, "I give every developer a credit card. They can use any cloud they want." This is the epitome of no cloud strategy. Applications built against one public cloud's services typically must be reworked and rebuilt before they can be moved on-premises or to another public cloud. Containers and microservices largely eliminate this portability problem; however, most organizations I speak with have not widely adopted containers yet. In the race to digitize, we need to slow down and strategize.
Developers tend to use the public cloud because of its self-service capability. A Cloud Strategist must work with the Data Center Architect to develop a modern data center that provides security as well as self-service and self-healing capabilities. A data center with these characteristics is often called a private cloud, and an on-premises private cloud linked to a public cloud is called a hybrid cloud.
One hybrid cloud strategy is to run application workloads in the public cloud while the data they access stays in the private cloud. Another is to run applications within the private cloud for most of the year and draw on additional resources only when they are needed. A prime example is retail during periods of high demand, such as the holiday season: on-premises application workloads are supplemented by the public cloud via cloud bursting, with traffic directed to both the public and private clouds.
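To make cloud bursting concrete, below is a minimal sketch of the routing decision in Python. The capacity figure, threshold and pool names are invented for illustration; in practice this decision lives in a load balancer or global traffic manager, not in application code.

```python
# Minimal cloud-bursting sketch. ON_PREM_CAPACITY, BURST_THRESHOLD and
# the pool names are illustrative assumptions, not from any product.

ON_PREM_CAPACITY = 1000   # requests/sec the private cloud can absorb
BURST_THRESHOLD = 0.85    # overflow to public cloud above 85% utilization

def choose_pool(current_rps: float) -> str:
    """Keep traffic on-premises until utilization nears capacity,
    then direct the overflow ("burst") to the public cloud."""
    utilization = current_rps / ON_PREM_CAPACITY
    return "private-cloud" if utilization < BURST_THRESHOLD else "public-cloud"

if __name__ == "__main__":
    for rps in (300, 800, 870, 1200):   # a normal day vs. a holiday peak
        print(f"{rps:>5} req/s -> {choose_pool(rps)}")
```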
The reality is that it is a multicloud world: most companies use one or more public clouds and are at some stage on the journey to a private cloud.
Six years ago, I began speaking about "The Journey to a Private Cloud," and nearly all customers are still on that journey today, largely because not all of the required technology was mature enough.
WWT has been helping customers put the pieces in place to reach the goal of a private cloud for years.
Data centers have traditionally been very siloed: servers, network and storage were all hardware-based solutions managed separately. In 2001, VMware introduced GSX Server, which abstracted the hardware from the software via a hypervisor, allowing multiple virtual servers to be created on a single physical server. This was a game-changer: servers could now be virtualized, automated and spun up very rapidly.
This was the first step in the journey to a private cloud. The next was the virtualization and automation of storage. Over the past six years, we have also seen the network's control plane abstracted from its data plane via Software-Defined Networking (SDN), which virtualizes and automates the network. Although server, storage and network are now all automated, the approach is still siloed: each silo benefits from automation on its own, and together they form a Software-Defined Data Center.
However, this is not yet a private cloud.
The secret sauce that completes the journey to a private cloud is orchestration. Many products can add the orchestration layer, and WWT can assist you in determining the best solution for your environment. The orchestrator provides a web interface where internal users, such as developers with the appropriate permissions, choose the resources they need. The orchestrator then works behind the scenes to spin up the required server, storage and network resources in a matter of minutes, much like provisioning within a public cloud.
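As a rough illustration of what happens behind that web interface, here is a sketch in Python. The ResourceRequest fields and the provision() steps are generic placeholders; every orchestration product names and implements these differently.

```python
# Conceptual sketch of a self-service orchestration request. All field
# names and provisioning steps here are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class ResourceRequest:
    requester: str        # developer submitting via the web portal
    department: str       # recorded for accountability/chargeback
    vcpus: int
    memory_gb: int
    storage_gb: int
    network_segment: str  # e.g. "dev", "test", "prod"

def provision(req: ResourceRequest) -> dict:
    """Fan one self-service request out to the server, storage and
    network layers, the way an orchestrator does behind the scenes."""
    vm = {"vcpus": req.vcpus, "memory_gb": req.memory_gb}   # server silo
    volume = {"size_gb": req.storage_gb}                    # storage silo
    port = {"segment": req.network_segment}                 # network silo
    return {"owner": req.requester, "vm": vm, "volume": volume, "port": port}

print(provision(ResourceRequest("dev-alice", "retail-apps", 4, 16, 200, "dev")))
```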
Because provisioning flows through a single control point, cost and security are controlled. Another benefit of orchestration software is accountability: orchestration tools can track which departments or individual users are consuming data center infrastructure, providing visibility into the costs incurred by individuals or entire departments.
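A toy showback report makes the accountability point concrete. The usage records and unit rates below are invented; in practice the orchestrator collects these figures automatically.

```python
# Toy showback report: roll per-department usage records up into cost.
# Departments, usage numbers and rates are invented for illustration.

from collections import defaultdict

RATE_PER_VCPU_HOUR = 0.03    # assumed unit costs
RATE_PER_GB_HOUR = 0.001

usage = [  # (department, vcpu_hours, storage_gb_hours)
    ("retail-apps", 1200, 50_000),
    ("analytics",   3000, 200_000),
    ("retail-apps",  400, 10_000),
]

costs = defaultdict(float)
for dept, vcpu_hours, gb_hours in usage:
    costs[dept] += vcpu_hours * RATE_PER_VCPU_HOUR + gb_hours * RATE_PER_GB_HOUR

for dept, cost in sorted(costs.items()):
    print(f"{dept:<12} ${cost:,.2f}")
```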
Newer products also add intelligence the data center has not had before. Elasticity is an important part of a private cloud: server resources that are provisioned but not in use can be returned to the main resource pool to be repurposed. In addition, machine learning allows today's data centers to be self-healing. By learning normal traffic patterns and policies, the platform can treat any deviation from either as a trigger for a self-healing event.
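The self-healing idea reduces to learning a baseline and acting on deviations from it. Here is a minimal sketch; the 3-sigma threshold and the quarantine action are assumptions for illustration, not a description of any specific product.

```python
# Self-healing sketch: learn a baseline of normal traffic, flag large
# deviations, and trigger a remediation hook. The threshold and the
# quarantine action are illustrative assumptions.

from statistics import mean, stdev

baseline = [100, 104, 98, 101, 99, 103, 97, 102]  # learned flows/sec
mu, sigma = mean(baseline), stdev(baseline)

def quarantine_workload(observed: float) -> None:
    print(f"deviation detected ({observed} flows/s); isolating workload")

def check(observed: float) -> None:
    if abs(observed - mu) > 3 * sigma:   # policy: flag 3-sigma deviations
        quarantine_workload(observed)    # the self-healing event
    # otherwise the traffic matches the learned pattern; do nothing

check(101)   # within the normal pattern
check(250)   # triggers the self-healing path
```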
Finally, the data center houses the crown jewels of any organization. It therefore must be protected.
No CISO wants his or her company to be the next one on the evening news. The network is the only place with full visibility into all traffic in the data center, so by placing sensors or agents on the network itself, we gain visibility into that traffic. As hackers become more sophisticated, we must have better tools to protect our data centers.
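As a simple illustration of what that visibility enables, the sketch below checks flow records against a segment allow-list. The flow format and the allow-list entries are assumptions; real sensors export far richer telemetry.

```python
# Illustrative east-west visibility check: flag flows to destinations
# outside an allow-list. Flow format and segments are assumptions.

allowed_segments = {"10.1.0.0/16-app", "10.2.0.0/16-db"}

flows = [
    {"src": "web-01", "dst_segment": "10.2.0.0/16-db",   "bytes": 4_096},
    {"src": "web-01", "dst_segment": "198.51.100.0-ext", "bytes": 900_000},
]

for flow in flows:
    if flow["dst_segment"] not in allowed_segments:
        # a policy deviation the security team wants to see immediately
        print(f"ALERT: {flow['src']} -> {flow['dst_segment']} "
              f"({flow['bytes']} bytes) violates segment policy")
```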
WWT has resources, insights and labs that can help our customers with every step on the journey to a modern, secure, self-service data center. The most important step is the first one: planning.