Making Decisions at Cloud Speed: New Architectures, More Choices
IT users have a growing universe of options to deploy their applications and services. In today’s edition of the DCF Data Center Executive Roundtable, our panel of industry thought leaders examine how cloud and edge architectures are shaping customer deployment options.
Our panelists include Phillip Marangella of EdgeConneX, Shannon Hulbert from Opus Interactive, Schneider Electric’s Steven Carlini, Nancy Novak of Compass Datacenters and Infrastructure Masons, Rob Rockwood from Sabey Data Centers and Brad Furnish of TMGCore. The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier. Here’s today’s discussion:
Data Center Frontier: Cloud, colo, on-premises and edge … deployment options abound. What trends are you seeing in where customers are deploying workloads, and how are these decisions changing?
Nancy Novak, Infrastructure Masons: Perhaps the biggest change we are seeing is that as the decision-making process for adding capacity grows shorter, current supply chain issues are requiring providers to develop more fluid, rather than linear, schedules that incorporate greater customer involvement throughout the process.
We’ve found that by collaborating closely with our supply chain partners we can identify potential issues early in the process and develop alternatives for our end users. Because, let’s face it, in their eyes a “supply chain” problem isn’t a “get out of jail free” card for providers.
On a more macro level, we’re seeing our customers’ longer-range plans under continual revision, and this is a good thing. A pre-planned zone requiring 100MW a year ago is now sliced into smaller components, but with similar capacity requirements.
As for the “edge,” we’ve come to view this term as a misnomer, and I think it has limited end-user thinking regarding the need to quickly add incremental capacity without adding headcount and new capital requirements. Not every new requirement demands a multi-MW facility.
We think that what the market is really looking for is “White Space as a Service,” where the provider takes care of all the normal activities associated with permitting, installation and even ongoing operations, providing end users with a flexible ecosystem that enables them to quickly add capacity where it’s needed (an existing data center, another facility, or a remote location) without disrupting their existing operations.
Shannon Hulbert, Opus Interactive: The biggest trend is that there is no “one size fits all,” and the customer is savvier than they’ve ever been. The industry is maturing. Every business is different, operating with a plethora of applications and systems. Every application has its own requirements for storage, access, processing, and compute. Even microservices within an application are being parsed out to enable individual elements to scale and update separately.
Solutions are evolving at every level of data center and cloud to meet a range of workload criteria. In 2022, 63% of workloads are off-prem, deployed across a landscape of environments by workload type and more. Factors that impact decisions about where workloads reside include:
- Security & Compliance
- Budget
- Performance
- Latency
- Storage access
- Data governance
- Evolution of storage offerings to accommodate hot, warm, and cold access needs, which is driving hybrid storage strategies similar to how cloud offerings are sourced (see the sketch after this list)
- Evolution of service offerings for customer data services at the edge
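To make the hot/warm/cold idea above concrete, here is a minimal, hypothetical sketch of access-based storage tiering. The thresholds, names, and function are illustrative assumptions only, not anything described by the panelists.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical thresholds -- tune per application, cost, and compliance needs.
HOT_WINDOW = timedelta(days=30)    # data touched in the last month stays "hot"
WARM_WINDOW = timedelta(days=180)  # touched in the last six months stays "warm"

def storage_tier(last_access: datetime, now: Optional[datetime] = None) -> str:
    """Classify a dataset into a hot/warm/cold tier by its last access time."""
    now = now or datetime.utcnow()
    age = now - last_access
    if age <= HOT_WINDOW:
        return "hot"    # low-latency, higher-cost storage
    if age <= WARM_WINDOW:
        return "warm"   # infrequent-access storage
    return "cold"       # archival storage

# Example: a dataset last read 90 days ago lands in the "warm" tier.
print(storage_tier(datetime.utcnow() - timedelta(days=90)))
```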
Phillip Marangella, EdgeConneX: One of our guiding principles is Hyperlocal to Hyperscale Data Center Solutions, so we see a whole range of deployment requests from customers. While a lot of our efforts in recent years have been directed toward hyperscale facilities for large service providers, we also see interest in cable landing stations and edge-scale colocation developments, both greenfield in new markets and expansion in established edge markets.
This is all driven by the unique needs of our customers. Some want access to specific network solutions, others need substantial server capacity, still others are looking for proximity to new or fast-growing markets. And it’s important to note that more and more customers are making purchasing decisions based in some measure on sustainability concerns, sometimes even replacing older, less efficient on-prem data centers with cloud or colo deployments.
What’s really new for businesses around the world is wider access to choices, whether that means choosing from an array of major global cloud services or pursuing a more distributed or more centralized data center strategy. We see trends driving our customers into new markets and expanding access to the global digital economy. These customers have different needs in terms of scale, connectivity, and timing, and data center providers have to help them succeed by delivering expertise, innovation, capacity, and, increasingly, sustainable solutions on a global scale.
Steven Carlini, Schneider Electric: Cloud companies have maxed out their organic design and build staff, and are partnering with the intention to grow. We are seeing record-setting capacity deployments, capacity permitting, and absorption rates (sold or leased space divided by available space) in large data centers. The business model in colocation data centers is shifting from building on spec to find an anchor tenant toward building to suit a specific internet giant.
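As a quick illustration of the absorption rate Carlini mentions, here is a minimal sketch; the megawatt figures are invented for the example and are not Schneider Electric data.

```python
# Minimal sketch of the absorption rate described above:
# space sold or leased in a period divided by the space available.
# All figures below are invented for illustration.
available_mw = 120.0  # capacity brought to market (MW)
leased_mw = 96.0      # capacity sold or leased in the same period (MW)

absorption_rate = leased_mw / available_mw
print(f"Absorption rate: {absorption_rate:.0%}")  # Absorption rate: 80%
```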
The days of 100-megawatt and larger single-tenant data centers are coming, and at the same time, we are seeing the extension of the cloud to the local edge and the codification of telco at the edge as well. Other local edge applications growing fast are retail, transportation and manufacturing.
All in all, the industry is changing dramatically as the digital transformation takes hold and larger data needs persist.
“Cloud companies have maxed out their organic design and build staff, and are partnering with the intention to grow.”
– Steven Carlini, Schneider Electric
Brad Furnish, TMGCore: Customers continue to utilize all of these deployment options, and some additional ones not mentioned. Every end user has a legitimate business need that makes each of these deployment options highly sought after.
Cloud makes sense for a lot of people who don’t necessarily have high computing needs, don’t have access to capital or the need to spend it to own their own infrastructure, and/or need to provide a service to millions or billions of people worldwide, where owning data center assets in all of those locations isn’t feasible. Colo continues to be a solid option for companies that need to own their own infrastructure but do not want to own or operate their own buildings, or that need to be in certain geographic regions where land and space are difficult to come by. Additionally, a lot of colo providers are continuing to improve their ability to provide closer proximity to exchange points, increase densities, and support the adoption of newer technologies, including liquid cooling solutions, especially immersion.
On-premises is still a sought-after option because there are several instances where the financial model makes sense based on sheer size, product offerings, capital cash flow business models, and depreciation of assets. Edge is a major trend in the industry! The last couple of years have highlighted the need to have compute, and access to compute, in very nontraditional locations. The increase in remote workforces, the need for remote data collection and processing, the need to reduce latency, business continuity strategies, and the ever-evolving reach of technology are pushing everyone to their “edge.” In addition, a lot of the technologies we as consumers are looking forward to need very dispersed, distributed edge points of presence to operate effectively: think 5G, self-driving vehicles, smart cities, electronic healthcare services, etc.
Overall, we are still seeing a lot of companies and industries looking at a hybrid approach across these deployment options. Since edge has a different definition depending on use cases and verticals, the other deployment options can fit within someone’s definition of an edge strategy. But in a lot of cases customers are deploying workloads throughout each of these options and updating their definition of edge to include putting compute in completely new locations and in new ways (smaller compute nodes, liquid-cooled solutions, different power sources, etc.).
Rob Rockwood, Sabey Data Centers: More than edge or on-premises deployments, customer core deployments are seeking efficient, renewable power sources at scale. Network service providers are increasingly expanding networks to enable points of public cloud access for these deployments.
NEXT: Are the data center and cloud industries making progress on diversity and inclusion?