
AI drives Data Centre to the edge


By MYBRANDBOOK



Data centers are expanding to the network edge to meet demand from artificial intelligence and other applications that require response times traditional data center architectures cannot deliver.

 

The problem with traditional architectures is their centralized framework. Data often travels hundreds of miles from the edge to the data center, then back again. But AI and other emerging applications, such as the Internet of Things (IoT), cloud-based gaming, and virtual reality, require much lower network delay, known as "latency." That means data center processing must move to the network edge. Edge computing can take place in small data centers, roughly the size of shipping containers, rather than the warehouse-sized edifices that currently power the cloud.
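The distance argument above can be made concrete with back-of-the-envelope arithmetic. The sketch below assumes signals travel through fiber at roughly 200,000 km/s (about two-thirds the speed of light in a vacuum); the distances are illustrative, and real round trips add switching, queuing, and processing time on top of this floor.

```python
# Propagation delay is a hard lower bound on latency: no amount of
# hardware can make data arrive faster than light moves through fiber.
FIBER_KM_PER_MS = 200.0  # ~200 km of fiber traversed per millisecond, one way


def round_trip_ms(distance_km: float) -> float:
    """Ideal round-trip propagation delay over fiber, ignoring
    switching, queuing, and server processing time."""
    return 2 * distance_km / FIBER_KM_PER_MS


# A centralized data center 800 km (~500 miles) away:
print(round(round_trip_ms(800), 1))  # 8.0 ms spent on distance alone
# An edge site 20 km away:
print(round(round_trip_ms(20), 1))   # 0.2 ms
```

For an application budgeting single-digit milliseconds end to end, the centralized round trip consumes the entire budget before any computation happens, which is the case for moving processing to the edge.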

 

Data center operators can still use their traditional facilities, provided they add the fast networks and other hardware and software required to deliver the response times edge applications need.

 

 “The number one thing [driving edge computing] is the amount of data being created outside the data center,” said Patrick Moorhead, President and Principal Analyst of Moor Insights & Strategy. The mini data center remains an emerging market.

 

Edge locations require a new kind of infrastructure. "By no means does it look like a traditional data center. The size is much smaller, with workloads requiring a lot of power density and interconnection density," said Chris Sharp, CTO of Digital Realty.

 

Digital Realty competitor Equinix sees its existing facilities as serving edge needs for its service provider and enterprise customers, said Jim Poole, Equinix Vice President of Global Business Development.

 

While Equinix can achieve low latency over fiber, edge applications require wireless as well, and wireless remains a bottleneck for AI and other emerging edge applications. Current 4G wireless latency is 40 msec at best, and the average is between 60 and 120 msec, Poole said.

 

 

AI driver

AI applications include two primary workloads: training and inferencing. Training is what it sounds like: teaching an AI model how to solve a problem. This process often involves organizing petabytes of data.

 

"Usually you need a lot of compute," said Kaladhar Voruganti, Equinix VP of technology innovation. Training runs on power-hungry GPUs, with each fully loaded rack consuming 30 to 40 kilowatts. Training generally needs to run in a big data center to satisfy power requirements, as well as privacy and regulatory concerns in some applications. Digital Realty has partnered with Nvidia to provide the hardware vendor's GPUs in colocation servers.
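A rough sketch shows how a GPU rack reaches the power density quoted above. The server count, GPU wattage, and overhead figures below are illustrative assumptions, not vendor specifications; the point is only that dense GPU servers multiply out to far more than the 5-10 kW a conventional rack is provisioned for.

```python
# Illustrative rack-power arithmetic; all figures are assumptions.
GPUS_PER_SERVER = 8       # a common training-server configuration
WATTS_PER_GPU = 400       # a high-end training GPU draws roughly this
SERVER_OVERHEAD_W = 2000  # CPUs, memory, fans, power conversion
SERVERS_PER_RACK = 6

rack_watts = SERVERS_PER_RACK * (GPUS_PER_SERVER * WATTS_PER_GPU + SERVER_OVERHEAD_W)
print(rack_watts / 1000)  # 31.2 kW, before cooling overhead
```

That landing inside the quoted 30-40 kW range is why training tends to stay in large facilities engineered for that density.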

 

Once models are trained, the next step is inference, in which the model applies what it learned during training to a production application. "You might train it in the big cloud, and run the application and do the inference right on the factory floor, or Walmart, or the gas station," analyst Patrick Moorhead said.
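The train-in-the-cloud, infer-at-the-edge split Moorhead describes can be sketched in a few lines. The "model" below is a toy least-squares line fit, and all the names are illustrative, not any vendor's API; what matters is the shape of the workflow: training is a heavy, one-time job over the whole dataset, while inference is a cheap per-request step that needs only a small shipped artifact.

```python
# Minimal sketch of the train-centrally / infer-at-the-edge pattern.

def train(samples):
    """Heavy step: fit y ~ w*x + b by least squares over the full
    dataset. In practice this runs once, in the big data center."""
    n = len(samples)
    sx = sum(x for x, _ in samples)
    sy = sum(y for _, y in samples)
    sxx = sum(x * x for x, _ in samples)
    sxy = sum(x * y for x, y in samples)
    w = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - w * sx) / n
    return {"w": w, "b": b}  # small artifact shipped to edge devices


def infer(model, x):
    """Cheap step: run per request on the edge device, near the user."""
    return model["w"] * x + model["b"]


model = train([(1, 2.1), (2, 3.9), (3, 6.0)])  # done once, centrally
print(round(infer(model, 4), 1))               # done constantly, at the edge
```

The asymmetry is the point: only the compact trained model crosses the network, so the latency-sensitive inference path never leaves the edge site.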

 

These sorts of AI applications can be used in a variety of cases. For example, an airline company might use “digital twins” for predictive maintenance. Or, as the economy opens up from the Covid-19 pandemic, a business could use AI to run heat-mapping and facial recognition to identify people entering a facility who might be infected.

 

Other applications requiring edge compute (and frequently using AI) include gaming, IoT, smart factories, shipping and logistics. Additionally, retail technologies require edge computing to deliver needed responsiveness.

 

Finally, hyperscale clouds, enterprise vendors, telecom providers and data center operators all look like winners at the edge. "AWS is the big mothership. It's slowly but surely fielding a credible edge offering," Moorhead said.

 

The public cloud giant unveiled AWS Outposts, a hardware rack running the same infrastructure software AWS runs in its own data centers. On the software side, alternatives include AWS IoT Greengrass, which connects IoT devices to the cloud. Meanwhile, public cloud rival Microsoft provides IoT edge services through its Azure cloud, and VMware also provides edge services. Moorhead said VMware is "surprisingly competitive in this space."

 

Google, the other major public cloud vendor, has been a bit of a laggard, but is stepping up with its Anthos services for distributed cloud applications.

 

Emerging edge data center vendors like Vapor IO also have an opportunity to redefine old technology, Moorhead reckons. "There have been data centers on the edge for 50 years. Any Walmart has a raised floor and a data center. If you go into a gas station or McDonald's they have a server on the wall," he said. "Where Vapor IO is really leaning in is adding compute close to the network, specifically the 5G network."

 

Telco central offices also can be repurposed as mini data centers, creating opportunities for carriers. "A typical neighborhood has a cement bunker with analog lines and a bunch of racks in it," Moorhead said. "They're almost empty now. They have a lot of power. They're industrial strength, literally a cement bunker that would be hard to break into, and they have the power and cooling."

 

"You need a global platform or you will have a hard time being successful. Customers are very cautious about doing deals with point providers in single markets. If you're not truly invested and have the wherewithal to support a global environment, you're not going to win," Sharp said.

Copyright www.mybrandbook.co.in @1999-2024 - All rights reserved.
Reproduction in whole or in part in any form or medium without express written permission of Kalinga Digital Media Pvt. Ltd. is prohibited.