Edge Datacenter

What is an Edge Datacenter?

An edge datacenter is located close to devices and end users (i.e., at the network edge) as part of a distributed edge computing infrastructure. Typically featuring a smaller footprint, edge datacenters are designed to process time-sensitive data faster while sending less critical information to a larger, centralized datacenter intended for big data analytics.

How does an edge datacenter work?

The edge datacenter functions as a connection point between multiple networks, acting as an internet exchange point for requesting devices (e.g., mobile phones, laptops, etc.). In essence, edge datacenters become a conduit for multiple network and service providers to access localized compute resources, especially for cloud-driven functions like edge computing and machine learning (ML).
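As a simplified illustration of this routing role, the sketch below picks whichever edge site answers a device's request with the lowest measured round-trip time. The site names and latency probe are hypothetical placeholders, not any provider's actual API:

```python
# Minimal sketch: route a requesting device to the lowest-latency edge site.
# Site names and the latency probe are illustrative stand-ins.
import random
from typing import Dict

EDGE_SITES = ["edge-nyc-01", "edge-chi-02", "edge-dal-03"]

def probe_latency_ms(site: str) -> float:
    """Stand-in for a real latency probe (e.g., a TCP or ICMP round trip)."""
    return random.uniform(3.0, 40.0)

def choose_edge_site(sites=EDGE_SITES) -> str:
    """Return the site with the lowest measured round-trip time."""
    latencies: Dict[str, float] = {site: probe_latency_ms(site) for site in sites}
    print(f"Measured latencies (ms): {latencies}")
    return min(latencies, key=latencies.get)

if __name__ == "__main__":
    print(f"Routing request to: {choose_edge_site()}")
```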

Edge datacenters are located closer to the users and devices that collect and transmit data, or wherever data is being generated. Typically, they are powered by edge caching—hardware- or software-based components that temporarily store data in order to reduce response times. Often, these components appear as micro-datacenters (MDCs), modular systems designed for workloads that run outside the centralized datacenter and can be scaled to specific needs. MDC components can include so-called fog computing, which uses cloud and data storage infrastructure to move data to preferred areas, or mobile edge computing (i.e., cloudlets), small cloud datacenters designed for mobile applications and devices.
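To make the caching idea concrete, here is a minimal sketch of a time-to-live (TTL) edge cache: data recently fetched from the central datacenter is served locally until it expires. The fetch callback, key names, and TTL value are illustrative assumptions, not any specific product's behavior:

```python
# Minimal sketch of a TTL-based edge cache (illustrative only).
import time
from typing import Any, Callable, Dict, Tuple

class EdgeCache:
    """Serve recently fetched data from the edge until its TTL expires."""

    def __init__(self, fetch_from_origin: Callable[[str], Any], ttl_seconds: float = 60.0):
        self._fetch = fetch_from_origin                   # call back to the central datacenter
        self._ttl = ttl_seconds
        self._store: Dict[str, Tuple[float, Any]] = {}    # key -> (expiry time, value)

    def get(self, key: str) -> Any:
        now = time.monotonic()
        entry = self._store.get(key)
        if entry and entry[0] > now:      # cache hit: answer locally with low latency
            return entry[1]
        value = self._fetch(key)          # cache miss: go back to the origin
        self._store[key] = (now + self._ttl, value)
        return value

# Example usage with a stand-in origin fetch:
cache = EdgeCache(fetch_from_origin=lambda key: f"payload-for-{key}", ttl_seconds=30.0)
print(cache.get("video-manifest-123"))   # first call goes to the origin
print(cache.get("video-manifest-123"))   # second call is served from the edge cache
```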

Ultimately, edge datacenters are designed to turn collected data into usable insights, whether that means enabling automated capabilities or processes like cybersecurity and threat analysis, or providing visibility into device and infrastructure performance.

What are the benefits of using an edge datacenter? What are the challenges?

Benefits

The exponential rise of personal devices, streaming services, smart technology, artificial intelligence, and ML is both driving and informing the need for edge networks. Edge datacenters, and by extension edge computing, are a driving force for innovations like 5G and the Internet of Things (IoT). At the edge, closer proximity reduces latency, helping improve the user experience across many technologies, including virtual assistants and self-driving vehicles. For businesses, edge datacenters are a vehicle for increased IT reliability and security. Rather than transmitting data back to a central database for processing, edge datacenters can take over some processing duties, helping free up resources and bandwidth. Lower transmission volume and fewer cloud uploads help improve security; more compute and storage resources at the edge mean more data is kept safe in a physical datacenter. And since edge datacenters are available as a service, companies can lower total cost of ownership with a managed solution.
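As a rough sketch of how an edge site can take over processing and cut upstream traffic, the example below aggregates raw telemetry locally and forwards only a compact summary to the central datacenter. The sensor readings and summary fields are made up for illustration:

```python
# Illustrative sketch: summarize raw telemetry at the edge instead of uploading every reading.
from statistics import mean

def summarize_readings(readings):
    """Collapse a batch of raw sensor readings into one compact record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": round(mean(readings), 2),
    }

raw_readings = [21.4, 21.7, 22.1, 21.9, 22.4, 22.0]   # hypothetical temperature samples
summary = summarize_readings(raw_readings)

# Only the summary (a handful of fields) is sent upstream, not the full stream of samples.
print(f"Forwarding to central datacenter: {summary}")
```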

Challenges

Several inherent advantages of edge datacenters also come with a catch. Since edge datacenters must be close to endpoints, edge infrastructure can quickly become complex, requiring many disparate systems in several geographies, experienced IT teams to oversee them, and comprehensive security to protect them all. In some cases, rapid edge network expansion can add major upfront costs—and potentially more if datacenters are over- or under-provisioned.

Additionally, the low latency that edge datacenters can enable must be maintained to ensure ideal or expected performance, usually between 5 and 20 milliseconds. As more devices join the network, the constant upload and download traffic between endpoints, the cloud, and the datacenter can strain bandwidth and create gaps in performance.
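One simple way to keep an eye on that latency budget is to check observed round-trip times against the 5 to 20 millisecond range mentioned above. The sketch below uses made-up sample measurements and a hypothetical threshold check:

```python
# Sketch: flag latency samples that fall outside the expected 5-20 ms window.
LATENCY_BUDGET_MS = 20.0   # upper bound of the expected-performance range above

def over_budget(samples_ms, budget_ms=LATENCY_BUDGET_MS):
    """Return the samples that exceed the latency budget."""
    return [s for s in samples_ms if s > budget_ms]

observed = [7.2, 11.5, 19.8, 23.4, 9.1, 31.0]   # hypothetical round-trip times in ms
slow = over_budget(observed)
if slow:
    print(f"{len(slow)} of {len(observed)} requests exceeded {LATENCY_BUDGET_MS} ms: {slow}")
```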

How is an edge datacenter different from other datacenters?

The core difference between edge datacenters and enterprise datacenters is location. Enterprise datacenters are almost exclusively deployed on-premises, custom-built for specific organizational purposes and workloads. On the other hand, edge datacenters are located closer to where data is being generated—strategic locations that enable high connectivity rates with minimal latency.

Despite leveraging some of the same capabilities as their larger counterparts, edge datacenters have a much smaller footprint, usually covering a data-dense area with multiple deployments for handling high volumes of data without sending it to the cloud or an enterprise datacenter. Additionally, these installations are newer than legacy enterprise setups, meaning they are better equipped to accommodate IoT devices and high-demand content.

HPE and edge datacenters

HPE is the edge-to-cloud company, offering a robust portfolio of products and services that can help companies drive and support innovation, whether in telecommunications, manufacturing, entertainment, or beyond.

HPE Edge Center, a self-contained, single-cabinet, modular software-defined datacenter, enables any IT environment at the edge. It protects compute, network, and storage infrastructure while providing an industry-first edge management control system that enables edge management automation. The HPE Edge Center provides the flexibility to deploy computing quickly and efficiently, making it an ideal solution for Industrial IoT as well as other enterprise edge or AI workloads.

Elsewhere, HPE Edgeline Converged Edge Systems integrate key open standards-based OT data acquisition and control technologies directly into the enterprise IT system responsible for analytics, delivering fast, simple, and secure convergence between hardware and software components. This convergence of OT and IT capabilities helps reduce the latency between acquiring data, analyzing it, and acting on it, while also offering a smaller footprint. Customers can expect to gain real-time, local decision-making that supports immediate action and autonomous operations, even with unreliable connectivity or no connectivity at all. And world-class security and compliance are maintained at all times.