Living on the Edge: What You Should Know About Edge Computing

Rick Moore, Director, Cloud Services
February 5, 2019

One of the biggest trends in networking and IT infrastructure of late is edge computing. As more connected IoT devices come online and put greater demands on networks, administrators and other IT staff are struggling to ensure their infrastructure keeps pace. Edge computing has emerged as a potential solution, placing IT resources and applications at the edge of the network, farther away from the traditional centralized core. But is that a strategy worth pursuing?

What is edge computing?

Edge computing is simply a way to make systems and applications more efficient by moving components, data or services out of a centralized core and closer to the logical extreme of the network (the “edge”). In other words, data is processed closer to where it originates (or to its final destination), reducing overall round-trip network latency.

Typically, this architecture calls for computing power, storage and data microservices to be redistributed accordingly. But it’s important to note that, while the performance of certain workloads may be improved by deploying them closer to the edge, others can (and should) remain at the core.
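
To put the latency point in rough numbers, here is a minimal back-of-the-envelope sketch. It models propagation delay only and assumes signals travel through fiber at roughly 200,000 km/s; the 2,000 km “core” and 50 km “edge” distances are purely illustrative, not figures from this article.

    # Back-of-the-envelope propagation delay, edge vs. core.
    # Assumptions (illustrative only): fiber carries signals at roughly
    # 200,000 km/s (~2/3 the speed of light), and the distances below
    # are hypothetical examples rather than measured paths.

    FIBER_SPEED_KM_PER_MS = 200.0  # 200,000 km/s expressed per millisecond

    def round_trip_ms(distance_km: float) -> float:
        """Propagation-only round-trip time for a one-way distance in km."""
        return 2 * distance_km / FIBER_SPEED_KM_PER_MS

    core_km = 2_000  # hypothetical distance to a centralized core data center
    edge_km = 50     # hypothetical distance to a nearby edge site

    print(f"Core ({core_km} km): ~{round_trip_ms(core_km):.1f} ms round trip")
    print(f"Edge ({edge_km} km): ~{round_trip_ms(edge_km):.2f} ms round trip")

Real paths add routing hops, queuing and processing time on top of this, but the proportional gap between a nearby edge site and a distant core persists.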

Location’s role in the network

Location matters, even (and especially, I’d argue) in today’s cloud-dominated world. The recent focus on proximity, along with the greater distribution of services at the edge of a network, is being driven by several key emerging trends, including the Internet of Things (IoT) and its increasingly complex applications.

Gartner forecasts that 20.4 billion connected things will be in use worldwide by 2020.

As the volume of internet-connected devices increases, so too does the need for greater access to edge-based computing, processing power and database resources. In some instances, the benefits of establishing connected endpoints for IoT devices won’t be fully realized unless the underlying infrastructure is located closer to the edge to reduce latency, improve availability and support time-sensitive operations.

Further accelerating this push to add infrastructure at the edge are resource-intensive technologies like Artificial Intelligence (AI), which is already being deployed widely. It’s important to note, however, that even edge-based AI and Machine Learning (ML) applications often require efficient access to core resources in order to function effectively. In this sense, the relationship between edge and core is highly complementary.

Factor in the additional complexity introduced by highly distributed groups of users and partners who need efficient access to sophisticated business tools to complete complex business functions, and it’s easy to see the importance of establishing a clear, thoughtful edge strategy up front.

Potential benefits and drawbacks of edge computing

So why have some organizations embraced an edge-first strategy, while others seem more reluctant to do so? Let’s break down a few of the pros and cons.

Pros:

  • Edge computing can reduce latency, as there is less distance and fewer network points of presence between end users or devices and the location where much of the underlying data is processed and analyzed.
  • As its name implies, edge computing places dedicated resources closer to end users and devices, which can free up core architecture and valuable network assets for other mission-critical tasks.
  • Edge computing adds the potential for improved network resiliency by providing alternate paths for data transmission and communication.
  • Edge computing is often ideal for supporting distributed groups of customers, employees and partners. Instead of establishing remote connections from a centralized location to provide access to data and services, IT staff can place infrastructure at the edge and ensure that globally distributed teams can use the applications and tools they need more efficiently and at lower cost.

Cons:

  • Implementing a thoughtful edge strategy can be expensive and complex. Additional equipment, resources and planning must be considered. Of course, the endgame is to improve efficiency and, in many cases, reduce costs. But a significant upfront investment is often required.
  • There is a skills gap at the edge. Quite simply, many organizations are still learning how to fully utilize the data they capture from, or distribute to, edge locations. And there’s a shortage of developers who can write new apps for emerging edge use cases, including the “killer” apps many businesses hope to develop for true business transformation.
  • Ensuring adequate physical and logical security can also be challenging in a distributed environment, especially when implementing IoT deployments. As more data is processed outside of the core, the risk of a leak, theft or cybersecurity breach rises significantly.
  • There is a growing number of network and service providers operating at the edge. The additional complexity introduced by these varied systems makes it more difficult to maintain appropriate levels of interoperability and integration at various layers of the solution stack.

Edge computing and modern IT networking best practices

4 things to consider when adopting an edge computing strategy

  1. Complete systems. It’s helpful to think in terms of complete systems, not individual endpoints. As the number of IoT devices continues to increase, consider structuring your network architecture so that devices work together as an autonomous system. Systems can then be coordinated, automated and “trained” appropriately.
  2. Mind your edge. As the number of edge computing resources grows, so will the number of data repositories within your ecosystem. It’s important to begin thinking about how to place compute, storage and data elements closer to your end users without sacrificing the inherent benefits of maintaining a centralized core data center (for example, provider interconnection density, power efficiencies and economies of scale); a simplified placement sketch follows this list.
  3. You must account for human supervision. Placing resources at the edge will undoubtedly require some level of intervention and management, at least for the foreseeable future.
  4. There will always be a need for centralization. Centralized data lakes, core infrastructure and deep analytics capabilities are necessary to accomplish many of the “heavy lifting” tasks associated with your workflows. Even today’s most powerful micro data centers in edge locations are not armed with the engineering specifications, sufficient network connectivity or geographic proximity to critical resources that are necessary to take on the full task of executing end-to-end business processes at scale.
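
To tie points 2 and 4 together, here is a deliberately simplified, hypothetical sketch of how a team might triage workloads between the edge and the core. The criteria, thresholds and workload names are illustrative assumptions only, not a prescription from this article.

    # Hypothetical workload-placement triage. The criteria and thresholds are
    # illustrative assumptions, not a definitive placement policy.
    from dataclasses import dataclass

    @dataclass
    class Workload:
        name: str
        max_latency_ms: float     # tightest latency the workload can tolerate
        needs_central_data: bool  # relies on centralized data lakes / deep analytics

    def suggest_placement(w: Workload, core_rtt_ms: float = 20.0) -> str:
        """Suggest 'edge' or 'core' for a workload under the assumptions above."""
        if w.needs_central_data:
            return "core"  # heavy lifting stays near centralized data (point 4)
        if w.max_latency_ms < core_rtt_ms:
            return "edge"  # a core round trip alone would blow the latency budget (point 2)
        return "core"      # otherwise favor the core's interconnection density and scale

    for w in [
        Workload("real-time sensor control", max_latency_ms=5, needs_central_data=False),
        Workload("nightly model training", max_latency_ms=60_000, needs_central_data=True),
    ]:
        print(f"{w.name}: {suggest_placement(w)}")

In practice the decision also weighs security, data volume and cost, but even a crude model like this forces the conversation back to specific workloads rather than edge for its own sake.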

What does the future hold for edge computing?

In 2019 and beyond, what should enterprises expect? And how will these advances impact the network and IT infrastructure?

First, consider the changes in store on the connectivity front. Advances in SDN, 5G and other networking technologies will continually reshape the role of edge computing. When data travels efficiently from edge to core (and back to the edge again), businesses will be able to position their assets where they’re needed most.

Ultimately, most companies will take advantage of AI, ML and many other emerging technologies in the cloud. That will further transform the edge-to-core landscape and enable new customer use cases. The largest cloud providers employ massive resources to build applications and cutting-edge services, and the unique competitive advantages of each provider are becoming more evident as the market matures.

So is edge computing worth the investment?

There’s little sense in investing in edge technology or assets without a specific business purpose in mind; otherwise, you’ll introduce unnecessary complexity and cost into your infrastructure. In some instances, for example, shaving milliseconds off network latency could make the difference between choosing to invest or not. But it’s key to identify specific challenges first, then take a thoughtful approach to solving them, whether at the edge or within the core.

For most companies, even those with distributed IoT footprints, a combination of powerful core data center infrastructure, flexible cloud computing services and edge-based network elements will prove to be the right choice. Adding greater redundancy and scalability to any network is important, but that can often be accomplished with more traditional tools and resources.
