Most IT infrastructures have been built around the core principle of centralising data, largely so it can be better protected by the security of a data centre. However, with powerful mobile devices increasingly operating at the edge of your network, greater business demands for speed and productivity, and new digital initiatives, this old-school thinking is being called into question.
Pushing data back and forth over your network naturally creates latency that hampers the performance of applications, especially those harnessing information in real time, which is increasingly the case. Consequently, many businesses are now re-examining the role and architecture of their networks to better serve the operational reality they face and the opportunities they wish to capture.
With this in mind, many businesses are looking to bring the compute and storage responsible for the analysis of their data closer to the point at which it is both created and accessed. It’s a concept known as edge computing. In fact, Gartner estimates that more than 50% of enterprise-generated data will be processed outside of the data centre and the cloud, as more and more businesses look to develop efficient networks with edge computing at the forefront.
What is edge computing?
Edge computing means that data is stored, processed and analysed closer to the point at which people or things need it. For this reason, it involves breaking away from the centralisation of data that most organisations have become accustomed to.
In the world of edge computing there is no need to push all your data to a central location, such as the cloud or a data centre. You can process this data at source, making it more readily available to local devices and applications.
One major driver is the increased adoption of Internet of Things (IoT) devices. These devices are not only collecting, creating and storing data, but are also connected to the internet and use this connection to communicate with other devices. To successfully harness the data these devices hold requires that data be processed and analysed, which in a traditional architecture means transferring it to a central location. With so many devices now looking to send data, bandwidth is squeezed, which creates network latency. Moving the processing of this IoT data closer to the devices themselves, and therefore closer to the network edge, helps to reduce this.
This process can take place on individual devices, which are commonly referred to as edge devices, or on a larger appliance, often called an edge server. Essentially, wherever computing power and storage are moved away from a centralised data centre or public cloud, this can be described as edge computing.
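As a simple illustration of the idea, the sketch below shows how an edge device might aggregate raw sensor readings locally and forward only a compact summary, rather than streaming every sample to a central location. The function name, field names and threshold are invented for this example, not taken from any particular platform.

```python
import statistics

def summarise_readings(readings, threshold=75.0):
    """Aggregate raw sensor samples on the edge device itself,
    so only a small summary payload crosses the network."""
    return {
        "count": len(readings),                          # samples processed locally
        "mean": round(statistics.mean(readings), 2),     # local aggregation
        "max": max(readings),
        "alerts": [r for r in readings if r > threshold] # only anomalies flagged upstream
    }

# Simulated temperature samples collected at the network edge
samples = [70.1, 71.4, 69.8, 76.2, 70.9]
payload = summarise_readings(samples)
print(payload)  # five raw samples reduced to one summary dict
```

Here, five readings are reduced to a single small payload before anything is sent upstream, which is the bandwidth and latency saving that edge computing describes.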
What are the benefits?
By shifting your workloads closer to the network edge, data can be analysed and accessed more quickly, reducing latency and accelerating data streams as a result. Accordingly, real-time data can be accessed more efficiently at source, and the amount of data that does need to be sent to a secondary location is reduced.
For many businesses, there is also a cost saving associated with edge computing. Many early adopters of the public cloud, for example, migrated with the expectation of reduced costs, but expanding capacity and shadow IT purchases have since pushed costs beyond an acceptable level. As a result, many of these businesses are now looking to migrate workloads back out. By shifting the processing of large amounts of data to the edge, they reduce the amount of analysis that needs to take place in the cloud, and can therefore reduce capacity and cost.
Edge computing also offers a high level of network scalability. By shifting compute closer to the perimeter of your network, you can add additional capacity where it is needed without having to make any additions to your wider infrastructure. This not only makes network scaling faster, but also more cost effective.
How can I implement an edge computing model?
As you look to add more devices to the periphery of your network and process your data more efficiently, it’s important that you have a network able to cope with a shift in demand. An edge computing server might need to connect via Wi-Fi, the LAN or the internet, so connectivity options are important. Likewise, your network needs to be smart, agile and scalable, ensuring that the data you collect at source is directed to its final location via the most efficient route possible.
Adopting Software-defined WAN (SD-WAN) is one popular solution. With SD-WAN, management of your network is centralised, shifting the control plane from individual devices to a single software interface that allows you to configure all aspects of your network to implement changes in minutes. This is crucial when more devices (especially critical devices) will be sitting at the network edge, probably away from professional IT resources able to manage them in person. Unifying management also enables you to better identify the best data path for each application, ensuring that latency is reduced and giving priority to the data that needs it.
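To make the idea of application-aware path selection concrete, here is a minimal sketch of the kind of policy logic an SD-WAN controller applies centrally. The link names, metrics and policies are entirely hypothetical, invented for illustration rather than drawn from any vendor's product.

```python
# Hypothetical links with invented metrics (not a real vendor API)
LINKS = {
    "mpls":      {"latency_ms": 20, "cost": "high"},
    "broadband": {"latency_ms": 45, "cost": "low"},
    "lte":       {"latency_ms": 60, "cost": "medium"},
}

# Per-application policies, configured once at the central controller
POLICIES = {
    "voip":   "lowest_latency",  # latency-sensitive traffic gets priority
    "backup": "lowest_cost",     # bulk traffic can tolerate delay
}

COST_RANK = {"low": 0, "medium": 1, "high": 2}

def choose_link(app):
    """Pick the best link for an application according to central policy."""
    policy = POLICIES.get(app, "lowest_cost")
    if policy == "lowest_latency":
        return min(LINKS, key=lambda l: LINKS[l]["latency_ms"])
    return min(LINKS, key=lambda l: COST_RANK[LINKS[l]["cost"]])

print(choose_link("voip"))    # lowest-latency link
print(choose_link("backup"))  # cheapest link
```

The point of the sketch is that the decision lives in one place: changing a policy in the central table immediately changes how traffic is steered at every edge site, without touching individual devices.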
Equally, employing an appropriate network architecture is an important step, and this includes implementing an internet connection and hardware with the capacity to process more data at the edge. This is especially relevant in IoT environments, where most data created by each device will be processed and consumed locally without any communication with the wider network.
Taking your first steps
The balance is unquestionably shifting towards edge computing, as more businesses look to decentralise their data with a view to delivering a more efficient infrastructure that meets the demands of the modern, connected workforce.
However, this can’t be done overnight, so taking initial steps to prepare yourself for a long-term shift to the edge is the best place to start. To learn more about the concept of edge computing, or to discuss your own requirements with one of our expert team, get in touch with us.