Leveraging DevOps to Meet the Expectations of IoT-Driven Edge Computing

By Philipp Schöne

Throughout the last few years, DevOps—a set of practices intended to automate processes between software development and IT teams—has emerged as an important discipline for enterprises as they bring a new generation of applications to market. Enterprises have invested heavily in DevOps teams to achieve several goals, including streamlining application development, deploying new applications to production, evaluating public cloud options, moving certain workloads to the cloud, and establishing and optimizing continuous application delivery processes.

It's not an overstatement to say that DevOps teams are already among the busiest within any given IT organization, as they are always on the cutting edge of producing new technologies. So maybe it shouldn't be surprising that the DevOps plate might be getting even fuller with the advent of IoT-driven edge computing. Before we look at how enterprises can leverage those DevOps teams, let's take a look at what exactly we mean by IoT-driven edge computing.

Edge Computing Is Edgy
Edge computing is a fairly new type of network functionality that places computing and storage resources in the connected devices themselves, outside a company's data center or public cloud environment. Any connected component, whether a set-top box, a Wi-Fi access point or a smart meter, can act as a tiny data center at the edge. These small data centers, or edge nodes, also happen to sit very close to users.

This enables a dramatic improvement in customer experience through low-latency interaction with compute and storage resources that are just one hop away from a user. For example, if you are hanging out in your kitchen, your new IoT-enabled fridge, laden with sensors and computing power, is in close proximity and can be leveraged as a small data center, communicating with other systems distributed around your house.

With the Industrial Internet of Things (IIoT) beginning to take off, smart sensors are everywhere and are proliferating like crazy. Manufacturing processes are becoming more and more automated with fewer humans in the loop, and there is more machine-to-machine communication based on event-driven architectures. At the heart of IoT-driven computing are application programming interfaces (APIs), which connect different devices and enable data to flow between them, to the cloud, to users, to manufacturing back-end systems and more. APIs make it possible for microservices-based applications to work together, and edge computing will take managing APIs to the next level.
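The event-driven, machine-to-machine pattern described above can be sketched in a few lines. This is a minimal in-process stand-in for a real message broker (such as an MQTT broker), not a production design; the `EventBus` class, the topic names and the temperature threshold are all hypothetical.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process stand-in for an IoT message broker."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Register a device's handler for events on a topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Deliver the event to every device subscribed to this topic.
        for handler in self._subscribers[topic]:
            handler(payload)

# Hypothetical devices: a temperature sensor publishes readings,
# and a cooling unit reacts only when a threshold is exceeded.
bus = EventBus()
alerts = []
bus.subscribe("sensor/temperature",
              lambda reading: alerts.append(reading) if reading > 30 else None)
bus.publish("sensor/temperature", 25)  # below threshold, no reaction
bus.publish("sensor/temperature", 35)  # triggers the cooling unit's handler
```

In a real deployment the bus would be a networked broker and the handlers would live on separate devices, but the topology, publishers, subscribers and topics, is the same.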

Handle With Care
The promise of edge computing, distributing applications as close to end users as possible in order to improve performance and put otherwise wasted computing power to work, must also come with some "handle with care" instructions. Remember, not so long ago, when IT teams warned that public cloud data centers could never be as secure as on-premises infrastructure, so deploying production applications there would be ill-advised? There were good points to be made then, and public cloud vendors have since spent a lot of time, money and effort ensuring that the cloud is as secure as possible.

Still, a lot of large enterprises will not use the cloud for mission-critical applications. And it isn't as though large enterprises are all doing an amazing job at securing their own data centers, as evidenced by a spate of recent headlines, including the ongoing Equifax saga.

So imagine a scenario in which, instead of a few data centers to worry about, you had thousands, or possibly millions, of edge computing devices connected via APIs, acting as miniaturized data centers distributed around the globe, all communicating with one another and with the mothership data centers the enterprise runs. With edge computing, the surface area for a malicious attack is massively greater than ever before. This is where DevOps teams must take everything they have learned from working with development teams on distributed microservices-based applications and apply it out of the gate when it comes to edge computing.

Giving Up Total Control While Maintaining Control
DevOps teams have spent a lot of effort establishing controls and processes for continuous application deployment, in which development teams are able to work on parts of the application in isolation, and update it whenever new functionality is ready. However, with edge computing, absolute control is not an option, as the applications themselves will be running at the edge, outside of a secure data center.

This is where the DevOps pros must ensure that the APIs that connect the different services are as secure as possible. They must also set up proactive monitoring to ensure that the applications are working as intended and have not been breached. If, for some reason, something goes wrong—and it is almost guaranteed that something will, indeed, go wrong—the team must have a rollback plan in place.
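One common building block for securing edge-to-backend API calls is message authentication, so a compromised node cannot forge traffic from its neighbors. The sketch below uses HMAC-SHA256 request signing with Python's standard library; the shared secret, device name and payload format are hypothetical, and a real system would also provision per-device secrets and rotate them.

```python
import hashlib
import hmac

# Hypothetical per-device secret, provisioned at enrollment time.
SECRET = b"per-device-shared-secret"

def sign(payload: bytes) -> str:
    """Compute an HMAC-SHA256 signature over an API payload."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    """Check a payload against its signature; compare_digest
    avoids leaking information through timing differences."""
    return hmac.compare_digest(sign(payload), signature)

msg = b'{"device": "fridge-42", "temp": 4.1}'
sig = sign(msg)
assert verify(msg, sig)                                  # untampered message passes
assert not verify(b'{"device": "fridge-42", "temp": 99}', sig)  # tampering detected
```

Signing does not replace transport encryption or monitoring, but it gives the backend a cheap, per-message check that a reading really came from the device it claims to.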

For example, let's say your AI-based manufacturing devices are fed malicious algorithms and conspire with one another to launch a bot attack. The DevOps team must have a plan in place to shut down that area of the edge network, and as soon as a patch is developed to address those issues, it must also be able to deploy the fix across all infected machines seamlessly. The scale will be daunting, so testing on a small scale and ramping up will be prudent.
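The "test on a small scale and ramp up" idea can be expressed as a staged (canary) rollout: deploy the fix to a small fraction of the fleet, check health, and only then widen the blast radius. This is a simplified sketch under assumed interfaces, `deploy` and `healthy` are hypothetical callbacks into whatever fleet-management API the enterprise actually uses.

```python
def staged_rollout(devices, deploy, healthy, stages=(0.01, 0.1, 1.0)):
    """Deploy a patch to progressively larger fractions of the fleet.

    Halts immediately if any freshly patched device reports unhealthy,
    leaving the remaining devices untouched (effectively quarantined).
    Returns (succeeded, number_of_devices_patched).
    """
    done = 0
    for fraction in stages:
        target = max(1, int(len(devices) * fraction))
        for device in devices[done:target]:
            deploy(device)
            done += 1
            if not healthy(device):
                return False, done  # abort the ramp-up here
    return True, done

# Simulated fleet: every device accepts the patch and stays healthy.
ok, patched = staged_rollout(list(range(100)),
                             deploy=lambda d: None,
                             healthy=lambda d: True)
```

Real rollouts add soak time between stages and automated rollback of the already-patched canaries, but the shape, small cohort first, health gate, then ramp, is the core of the plan.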

DevOps teams are well compensated, and rightfully so, but the time to ask for a raise may be coming shortly to an IoT-based enterprise near you. After all, DevOps is on the hook for when things go wrong, and, as we have seen in this article, their job is about to become a lot more complex.

Philipp Schöne is a product manager at Axway for the Axway API Management Product. He works closely with customers to help them adopt an "API-first" approach to their integration strategies as they extend the boundaries of their enterprise to incorporate new cloud and mobile channels. With a strong background in API management, architecture and SOA security, he also advises Axway partners and customers on their cloud integration, APIs and identity-management capabilities, in order to advance the development of Axway's API solutions.