
Edge Computing and the Industrial Internet of Things in 2019

What's in store for next year? Here are seven predictions for what to expect.
By Sastry Malladi
Dec 23, 2018

The year 2018 was formative for Industrial Internet of Things (IIoT) technologies and deployments. Looking ahead to 2019, here are seven predictions for the commercial and industrial IoT, along with an outlook on computing size, the value of true edge computing, closed-loop edge-to-cloud machine learning and more.

1. Survival of the smallest: IIoT analytics and machine learning (ML) companies will be heavily measured on how much they can deliver in how little computing.
As IIoT projects extend beyond cloud-centric approaches, the next step in the evolution of artificial intelligence (AI) and the IIoT will address the need to convert algorithms to work at the edge in a dramatically smaller footprint. According to Gartner, 75 percent of enterprise-generated data will be processed at the edge (versus the cloud) within the next four years, up from less than 10 percent today. The move to the edge will be driven not only by the vast increase in data, but also by the need for higher-fidelity analysis, lower latency, better security and substantial cost advantages.

While the cloud is a good place to store data and train machine learning models, it cannot deliver high-fidelity, real-time analysis of streaming data. In contrast, edge technology can analyze all raw data on site, delivering the highest-fidelity analytics and increasing the likelihood of detecting anomalies, which enables immediate reaction. A test of success will be how much computing capability can be delivered in the smallest footprint possible.
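The article does not describe a specific algorithm, but the idea of analyzing raw streams at the edge to catch anomalies immediately can be sketched in a few lines. The following is a minimal, hypothetical example: a rolling z-score detector that flags a sensor reading as anomalous when it deviates sharply from the recent window, all in a footprint small enough for a gateway-class device. The class name, window size and threshold are illustrative assumptions, not part of any vendor's product.

```python
from collections import deque
import math

class EdgeAnomalyDetector:
    """Flags readings that deviate sharply from a rolling window of recent values.

    A hypothetical sketch of edge-side streaming analysis: no cloud round-trip,
    just a fixed-size window held in memory on the device.
    """

    def __init__(self, window=50, threshold=3.0):
        self.window = deque(maxlen=window)  # recent raw readings
        self.threshold = threshold          # z-score cutoff for "anomaly"

    def observe(self, value):
        """Return True if `value` is anomalous relative to the recent window."""
        anomaly = False
        if len(self.window) >= 10:  # require a minimal baseline before flagging
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var)
            if std > 0 and abs(value - mean) / std > self.threshold:
                anomaly = True
        self.window.append(value)
        return anomaly

# Illustrative usage: a stable stream followed by a spike.
detector = EdgeAnomalyDetector(window=50, threshold=3.0)
stream = [10.0, 11.0] * 15 + [100.0]
flags = [detector.observe(v) for v in stream]
```

Because the window and statistics fit in a few kilobytes, this style of analysis can run on the smallest compute footprints the prediction describes, reacting to the spike the moment it arrives rather than after a batch upload.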

2. The market will understand "real" versus "fake" edge solutions.
As with all hot new technologies, the market has run away with the term "edge computing" without clear boundaries around what it constitutes in IIoT deployments. "Fake" edge solutions claim to process data at the edge but actually rely on sending data back to the cloud for batch or micro-batch processing. The fakes can be recognized by the absence of a complex event processor (CEP): without one, latency is higher and the data remains "dirty," making analytics far less accurate and significantly compromising ML models.

"Real" edge intelligence starts with a hyper-efficient CEP that cleanses, normalizes, filters, contextualizes and aligns "dirty" or raw streaming industrial data as it's produced. In addition, a "real" edge solution includes integrated ML and AI capabilities, all embedded into the smallest (and largest) computing footprints. The CEP function should enable real-time, actionable analytics onsite at the industrial edge, with a user experience optimized for fast remediation by operational technology (OT) personnel. It also prepares the data for optimal ML/AI performance, generating the highest-quality predictive insights to drive asset performance and process improvements.
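The CEP functions named above (cleansing, normalizing, filtering and contextualizing raw streaming data before it reaches ML) can be illustrated as a simple generator pipeline. This is a minimal sketch under assumed field names and value ranges; a production CEP would of course be far more capable, but the stages and their order follow the description in the text.

```python
def cleanse(readings):
    """Drop malformed or out-of-range readings (sensor glitches, nulls)."""
    for r in readings:
        if r.get("value") is not None and -50.0 <= r["value"] <= 200.0:
            yield r

def normalize(readings, lo=-50.0, hi=200.0):
    """Scale raw values into [0, 1] so downstream ML sees a consistent range."""
    for r in readings:
        yield {**r, "value": (r["value"] - lo) / (hi - lo)}

def contextualize(readings, asset_metadata):
    """Attach asset context (here, a hypothetical production line ID)."""
    for r in readings:
        yield {**r, "line": asset_metadata.get(r["sensor"], "unknown")}

# Illustrative raw stream with the kinds of defects a CEP must handle.
raw = [
    {"sensor": "pump-1", "value": 72.0},
    {"sensor": "pump-1", "value": None},    # null reading: dropped by cleanse
    {"sensor": "pump-2", "value": 9999.0},  # out of range: dropped by cleanse
    {"sensor": "pump-2", "value": 65.0},
]
meta = {"pump-1": "line-A", "pump-2": "line-B"}
clean = list(contextualize(normalize(cleanse(raw)), meta))
```

Chaining generators this way processes each reading as it is produced rather than in batches, which is the defining trait of a "real" edge pipeline in the article's terms.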

Real edge intelligence can yield enormous cost savings, as well as improved efficiencies and data insights for industrial organizations looking to embark on a true path toward digital transformation.
