
Why IoT Companies Are Turning to Intelligent Data Distribution

Three core issues—latency, efficiency, and scale—must be addressed in order to solve the distribution dilemma.
By Andréa Skov
Nov 08, 2017

The Internet of Things (IoT) is everywhere. The market is expected to grow from $170.57 billion in 2017 to $561.04 billion by 2022, at a compound annual growth rate (CAGR) of 26.9 percent, according to a June 2017 MarketsandMarkets research report. With growth comes challenges, especially in relation to technology.

According to the report, "The fundamental problem posed by the Internet of Things is that network power remains very centralized. Even in the era of the cloud, when you access data and services online you are most often communicating with relatively few massive datacenters that may or may not be located conveniently close to you."

"That works when you are not accessing a lot of data and when latency is not a problem," the report continues. "However, it does not work in the Internet of Things, where, for example, you are monitoring traffic at every intersection in a city to more intelligently route cars and avoid gridlock. In that instance, if you wait for the data to be sent to a data center hundreds of miles away, processed, and then send commands back to the streetlights, it is too late—the light has already changed."

In addition, the Internet is a congested and often unreliable transport mechanism for time- and business-critical data delivery. Yet, according to IBM, we currently have 1 trillion devices exchanging 2.5 billion gigabytes per day over the Internet. However, computing location is only part of the story. Getting the right data from the right device at the right time is not just about hardware and sensor location—it is about data intelligence.

There are three core issues that must be addressed in order to solve the data-distribution dilemma. The first is latency. Information becomes irrelevant if the data does not reach the user fast enough. Time delays kill revenue opportunities in monetary transactions such as e-gaming and stock trading; they cause losses when, for example, data from sensors on an oil rig is delayed; and patients can die if critical health information is not received in time. Seconds are no good. Data must be sent and received in milliseconds, often in volume and to thousands of recipients.

The second issue is efficiency (i.e., "one pipe in and one pipe out"). The governing calculation is message size × message frequency × number of connections = throughput: how many gigabits of data per second can get through the pipe. Message size is the only factor in this equation that can be reduced, because applications must serve however many messages and connections a given situation requires. If messages are smaller, you can either transmit more messages or hold more connections within the same pipe.
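The trade-off in that calculation can be sketched in a few lines. This is an illustrative back-of-the-envelope helper, not a formula from the article; the function name, parameters, and example figures (512-byte messages, 100 messages per second, 50,000 connections) are assumptions chosen only to show how shrinking message size frees capacity for more messages or connections.

```python
def throughput_gbps(message_size_bytes: int,
                    messages_per_sec: float,
                    connections: int) -> float:
    """Throughput = message size x message frequency x number of connections,
    expressed in gigabits per second (1 byte = 8 bits)."""
    bits_per_sec = message_size_bytes * 8 * messages_per_sec * connections
    return bits_per_sec / 1e9

# Hypothetical workload: 512-byte messages, 100 msgs/sec per connection,
# 50,000 connected devices.
baseline = throughput_gbps(512, 100, 50_000)   # 20.48 Gbps

# Halving the message size halves the load on the same pipe, leaving
# headroom to double either the message rate or the connection count.
compact = throughput_gbps(256, 100, 50_000)    # 10.24 Gbps
```

Because message rate and connection count are dictated by the application, the example makes the article's point concrete: trimming payload size is the one lever left for squeezing more traffic through a fixed pipe.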

This brings us to the third issue, scale. You need to be able to scale over a busy and sometimes unreliable network. Many companies try to shoehorn inefficient enterprise messaging technology into their software, or take open-source components and try to build their own. Unfortunately, these organizations are attacking scalability with traditional techniques and solutions that were not built for the IoT world, and those technologies do not scale.
