

IoT Creates Big Problems for Big Data

04.25.2014

The Internet of Things (IoT) is here and poised to burst onto the consumer scene. Startups and technological powerhouses alike are pouring their time, creativity, and money into this new market, which relies heavily on real-time data (even DZone has a new IoT Microzone!). But the benefits of IoT come with challenges -- especially for big data.

The sheer volume of data is intimidating when set against modern database capabilities, especially considering that there may be 50 billion IoT objects by 2020. But it's not just the amount of data that will change; the type of data is changing as well. In an interview with Hamish Barwick of CIO, distinguished analyst Joe Skorupa states:

Existing data center wide area networks [WAN] links are sized for the moderate-bandwidth requirements generated by human interactions with applications. IoT promises to dramatically change these patterns by transferring massive amounts of small message sensor data to the data center for processing, dramatically increasing inbound data center bandwidth requirements.
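To get a feel for the scale Skorupa describes, a back-of-envelope calculation helps. The sketch below uses the article's 50-billion-device projection, but the message size and reporting rate are purely illustrative assumptions:

```python
# Back-of-envelope estimate of inbound data center bandwidth from
# small-message sensor traffic. Only the device count comes from the
# article; message size and rate are illustrative assumptions.
DEVICES = 50_000_000_000       # 50 billion IoT objects projected for 2020
MESSAGE_BYTES = 100            # assumed size of one small sensor message
MESSAGES_PER_HOUR = 1          # assumed reporting rate per device

bytes_per_second = DEVICES * MESSAGE_BYTES * MESSAGES_PER_HOUR / 3600
print(f"~{bytes_per_second / 1e9:.1f} GB/s of inbound sensor data")
```

Even at one tiny message per device per hour, the aggregate inbound flow is on the order of gigabytes per second -- traffic patterns WAN links sized for human-driven applications were never designed for.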

And then, of course, there are privacy concerns. Increasingly large amounts of data will be tied to individuals in very personal ways. While that information could potentially save lives, it is frightening to think what could happen if it fell into the wrong hands. In the same interview, Skorupa also says:

The recent trend to centralise applications in order to reduce costs and increase security is incompatible with the IoT. Organisations will be forced to aggregate data in multiple distributed mini data centres where [data] processing can occur. Relevant data will then be forwarded to a central site for additional processing.
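The distributed pattern Skorupa describes can be sketched in a few lines: raw readings are reduced at an edge "mini data center," and only a compact summary is forwarded to the central site. This is a minimal illustration; the function and field names are hypothetical, not from any real IoT platform:

```python
# Edge aggregation sketch: reduce raw sensor readings locally, then
# forward only a small summary record to the central site.
from statistics import mean

def aggregate_at_edge(readings):
    """Reduce raw sensor readings to a compact summary for central processing."""
    values = [r["value"] for r in readings]
    return {
        "sensor_count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": mean(values),
    }

# 1,000 raw readings collapse into one small record before transmission.
raw = [{"sensor_id": i, "value": 20.0 + i} for i in range(1000)]
summary = aggregate_at_edge(raw)
print(summary["sensor_count"], round(summary["mean"], 1))
```

The design choice is the point: processing happens where the data lands, and only "relevant data" -- here, a four-field summary -- travels to the central site for additional processing.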

Dana Sandu also believes that neither IoT nor big data can exist without stream processing, which Sandu defines as:

Stream processing is a technology that allows for the collection, integration, analysis, visualization, and system integration of data, all in real time, as the data is being produced, and without disrupting the activity of existing sources, storage, and enterprise systems.
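The core idea in that definition -- analyzing data as it is produced, without first landing it in storage -- can be illustrated with a toy generator-based pipeline. This is a sketch of the concept only, not a real stream-processing engine:

```python
# Minimal stream-processing sketch: compute a rolling average over an
# unbounded stream of sensor readings, one element at a time, in memory.
from collections import deque

def rolling_average(stream, window=5):
    """Yield a running average over the last `window` readings."""
    buf = deque(maxlen=window)
    for reading in stream:
        buf.append(reading)
        yield sum(buf) / len(buf)

sensor_stream = iter([10, 12, 11, 13, 50, 12])   # 50 might flag an anomaly
averages = list(rolling_average(sensor_stream, window=3))
```

Because each reading is processed as it arrives and then discarded from the window, the pipeline touches existing storage and enterprise systems not at all -- which is exactly the property Sandu's definition emphasizes.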

Sandu's conviction that stream processing is essential to the future of big data and IoT stems from the idea that any IoT company will need a tailor-made data processing strategy.

Thankfully, the interdependencies that make IoT a challenge for big data also ensure that if big data can adapt, it will see a gigantic return on any investment, because:

All of these connected devices will generate unimaginable amounts of data, and all of this data will have to end up passing through data-processing entities.

Check out both of the full articles here and here.

Published at DZone with permission of its author, Sarah Ervin.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)