Introduction
The traditional method of analyzing data only after it has been collected and saved is no longer fit for purpose. Today’s technology requires data analysis while it is being created, while it is being transferred from one device to another, and while it is at its destination (i.e. while it is stored). Technology that only analyzes stored data will be two steps behind: by the time it starts, devices with smarter analysis will already have answers.
More data is generated every day, and as more of it is stored, more of it will need to be analyzed. To do this quickly, event stream processing will need to analyze time-based data as it is being created – even at the instant it streams from one device to another.
Traditional analytics applies processing after the data is stored. However, for many situations in the modern world – according to staff writers for IoT Hub – this is far too late. In many instances the data needs to be processed as it occurs. This is the only way to make efficient use of operational insights.
The cost of IoT deployments is also driving the uptake of event stream processing. Smart devices are inherently expensive to make: a ‘smart’ device needs capable chips, and if the intelligence is to sit on the device itself, it also needs batteries to supply the additional power. A way around this is to centralize the intelligence elsewhere, which reduces the cost of the individual devices.
At-the-edge analytics simply means processing data on the same device that generates it. However, this type of analytics operates with only minimal context about the wider world, often within simple rules and statistics such as averages or standard deviations. Simple commands can be executed this way – for example, a thermostat turning something off or on.
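As a rough sketch of the kind of rule that might run at the edge, the Python below keeps a rolling window of a device’s own readings and switches an output on or off when the latest reading drifts too far from the local average. All names and thresholds here are invented for illustration.

    # Hypothetical edge-device rule; names and thresholds are illustrative.
    from collections import deque
    from statistics import mean, stdev

    WINDOW = 30          # number of recent readings the device keeps
    THRESHOLD_SIGMA = 2  # deviations from the average that count as abnormal

    readings = deque(maxlen=WINDOW)

    def on_new_reading(value):
        """Called for each sensor reading produced on the device itself."""
        readings.append(value)
        if len(readings) < WINDOW:
            return "off"  # not enough local history yet to judge
        avg = mean(readings)
        sd = stdev(readings)
        # Simple rule: react only when the reading deviates from the average.
        if sd > 0 and abs(value - avg) > THRESHOLD_SIGMA * sd:
            return "on"   # e.g. switch heating or cooling on
        return "off"

Note that the device decides using only its own recent history – exactly the minimal context described above.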
In-stream analytics, on the other hand, occurs as data streams from one device to another – or comes to a centralized point from multiple sources. This type of analysis combines many different events and formats and relies on a much richer context. It can be used to identify more complex patterns, or even to trigger a desired chain of actions.
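To make that concrete, here is a minimal in-stream sketch, assuming events arrive as dictionaries merged from several devices: it flags a machine whenever a temperature spike and a vibration alert from that same machine arrive within a short window of each other. The field names and the pattern itself are assumptions for the example.

    # Hypothetical in-stream pattern detector; event fields are illustrative.
    def detect_overheat_pattern(event_stream, window_seconds=60):
        """Yield machine ids whenever a temp spike and a vibration alert
        from the same machine arrive within window_seconds of each other."""
        last_seen = {}  # (machine_id, event_type) -> timestamp of last event
        for event in event_stream:
            last_seen[(event["machine_id"], event["type"])] = event["timestamp"]
            # Look for the complementary event type from the same machine.
            other = "vibration_alert" if event["type"] == "temp_spike" else "temp_spike"
            other_ts = last_seen.get((event["machine_id"], other))
            if other_ts is not None and abs(event["timestamp"] - other_ts) <= window_seconds:
                yield event["machine_id"]

    # Usage with a hand-written stream:
    events = [
        {"machine_id": "m1", "type": "temp_spike", "timestamp": 100},
        {"machine_id": "m1", "type": "vibration_alert", "timestamp": 130},
    ]
    print(list(detect_overheat_pattern(events)))  # prints ['m1']

Unlike the edge rule, this detector needs events from more than one source and format to see the pattern at all.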
Finally, at-rest analytics occurs once data has been stored and there is historical data to compare against. This can include saved data from event streams as well as external information. It is the kind of analysis that happens after the event has occurred, and it is most useful for forecasting.
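As a toy illustration of at-rest analysis, the sketch below forecasts the next value in a stored series by averaging the most recent stored readings. The data and the window size are invented; a real deployment would draw on far richer historical and external data.

    # Toy at-rest forecast over stored historical data; values are invented.
    def moving_average_forecast(history, window=7):
        """Forecast the next value as the mean of the last `window` stored values."""
        recent = history[-window:]
        return sum(recent) / len(recent)

    stored_daily_readings = [12.1, 12.4, 13.0, 12.8, 13.5, 13.9, 14.2, 14.0]
    print(moving_average_forecast(stored_daily_readings))  # prints 13.4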
Because at-rest analysis works on large amounts of data, high-performance analytics is required for effective processing. That said, time can still be saved by normalizing and cleansing data while it is in motion, before it reaches storage.
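The sketch below shows the kind of normalization and cleansing that can happen while data is still in motion: malformed records are dropped and mixed temperature units are converted to one scale before anything reaches storage. The record layout and unit codes are assumptions for the example.

    # Hypothetical in-motion cleansing step; record layout is illustrative.
    def cleanse(record_stream):
        """Drop malformed records and normalize temperatures to Celsius
        before the stream reaches storage."""
        for record in record_stream:
            if "value" not in record or "unit" not in record:
                continue  # cleanse: discard malformed records
            value, unit = record["value"], record["unit"]
            if unit == "F":
                value = (value - 32) * 5.0 / 9.0  # normalize Fahrenheit to Celsius
            elif unit != "C":
                continue  # unknown unit: better to drop than to store bad data
            yield {"value": round(value, 2), "unit": "C"}

    raw = [{"value": 72.5, "unit": "F"}, {"value": 21.0, "unit": "C"}, {"bad": True}]
    print(list(cleanse(raw)))  # malformed record dropped, 72.5 F becomes 22.5 C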
All three kinds of analytics can – and should – be used together in a multiphase analytics system to optimize decision-making. Multiphase analytics can analyze data throughout the event spectrum to show what is needed where, when it is needed, and what patterns are emerging for the future.
Summary
· Increasingly, data needs to be processed in real time, not after it has been stored
· In-stream processing is therefore becoming more important to analyze data efficiently
· This is in addition to at-the-edge analysis (analysis on the device creating the data) and at-rest analysis (analysis of stored data)