Many organizations across industries leverage "real-time" analytics to monitor and improve operational performance. In practice, this means they are capturing data in lots from various systems and analyzing it in batches through periodic on-demand queries. By contrast, companies that leverage "streaming analytics" continuously collect and analyze data and automatically course-correct as events unfold - when there's still an opportunity to positively impact the outcome.
When I describe how continuous visibility and real-time course correction work with respect to operational intelligence, I think about how a GPS system works. When you're driving your car, your GPS knows precisely where you are at any given moment and where you are heading. It's also continuously monitoring your progress and making small adjustments to your estimated arrival time. As you approach a plotted turn, it alerts you in real time, tells you how many feet away you are, and then instructs you on which direction to turn. And if you miss the turn, it automatically revises your course to get you back on track.
The value that streaming analytics and continuous monitoring deliver hinges on the ability to anticipate your needs, to recognize when those needs change, and to adjust proactively as they unfold. By contrast, if you're not leveraging streaming analytics and continuous monitoring, it's akin to going online and printing out directions to your destination. Sure, mapping services offer step-by-step directions, plot a course on an actual map, and provide the distance and estimated drive time. But if your course changes or something goes awry, a static sheet of paper isn't terribly useful.
While in this example the GPS represents streaming analytics and continuous monitoring, the static sheet of paper that you print out before beginning your journey is analogous to how batch processing systems work.
With Hadoop-based batch processing platforms, you run periodic queries on data lots, and from these on-demand queries you derive "near real-time" insight. But because the data is examined in intervals rather than continuously, you can't detect changes and problems fast enough to address them before they cycle through and affect the customer and the broader organization. Once you're off track, there's no way to course correct, because your insight will never be current enough to keep up with your reality. Without streaming analytics, there is inherent latency built into the data analysis process.
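The latency gap described above can be made concrete with a small simulation. This is an illustrative sketch, not any vendor's API: the `Event` type, the alert `THRESHOLD`, and the 60-second batch interval are all hypothetical. It compares worst-case detection latency when anomalies are only seen by a periodic query versus when each event is evaluated as it arrives.

```python
from dataclasses import dataclass

@dataclass
class Event:
    ts: float      # seconds since the start of the window
    value: float   # e.g., a fraud-risk score (hypothetical metric)

THRESHOLD = 0.9    # hypothetical alert threshold

def batch_detect(events, interval=60.0):
    """Batch model: events are only examined when the periodic query runs,
    so detection latency is the gap between the event and the next run."""
    latencies = []
    for e in events:
        if e.value > THRESHOLD:
            # The next scheduled query after e.ts lands on a multiple of `interval`.
            next_run = ((e.ts // interval) + 1) * interval
            latencies.append(next_run - e.ts)
    return latencies

def stream_detect(events):
    """Streaming model: each event is evaluated on arrival, so a rule can
    fire immediately (latency ~0, ignoring processing overhead)."""
    return [0.0 for e in events if e.value > THRESHOLD]

events = [Event(5.0, 0.95), Event(75.0, 0.97)]
print(batch_detect(events))   # [55.0, 45.0] - up to a full interval late
print(stream_detect(events))  # [0.0, 0.0]  - caught as they happen
```

The point of the sketch is the shape of the problem, not the numbers: with batch queries, an anomaly can sit undetected for nearly a full polling interval, which is exactly the window in which a fraud attempt or a customer-experience failure does its damage.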
For operational processes that rely on immediate detection and corrective action - customer experience management, fraud detection, and supply chain management, for instance - batch processing simply isn't good enough. Every second of data latency causes a costly ripple effect through the organization and the customer base.
Leveraging streaming data provides you with the insight you need precisely when you need it - when you can still make a difference. By continuously monitoring and making adjustments along the way, you improve the customer experience, prevent fraud and increase supply chain efficiency. This is the heart of the value that streaming analytics delivers with respect to operational intelligence.
About the Author
Dr. Dale Skeen co-founded Vitria with Dr. JoMei Chang in 1994 and oversees the technology direction of the company. Dr. Skeen is credited with inventing distributed publish-subscribe communication, with over a dozen patents in this and related technologies. He has more than 20 years of experience designing and implementing large-scale computing systems in the areas of distributed computing and database systems. Dr. Skeen is recognized as an industry visionary for creating and developing Business Process Integration and Real-Time Business Process Analysis, two of the innovative foundations for Vitria's solutions.
Dr. Skeen is also a prolific author, having contributed to 10 books and written numerous journal articles on distributed computing and integration technologies. Prior to co-founding Vitria, Dr. Skeen was the co-founder of TIBCO Software, where he served as the Chief Scientist. Dr. Skeen has held faculty positions at the University of California, Berkeley as well as Cornell University. Dr. Skeen has a Ph.D. in Computer Science from the University of California, Berkeley.