Event Stream Processing is constantly growing and, most importantly, is spreading to business cases previously addressed only with batch data processing. Event stream processing has proven essential in three categories of use cases:
- Where performance and throughput are critical factors and decisions must be made within milliseconds: Cybersecurity, Sports Telemetry, Fraud Detection, Safety, Intelligence
- Where computation on event-based data sources (data streams) carries a significant, or even crucial, part of the value delivered by the resulting analytics: IoT-based data sources (Automotive, Industry 4.0, Home Automation applications), Mobile Apps, Clickstreams
- Where the business case involves the transformation, aggregation, or transfer of data residing in heterogeneous sources with serious constraints and limitations, such as mainframes and legacy systems: Mainframe offloading, supply chain integration, core business processes
Combining multiple, heterogeneous data sources is one of the most critical challenges for IT departments. Addressing the "offloading" use case is becoming a priority for companies that want to get the most from their data. A good offloading case includes the following steps (a pipeline sketch follows the list):
- Getting data out of legacy systems in near real-time
- Moving data into a modern, efficient infrastructure
- Processing data in near real-time to enable several business scenarios (among them: customer 360, context-aware promotions, supply chain optimization, etc.)
- Enabling the application of machine learning and AI techniques within the data processing pipeline (e.g. predictive maintenance, smart grid, smart supply chain)
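To make the offloading flow more concrete, the sketch below shows how such a pipeline could look. It is a minimal illustration only: it assumes a Kafka-based infrastructure with Kafka Streams, and the topic names (`legacy-cdc-events`, `offloaded-events`) and the enrichment logic are hypothetical, not taken from the text. The same pattern applies to any comparable streaming platform.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class MainframeOffloadPipeline {

    public static void main(String[] args) {
        // Basic configuration for the stream processing application.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "mainframe-offload");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Change events captured from the legacy system arrive on this topic
        // (hypothetical name), e.g. via a CDC connector.
        KStream<String, String> legacyEvents = builder.stream("legacy-cdc-events");

        // Near-real-time processing step: drop empty records and tag each
        // event with an ingestion timestamp (placeholder enrichment logic).
        KStream<String, String> enriched = legacyEvents
                .filter((key, value) -> value != null && !value.isEmpty())
                .mapValues(value -> "{\"ingested_at\":" + System.currentTimeMillis()
                        + ",\"payload\":" + value + "}");

        // Downstream consumers (customer 360 views, ML scoring jobs, ...)
        // read from this output topic (hypothetical name).
        enriched.to("offloaded-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Ensure a clean shutdown when the application is stopped.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

In this kind of setup, the machine learning and AI steps mentioned above would typically consume the enriched output topic rather than the raw legacy feed, keeping the offloading and analytics stages decoupled.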