Tracks: Operations, Scale, Stream

Event-driven Agents with Complex Event Processing in Flink

Session Abstract

Event-driven Agents calling LLMs can be combined with Pattern Recognition and Anomaly Detection in Apache Flink to increase cost efficiency, reduce hallucinations, and enforce predictable, deterministic behavior. In a business process context specifically, this architecture enables continuous real-time process mining.

Session Description

Specialized, event-driven AI Agents, in contrast to planning agents, provide unique value for continuously monitoring real-time event pipelines, business processes, or technical logs in Apache Kafka. Streaming Agents can invoke LLMs directly from Flink for each event, but this approach can be very costly for high-volume Kafka topics and can lead to non-deterministic outcomes.

We showcase how Streaming Agents can be combined with Pattern Recognition and Anomaly Detection in Apache Flink to increase cost efficiency, reduce hallucinations, and enforce predictable, deterministic behavior.

High-volume event pipelines can be filtered very efficiently with Complex Event Processing (CEP), a core Flink library for recognizing patterns across sequences of events, as well as with traditional ML models that use statistical approaches to detect anomalies signaling critical errors and business opportunities.
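To make the filtering step concrete, here is a minimal, self-contained Python sketch of the two techniques named above. It is not Flink CEP itself (in Flink this would be expressed with the CEP `Pattern` API in Java or SQL `MATCH_RECOGNIZE`); the event shape, the "three errors within a window" pattern, and the z-score threshold are illustrative assumptions.

```python
from collections import deque
from statistics import mean, stdev

def match_error_bursts(events, window_s=60, threshold=3):
    """CEP-style sequence pattern (illustrative): emit a match when
    `threshold` ERROR events occur within `window_s` seconds.
    `events` is an iterable of (timestamp, event_type, payload) tuples,
    a hypothetical event shape assumed for this sketch."""
    recent = deque()
    matches = []
    for ts, etype, _payload in events:
        if etype != "ERROR":
            continue
        recent.append(ts)
        # Drop error timestamps that have fallen out of the window.
        while recent and ts - recent[0] > window_s:
            recent.popleft()
        if len(recent) >= threshold:
            matches.append(ts)
            recent.clear()  # count each burst only once
    return matches

def zscore_anomalies(values, z=3.0):
    """Statistical anomaly detection: flag values more than `z`
    standard deviations away from the mean of the sample."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if sigma and abs(v - mu) / sigma > z]
```

Both functions are cheap, deterministic filters: they reduce millions of raw events to a handful of matches, which is exactly the subset worth escalating to a model in the next step.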

Streaming Agents can then invoke LLMs in a second step to classify or further analyze the detected patterns or anomalies, suggesting or triggering actions. Because these tasks are narrow and well-defined, small models often perform well in this context and help achieve deterministic outcomes.
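The two-step pipeline can be sketched as a simple gate: the deterministic filter runs on every event, and only events that pass it incur an LLM call. The `classify_with_llm` function below is a hypothetical stand-in for a real model invocation (in practice an HTTP call to a small, task-specific model); the predicate and event stream are placeholders.

```python
def classify_with_llm(alert: dict) -> str:
    """Hypothetical stub for a call to a small, task-specific model.
    A real implementation would send `alert` to a model endpoint;
    here it returns a fixed label so the sketch is runnable."""
    return "needs_review"

def triage(raw_events, is_suspicious):
    """Two-step pipeline: a cheap deterministic filter first, then
    LLM classification only for the few events that pass it."""
    llm_calls = 0
    results = []
    for event in raw_events:
        if not is_suspicious(event):
            continue  # the vast majority of events never reach the LLM
        llm_calls += 1
        results.append((event, classify_with_llm({"event": event})))
    return results, llm_calls
```

The cost argument falls out directly: if the filter passes one event in a thousand, the pipeline makes one LLM call per thousand events instead of one per event, while every filtering decision stays reproducible.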

Specifically in a business process context, this architecture enables real-time process mining over ERP, manufacturing, supply chain, and financial data to detect process issues and SLA violations earlier, reducing downtime and saving costs by triggering action immediately.
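As one concrete example of such an SLA check, the sketch below correlates start and end steps of a business process per case and flags cases that exceed a deadline. The step names (`order_received`, `order_shipped`) and the 24-hour SLA are assumptions for illustration, not part of any specific ERP schema.

```python
from datetime import datetime, timedelta

def sla_violations(steps, start="order_received", end="order_shipped",
                   sla=timedelta(hours=24)):
    """`steps` is an iterable of (case_id, step_name, timestamp) tuples,
    an assumed shape for process-mining events. Returns the case IDs
    whose `end` step arrived more than `sla` after their `start` step."""
    started = {}
    violations = []
    for case_id, step, ts in steps:
        if step == start:
            started[case_id] = ts
        elif step == end and case_id in started:
            if ts - started[case_id] > sla:
                violations.append(case_id)
    return violations
```

In a streaming deployment this correlation would live in keyed Flink state rather than a local dict, so violations surface the moment the deadline passes instead of in a nightly batch report.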