Keywords: data stream analytics, continuous and automatic knowledge creation, anomaly detection, data quality.
We are deluged by data on an unprecedented scale, from telemetry, sensor networks like the Internet of Things (IoT), virtual reality applications and other data streams, in addition to the already daunting amounts of data from traditional data collection mechanisms. Automation and continuous adaptation are crucial for creating meaningful data “wisdom” in the form of statistical models, signatures and rules based on such dynamic, highly volatile data streams. Creating reliable and accurate knowledge and wisdom from data streams requires monitoring the data continuously for gaps, bumps, inconsistencies and other quality issues. Frequently, the tools and methods used for quality monitoring are similar to those used for wisdom and knowledge creation, e.g., anomaly detection methods. In this talk, we will give an overview of anomaly management as an end-to-end process that is intertwined with the wisdom and knowledge creation process, and discuss different methods, algorithms and tools for defining, detecting, explaining, prioritizing and remedying anomalies in a dynamic stream environment. We will illustrate with examples drawn from real-world data.
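To make the idea of continuous anomaly detection over a stream concrete, the following is a minimal sketch (not from the talk) of one common baseline technique: a sliding-window z-score detector that flags a point when it deviates from the recent window mean by more than a chosen number of standard deviations. The class name, window size and threshold are illustrative assumptions, not a method proposed by the speakers.

```python
from collections import deque
import math

class StreamingZScoreDetector:
    """Flags stream values whose z-score over a sliding window exceeds a threshold.

    Illustrative baseline only: window size and threshold are arbitrary choices.
    """

    def __init__(self, window_size=50, threshold=4.0):
        self.window = deque(maxlen=window_size)  # recent values only
        self.threshold = threshold

    def update(self, x):
        """Return True if x is anomalous relative to the current window, then add x."""
        is_anomaly = False
        if len(self.window) >= 2:
            mean = sum(self.window) / len(self.window)
            # Sample variance over the window (Bessel-corrected).
            var = sum((v - mean) ** 2 for v in self.window) / (len(self.window) - 1)
            std = math.sqrt(var)
            is_anomaly = std > 0 and abs(x - mean) / std > self.threshold
        self.window.append(x)
        return is_anomaly

# A noisy baseline signal with one injected spike at index 30.
detector = StreamingZScoreDetector(window_size=20, threshold=4.0)
readings = [10 + 0.5 * math.sin(i) for i in range(30)] + [25.0] \
           + [10 + 0.5 * math.sin(i) for i in range(30, 34)]
flags = [detector.update(v) for v in readings]
print([i for i, f in enumerate(flags) if f])  # → [30]
```

Note how the spike itself, once absorbed into the window, temporarily inflates the standard deviation; real stream systems must decide whether to exclude confirmed anomalies from the model they adapt, which is part of the "remedying" step the abstract refers to.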