This document discusses using Spark Streaming to process IoT sensor data streamed from locomotives and detect potential issues. Sensor readings from locomotive wheels and tracks are ingested from Kafka and enriched with metadata from HBase, then analyzed with Spark Streaming to detect anomalies that indicate damage. Detected issues trigger alerts, and the underlying sensor readings can be visualized in Grafana. The architecture stores time-series data in HBase or OpenTSDB and indexes readings in Solr for querying. The goal is to proactively prevent railway accidents through real-time anomaly detection.
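To make the pipeline shape concrete, here is a minimal Scala sketch of the Kafka-to-Spark-Streaming leg. It assumes a hypothetical topic name (`sensor-readings`), a CSV payload layout, and an illustrative vibration threshold; the HBase enrichment, time-series write, and alerting steps are only indicated by comments, not implemented as in the original architecture.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object SensorAnomalyJob {
  // Assumed record layout: "locomotiveId,wheelId,timestampMillis,vibration"
  case class SensorReading(locomotiveId: String, wheelId: String, ts: Long, vibration: Double)

  def parse(line: String): SensorReading = {
    val Array(loco, wheel, ts, vib) = line.split(",")
    SensorReading(loco, wheel, ts.toLong, vib.toDouble)
  }

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SensorAnomalyJob")
    val ssc  = new StreamingContext(conf, Seconds(5)) // 5-second micro-batches

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "broker1:9092",          // placeholder broker address
      "key.deserializer"  -> classOf[StringDeserializer],
      "value.deserializer"-> classOf[StringDeserializer],
      "group.id"          -> "sensor-anomaly",
      "auto.offset.reset" -> "latest"
    )

    // Consume the (assumed) sensor topic directly from Kafka.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("sensor-readings"), kafkaParams))

    // Flag readings whose vibration exceeds an illustrative fixed threshold.
    val anomalies = stream
      .map(record => parse(record.value))
      .filter(_.vibration > 50.0)

    anomalies.foreachRDD { rdd =>
      rdd.foreachPartition { readings =>
        // In the full architecture, this is where each reading would be enriched
        // with HBase metadata, written to HBase/OpenTSDB and Solr, and alerted on.
        readings.foreach(r =>
          println(s"ALERT: ${r.locomotiveId}/${r.wheelId} vibration=${r.vibration}"))
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

A fixed threshold keeps the sketch short; a production job would more likely compare each reading against per-wheel baselines looked up during the HBase enrichment step.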