Spark Streaming enables processing of live data streams on top of Spark's core engine. It works by dividing the incoming stream into small batches, called micro-batches, each of which is represented as an RDD and processed by Spark's batch engine to produce a stream of results. This design provides fault tolerance, exactly-once processing semantics, and integration with other Spark libraries such as MLlib and GraphX.
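
As a minimal sketch of the micro-batch model, the Scala snippet below counts words arriving over a TCP socket. It assumes the standard spark-streaming dependency is on the classpath; the host, port, and 5-second batch interval are placeholder choices, not values from the text above.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingWordCount {
  def main(args: Array[String]): Unit = {
    // "local[2]" reserves one thread for the receiver and one for processing
    val conf = new SparkConf().setMaster("local[2]").setAppName("StreamingWordCount")

    // The batch interval: the live stream is sliced into 5-second micro-batches
    val ssc = new StreamingContext(conf, Seconds(5))

    // DStream of text lines read from a socket (host/port are placeholders)
    val lines = ssc.socketTextStream("localhost", 9999)

    // Each micro-batch is an RDD, so ordinary RDD-style transformations apply
    val counts = lines.flatMap(_.split(" "))
                      .map(word => (word, 1))
                      .reduceByKey(_ + _)

    counts.print()          // print a sample of each batch's result
    ssc.start()             // start receiving and processing data
    ssc.awaitTermination()  // block until the streaming job is stopped
  }
}
```

Because every transformation here (`flatMap`, `map`, `reduceByKey`) is the same operation available on batch RDDs, the same code style carries over from batch jobs, which is what makes the integration with the rest of the Spark APIs straightforward.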