You would use Apache Spark Streaming when you need to process real-time and historical data with the same engine: it ingests streams as micro-batches, so the logic you write for the live stream can be reused for batch analytics over the accumulated data. A specific scenario is analyzing user activity logs as they arrive, for example to surface trending topics, while also running deeper batch analytics on the same data to uncover longer-term user behavior patterns.
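As a rough illustration, here is a minimal Spark Streaming sketch using the Java DStream API that counts trending topics from activity log lines arriving on a socket. The host and port, the 10-minute window, and the assumption that the topic is the third whitespace-separated field of each log line are all illustrative choices, not part of any real system.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import scala.Tuple2;

public class ActivityLogTrends {
    public static void main(String[] args) throws InterruptedException {
        // Local streaming context with a 10-second micro-batch interval
        SparkConf conf = new SparkConf().setAppName("ActivityLogTrends").setMaster("local[2]");
        JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(10));

        // Hypothetical source: user activity log lines arriving on a local socket
        JavaReceiverInputDStream<String> logs = ssc.socketTextStream("localhost", 9999);

        // Assume the topic is the third whitespace-separated field of each log line
        JavaDStream<String> topics = logs.map(line -> line.split("\\s+")[2]);

        // Count topic mentions over a sliding 10-minute window, updated every 30 seconds
        JavaPairDStream<String, Integer> trending = topics
                .mapToPair(t -> new Tuple2<>(t, 1))
                .reduceByKeyAndWindow(Integer::sum, Durations.minutes(10), Durations.seconds(30));

        trending.print();

        ssc.start();
        ssc.awaitTermination();
    }
}
```

Because the stream is just a sequence of small RDD batches, the same parsing and counting code can be lifted into a regular Spark batch job over the archived logs, which is the "one engine for both" advantage described above.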
Apache Storm is preferable when latency is the priority: it processes events tuple by tuple rather than in micro-batches, so results are available within milliseconds, and a topology can scale out to handle millions of messages per second with complex event processing. A specific scenario is a financial trading platform where each incoming transaction must be analyzed the moment it arrives so that decisions such as risk checks or alerts can be made immediately.
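For comparison, here is a minimal Storm topology sketch in Java: a hypothetical spout simulates a trade feed and a bolt flags trades above a notional threshold, one tuple at a time. The field names, the threshold, the parallelism hints, and the local-cluster setup are assumptions for illustration only; a production spout would read from Kafka or a market-data gateway instead of generating random values.

```java
import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.spout.SpoutOutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.TopologyBuilder;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.topology.base.BaseRichSpout;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;
import org.apache.storm.utils.Utils;

import java.util.Map;
import java.util.Random;

public class TradeAlertTopology {

    // Hypothetical spout that simulates a continuous trade feed
    public static class TradeSpout extends BaseRichSpout {
        private SpoutOutputCollector collector;
        private Random random;

        @Override
        public void open(Map<String, Object> conf, TopologyContext context, SpoutOutputCollector collector) {
            this.collector = collector;
            this.random = new Random();
        }

        @Override
        public void nextTuple() {
            Utils.sleep(10); // throttle the simulated feed slightly
            String symbol = random.nextBoolean() ? "AAPL" : "MSFT";
            double notional = random.nextDouble() * 2_000_000.0;
            collector.emit(new Values(symbol, notional));
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("symbol", "notional"));
        }
    }

    // Bolt that flags individual trades exceeding an assumed risk threshold
    public static class RiskCheckBolt extends BaseBasicBolt {
        private static final double THRESHOLD = 1_000_000.0; // illustrative value

        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            String symbol = tuple.getStringByField("symbol");
            double notional = tuple.getDoubleByField("notional");
            if (notional > THRESHOLD) {
                // Emit an alert tuple immediately; no batching, so latency stays in milliseconds
                collector.emit(new Values(symbol, notional));
            }
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("symbol", "notional"));
        }
    }

    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("trade-spout", new TradeSpout(), 4);
        builder.setBolt("risk-check", new RiskCheckBolt(), 8)
               .shuffleGrouping("trade-spout");

        Config conf = new Config();
        conf.setNumWorkers(2);

        // Run locally for a minute as a demonstration, then shut down
        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("trade-alerts", conf, builder.createTopology());
        Thread.sleep(60_000);
        cluster.shutdown();
    }
}
```

The key design point is that each trade flows through the bolt the instant the spout emits it, which is what gives Storm its per-event latency; the trade-off versus Spark Streaming is that you give up the convenience of reusing the same code for batch analytics.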