You would use Airflow when you need complex data pipelines with dynamic dependencies and a high degree of customization. For instance, if your ETL jobs vary based on upstream data availability and require conditional branching, Airflow's Python-defined DAGs let you express and orchestrate that logic directly. In contrast, Oozie is more rigid and better suited to static workflows that rarely change.
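As a rough illustration of the branching idea, here is a minimal plain-Python sketch of the kind of runtime decision Airflow expresses with its branch operators (in Airflow itself this would be a DAG with a `BranchPythonOperator`); the task names, the row-count threshold, and the helper functions are all hypothetical:

```python
# Hypothetical sketch of conditional branching in a pipeline:
# which downstream task runs is decided at runtime from upstream data.
# All names and the 10,000-row threshold are illustrative, not Airflow APIs.

def check_upstream(new_rows: int) -> str:
    """Pick a branch based on how much upstream data arrived."""
    return "full_etl" if new_rows > 10_000 else "incremental_etl"

def full_etl() -> str:
    return "ran full reload"

def incremental_etl() -> str:
    return "ran incremental load"

# Registry mapping branch names to tasks, standing in for a DAG's task set.
TASKS = {"full_etl": full_etl, "incremental_etl": incremental_etl}

def run_pipeline(new_rows: int) -> str:
    branch = check_upstream(new_rows)  # branching decision at runtime
    return TASKS[branch]()             # only the chosen branch executes

print(run_pipeline(50_000))  # large batch -> full reload
print(run_pipeline(200))     # small batch -> incremental load
```

In real Airflow the same decision function would return a downstream task ID, and the scheduler would skip the branches not chosen.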
You would use Apache Oozie when you have a Hadoop-centric workflow built around batch processing jobs with dependencies, especially when those jobs are tightly integrated with HDFS. For instance, when a data pipeline consists of multiple MapReduce jobs that must run in a specific order, Oozie can orchestrate them declaratively, handling ordering, failure transitions, and retries for you.
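To make the "multiple MapReduce jobs in order" case concrete, here is a hedged sketch of an Oozie `workflow.xml` with two chained map-reduce actions; the workflow name, HDFS paths, and retry settings are illustrative assumptions, not taken from any real deployment:

```xml
<!-- Sketch only: names, paths, and retry values are hypothetical. -->
<workflow-app name="daily-etl" xmlns="uri:oozie:workflow:0.5">
  <start to="clean-raw"/>

  <!-- First MapReduce job; must finish before aggregation starts. -->
  <action name="clean-raw" retry-max="3" retry-interval="5">
    <map-reduce>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>mapred.input.dir</name>
          <value>/data/raw/${date}</value>
        </property>
        <property>
          <name>mapred.output.dir</name>
          <value>/data/clean/${date}</value>
        </property>
      </configuration>
    </map-reduce>
    <ok to="aggregate"/>
    <error to="fail"/>
  </action>

  <!-- Second MapReduce job, consuming the first job's output. -->
  <action name="aggregate">
    <map-reduce>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>mapred.input.dir</name>
          <value>/data/clean/${date}</value>
        </property>
        <property>
          <name>mapred.output.dir</name>
          <value>/data/agg/${date}</value>
        </property>
      </configuration>
    </map-reduce>
    <ok to="done"/>
    <error to="fail"/>
  </action>

  <kill name="fail">
    <message>ETL step failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="done"/>
</workflow-app>
```

The `<ok>`/`<error>` transitions encode the job ordering and failure path, and the retry attributes let Oozie re-run a flaky action before giving up.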