In Data Engineer’s Lunch #25: Airflow and Spark, we discuss how we can use Airflow to manage Spark jobs. The live recording of the Data Engineer’s Lunch, which includes a more in-depth discussion, is also embedded below in case you were not able to attend live. If you would like to attend a Data Engineer’s Lunch live, it is hosted every Monday at noon EST. Register here now!
In this session, we discuss how to connect Airflow and Spark, and how to manage and schedule Spark jobs using Airflow. If you are not familiar with Airflow, check out some of our previous Data Engineer’s Lunches below that cover it. Additionally, you can find the rest of the Data Engineer’s Lunch YouTube playlist here!
We have a walkthrough below, which you can use to learn how to quickly spin up Airflow and Spark, connect them, and then use Airflow to run the Spark jobs. We used Gitpod as our dev environment so that you can quickly learn and test without having to worry about OS inconsistencies. You can also do the walkthrough using this GitHub repo! As mentioned above, the live recording is embedded below if you want to watch the walkthrough live.
If you have not already opened this in Gitpod, then hit this link to get started!
We will be using the quick start script that Airflow provides here.
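For reference, the quick start boils down to something like the following sketch. The version pin is illustrative, not the exact one the script uses, so check the Airflow documentation for current values:

```shell
# Sketch of the Airflow quick start (the Airflow version here is an
# illustrative assumption -- check the docs for the current release)
export AIRFLOW_HOME=~/airflow

AIRFLOW_VERSION=2.3.0
PYTHON_VERSION="$(python3 --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"

# Install Airflow against the official constraint file so the
# dependency set stays consistent
pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"

# Initialize the metadata DB, create an admin user, and start the
# webserver and scheduler in a single process
airflow standalone
```

The constraint file is the important detail here: installing Airflow without it can pull in incompatible dependency versions.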
Open port 8081 in the browser, copy the master URL, and paste it in the designated spot below.
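If you are working outside the Gitpod setup and need to bring Spark up yourself, a standalone master and worker can be started roughly like this. This is a sketch that assumes a Spark 3.x distribution unpacked at `$SPARK_HOME`; the hostname and port in the master URL are placeholders:

```shell
# Start a standalone Spark master with its web UI on port 8081
# (assumes $SPARK_HOME points at an unpacked Spark 3.x distribution)
"$SPARK_HOME/sbin/start-master.sh" --webui-port 8081

# Attach a worker to the master; replace the URL below with the
# spark://... URL shown in the master's web UI
"$SPARK_HOME/sbin/start-worker.sh" spark://localhost:7077
```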
mv spark_dag.py ~/airflow/dags
If the example_spark_operator DAG does not appear in the Airflow UI yet, give it a few seconds to refresh. Once it appears, drill down by clicking on its name. Next, open the Admin section of the menu and select Connections. Find the spark_default connection, update its host to the Spark master URL you copied earlier, and save once done.
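The same connection change can also be made from the Airflow CLI instead of the UI. The master URL below is a placeholder for the one you copied from the Spark web UI:

```shell
# Recreate spark_default pointing at our standalone master
# (spark://localhost:7077 is a placeholder -- use your copied master URL)
airflow connections delete spark_default
airflow connections add spark_default \
    --conn-type spark \
    --conn-host spark://localhost:7077
```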
Click the DAGs menu item to return to the dashboard. Unpause example_spark_operator, and then click the trigger button to start a run. In the task logs, we should see the value of Pi that each job calculated, with the two numbers differing slightly between the Python and Scala jobs.
Alternatively, we can trigger the DAG, or run an individual task, straight from the Airflow CLI:
airflow dags trigger example_spark_operator
airflow tasks run example_spark_operator python_job now
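Either way we trigger it, we can check on the run from the CLI as well. A quick sketch using the `dags list-runs` subcommand:

```shell
# List recent runs of the example DAG along with their states
airflow dags list-runs -d example_spark_operator
```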
And that wraps up our basic walkthrough on using Airflow to manage Spark jobs. Again, the live recording of Data Engineer’s Lunch #25: Airflow and Spark is embedded below, so if you want to watch the walkthrough live, be sure to check it out!
Cassandra.Link is a knowledge base that we created for all things Apache Cassandra. Our goal with Cassandra.Link was to not only fill the gap of Planet Cassandra but to bring the Cassandra community together. Feel free to reach out if you wish to collaborate with us on this project in any capacity.
We are a technology company that specializes in building business platforms. If you have any questions about the tools discussed in this post or about any of our services, feel free to send us an email!