I have to trigger a PySpark module from Airflow using a SparkSubmitOperator. The PySpark module needs to take the Spark session variable as an argument, so I used application_args to pass the parameter to the module. But when I run the DAG, the SparkSubmitOperator task fails and the parameter I passed is treated as a NoneType variable. I need to know how to pass an argument to a PySpark module triggered through SparkSubmitOperator.
The DAG code is below:
# Imports added for completeness; the original snippet omitted them.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator
from pyspark.sql import SparkSession

# Spark session created at DAG-definition time; this is the object
# I am trying to hand over to the PySpark module.
spark = SparkSession.builder.appName("PRJT").enableHiveSupport().getOrCreate()

spark_config = {
    'conn_id': 'spark_default',
    'driver_memory': '1g',
    'executor_cores': 1,
    'num_executors': 1,
    'executor_memory': '1g'
}

# default_args was not shown in the original post; minimal placeholder values:
default_args = {
    'owner': 'airflow',
    'start_date': datetime(2021, 1, 1),
}

dag = DAG(
    dag_id="spark_session_prgm",
    default_args=default_args,
    schedule_interval='@daily',
    catchup=False)

spark_submit_task1 = SparkSubmitOperator(
    task_id='spark_submit_task1',
    application='/home/airflow_home/dags/tmp_spark_1.py',
    # application_args is a list of strings passed to the script on the
    # command line, so 'spark' is delivered as a literal string.
    application_args=['spark'],
    **spark_config, dag=dag)
The sample code in the tmp_spark_1.py program:
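A minimal sketch along the lines described above, assuming the script picks the argument up from sys.argv (the variable name session_arg is illustrative, not taken from the original file):

import sys
from pyspark.sql import SparkSession

# application_args values arrive as plain command-line strings,
# so the DAG's 'spark' shows up here as the text "spark".
session_arg = sys.argv[1] if len(sys.argv) > 1 else None
print("Received argument:", session_arg, type(session_arg))

# Illustrative: the script builds its own session rather than
# receiving a live SparkSession object from the operator.
spark = SparkSession.builder.appName("PRJT").enableHiveSupport().getOrCreate()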