You should be able to use BashOperator. Keeping the rest of your code as is, import the required class and system packages:
from airflow.operators.bash_operator import BashOperator
import os
set the required paths so that the shell started by BashOperator can find spark-submit:
os.environ['SPARK_HOME'] = '/path/to/spark/root'
# prepend Spark's bin directory to PATH so the bash command can resolve spark-submit
os.environ['PATH'] = os.path.join(os.environ['SPARK_HOME'], 'bin') + os.pathsep + os.environ['PATH']
and add operator:
spark_task = BashOperator(
    task_id='spark_java',
    bash_command='spark-submit --class {{ params.class }} {{ params.jar }}',
    params={'class': 'MainClassName', 'jar': '/path/to/your.jar'},
    dag=dag
)
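If the dag object is not already defined in your file, a minimal definition could look like the sketch below; it has to appear before the operator that references it, and the dag_id, start date, and schedule are placeholders to adjust:

from datetime import datetime
from airflow import DAG

dag = DAG(
    dag_id='spark_java_example',      # placeholder name
    start_date=datetime(2017, 1, 1),  # adjust to your deployment
    schedule_interval=None            # run only when triggered; set a cron string if needed
)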
You can easily extend this to provide additional arguments using Jinja templates.
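For example, extra spark-submit options and application arguments can be fed through the same params dictionary; the master and app_args values below are illustrative, not something required by your setup:

spark_task = BashOperator(
    task_id='spark_java',
    bash_command='spark-submit --class {{ params.class }} '
                 '--master {{ params.master }} '
                 '{{ params.jar }} {{ params.app_args }}',
    params={
        'class': 'MainClassName',
        'master': 'yarn',                # illustrative cluster manager
        'jar': '/path/to/your.jar',
        'app_args': 'arg1 arg2'          # arguments passed to your main()
    },
    dag=dag
)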
You can of course adjust this for a non-Spark scenario by replacing bash_command
with a template suitable for your case, for example:
bash_command = 'java -jar {{ params.jar }}'
and adjusting params.
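Put together, a non-Spark task could then look like this (the task id and jar path are placeholders):

java_task = BashOperator(
    task_id='run_jar',
    bash_command='java -jar {{ params.jar }}',
    params={'jar': '/path/to/your.jar'},
    dag=dag
)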