
Conversation

@yesemsanthoshkumar (Contributor) commented Aug 15, 2020

Add extra links for Google Cloud operators - Dataproc

How it is implemented

  1. The Dataproc operator pushes the job details to XCom
  2. The value pushed to XCom is fetched to construct the link to the job (see the sketch below)

Note: I haven't worked with Dataproc workflows, so I haven't implemented links for the workflow-related operators.
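
Roughly, the idea looks like this. This is a minimal sketch for illustration, not the exact code in this PR: the XCom key `dataproc_job` and the console URL format are placeholders.

```python
from airflow.models import XCom
from airflow.models.baseoperator import BaseOperatorLink

# Placeholder console URL format, used for illustration only.
DATAPROC_JOB_URL = (
    "https://console.cloud.google.com/dataproc/jobs/{job_id}"
    "?region={region}&project={project_id}"
)


class DataprocJobLink(BaseOperatorLink):
    """Extra link pointing at the Dataproc job in the GCP console."""

    name = "Dataproc Job"

    def get_link(self, operator, dttm):
        # Fetch the details the operator pushed to XCom during execute().
        conf = XCom.get_one(
            execution_date=dttm,
            dag_id=operator.dag_id,
            task_id=operator.task_id,
            key="dataproc_job",  # hypothetical XCom key, for illustration
        )
        if not conf:
            return ""
        return DATAPROC_JOB_URL.format(
            job_id=conf["job_id"],
            region=conf["region"],
            project_id=conf["project_id"],
        )
```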



Read the Pull Request Guidelines for more information.
In case of fundamental code change, Airflow Improvement Proposal (AIP) is needed.
In case of a new dependency, check compliance with the ASF 3rd Party License Policy.
In case of backwards incompatible changes please leave a note in UPDATING.md.

@boring-cyborg bot added the provider:google label (Google, including GCP, related issues) on Aug 15, 2020
@boring-cyborg bot commented Aug 15, 2020

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
Here are some useful points:

  • Pay attention to the quality of your code (flake8, pylint and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide, and consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it's a heavy Docker image, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
Apache Airflow is a community-driven project and together we are making it better 🚀.
In case of doubts contact the developers at:
Mailing List: [email protected]
Slack: https://apache-airflow-slack.herokuapp.com/

@yesemsanthoshkumar marked this pull request as draft on August 15, 2020 14:43
@yesemsanthoshkumar (Contributor, Author) commented:

@mik-laj
I've added an extra link for Dataproc jobs.

However, I have the following questions.

  1. While running the pre-commit hooks, I see both pylint and flake8 running checks. I've disabled the line-too-long error in both for some lines inside the files. Why do we have both of them?
  2. I'm directing the link to the cluster if the job hasn't run and hasn't generated any JobId yet (sketched below). Is this approach valid? Do you want me to change this behaviour?
  3. I couldn't find the cluster_name argument in DataprocSubmitJobOperator, for which I've added the links. Should I add the links to DataprocJobBaseOperator? Or should I add them on both?
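
To make question 2 concrete, this is roughly what I mean; the URL formats and dictionary keys here are placeholders, not the code in this PR:

```python
# Placeholder console URL formats, for illustration only.
DATAPROC_JOB_URL = (
    "https://console.cloud.google.com/dataproc/jobs/{job_id}"
    "?region={region}&project={project_id}"
)
DATAPROC_CLUSTER_URL = (
    "https://console.cloud.google.com/dataproc/clusters/{cluster_name}"
    "?region={region}&project={project_id}"
)


def build_link(conf):
    """Return the job link, or fall back to the cluster link if no JobId exists yet."""
    if conf.get("job_id"):
        return DATAPROC_JOB_URL.format(
            job_id=conf["job_id"],
            region=conf["region"],
            project_id=conf["project_id"],
        )
    return DATAPROC_CLUSTER_URL.format(
        cluster_name=conf["cluster_name"],
        region=conf["region"],
        project_id=conf["project_id"],
    )
```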