[Feature] Support Passing in {{ target.user }} for BQ Query Comments #1110
Comments
This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please comment on the issue or else it will be closed in 7 days.
Please don't make it stale
Hello - I'm going to close out this issue because this is neither a bug nor a feature request. Please ask these questions in our community sites like Discourse or Slack. As an aside, you might want to take a look at environment variables though - that might work since you need to supply this via a yml file config.
@amychen1776 could you please let me know what I should do to make this issue a feature request?
@iamtodor were you able to get it working with an env_var?
@amychen1776 nope, I wasn't
I'll adjust the issue then
@amychen1776 thank you!
@iamtodor My coworker just flagged that there is a way around this using a macro, which would be the way to get target.user: https://docs.getdbt.com/reference/project-configs/query-comment#advanced-use-a-macro-to-generate-a-comment
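For anyone landing here later, a minimal sketch of that macro approach, adapted loosely from the linked doc (the macro name, file path, and dictionary keys here are assumptions, not verified against a BigQuery profile): point `query-comment` at the macro in `dbt_project.yml`, e.g. `query-comment: "{{ query_comment(node) }}"`, and define the macro along these lines:

```sql
-- macros/query_comment.sql -- sketch only: build the query comment (and, with
-- job-label: true, the BigQuery job labels) from the target, including the user
{% macro query_comment(node) %}
    {%- set comment_dict = {
        "app": "dbt",
        "dbt_user": target.get("user", "unknown"),
        "target_name": target.name
    } -%}
    {{ return(tojson(comment_dict)) }}
{% endmacro %}
```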
Is this your first time submitting a feature request?
Describe the feature
Hey folks!
I have a question regarding the BigQuery job comment (job-label). We recently upgraded to 1.4.9, and I would like to utilize this API: https://docs.getdbt.com/reference/project-configs/query-comment#bigquery-include-query-comment-items-as-job-labels. I walked through the doc and was able to get the default behavior with the config:
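Something like the following (a sketch of the default setup from the linked doc section):

```yaml
# dbt_project.yml -- default query comment, attached to BigQuery jobs as labels
query-comment:
  job-label: true
```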
Our pipeline works like this: an Airflow task calls dbt run from a Python operator, and we obviously have plenty of such tasks in the DAG. The business task is to gather how many bytes are processed per hour per DAG. For this I use BigQuery's built-in region-us.INFORMATION_SCHEMA.JOBS, and this is where labels come into play. As mentioned, the Python operator runs the dbt task, and from that Python task I am able to find out the DAG name. Hence I could put the DAG name into the dbt run command the same way we pass vars, for instance. So I would like to ask/understand whether it's possible to get that value into the job labels, and if so, how?
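For illustration, the outcome I'm hoping for would look roughly like this (dag_name is a made-up var name the Airflow operator would pass, e.g. `dbt run --vars '{dag_name: my_dag}'`; I don't know yet whether query-comment can pick it up this way):

```yaml
# dbt_project.yml -- hypothetical sketch: read a var passed on the command line
# and surface it as a BigQuery job label
query-comment:
  comment: '{{ var("dag_name", "unknown") }}'
  job-label: true
```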
I found that I can also append a custom comment, as described in https://docs.getdbt.com/reference/project-configs/query-comment#append-a-custom-comment, with a config along the lines of the sketch below; but dbt knows nothing about Airflow, while Airflow knows about dbt.
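A sketch of that append style, with the value coming from an environment variable the Airflow task could export (AIRFLOW_DAG_NAME is a made-up name; this is just an illustration of the env_var idea, not something I have working):

```yaml
# dbt_project.yml -- hypothetical sketch: append a comment built from an
# environment variable exported by the Airflow task
query-comment:
  comment: 'dag: {{ env_var("AIRFLOW_DAG_NAME", "unknown") }}'
  append: true
  job-label: true
```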
If I missed something, I would be glad to clarify. And if I am moving in the wrong direction, I would be pleased to be corrected :)
I asked the same question in Slack (https://getdbt.slack.com/archives/C99SNSRTK/p1708089270625579), but it got no attention.
Describe alternatives you've considered
Seems like there are no alternatives
Who will this benefit?
BigQuery users who would like to distinguish jobs per DAG
Are you interested in contributing this feature?
There is a chance, provided all the context and pitfalls are shared
Anything else?
Nope