Seth Saperstein
12/25/2021, 5:21 AM
dbt run --select <model>+
However, to get the raw model into my dbt project, I don't love the suggested workflow of hopping into the Airbyte container, finding the normalization directory, grabbing the generated dbt models, syncing them back to a dbt repo, and then pointing the Airbyte job configuration at that repo (a rough sketch of that manual export follows below).
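A minimal sketch of that manual export, assuming a Docker deployment; the container name and the workspace path are assumptions and will differ per setup:
```
# Copy the dbt models that Airbyte's normalization generated inside the
# container back into a local dbt repo.
# CONTAINER and NORMALIZE_DIR are assumptions -- verify them in your deployment.
CONTAINER=airbyte-server
NORMALIZE_DIR="/tmp/workspace/<workspace_id>/<attempt_id>/normalize"

docker cp "$CONTAINER:$NORMALIZE_DIR/models" ./my-dbt-repo/models/airbyte_generated
git -C ./my-dbt-repo add models/airbyte_generated
git -C ./my-dbt-repo commit -m "Sync Airbyte-generated models"
```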
Has anyone found a better way of exporting dbt models? I'm also planning on running on dbt Cloud, and this configuration means that I cannot "deploy" models via dbt Cloud when the source dataset changes. Alternatively, I could trigger the dbt Cloud API, but that isn't possible directly from Airbyte, which means I would have to use Airflow to schedule the Airbyte job and then kick off the dbt Cloud job (see the sketch below). That means Airflow code must be written for each new data source, and I'm trying to keep data integration and normalization self-service to speed up development time for new datasets.
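For reference, the dbt Cloud trigger mentioned above is a single call to the v2 API, which is what an Airflow task would wrap; ACCOUNT_ID, JOB_ID, and DBT_CLOUD_TOKEN are placeholders:
```
# Trigger a dbt Cloud job over the v2 API.
# ACCOUNT_ID, JOB_ID, and DBT_CLOUD_TOKEN are placeholders for your account.
curl -s -X POST \
  "https://cloud.getdbt.com/api/v2/accounts/$ACCOUNT_ID/jobs/$JOB_ID/run/" \
  -H "Authorization: Token $DBT_CLOUD_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"cause": "Triggered after Airbyte sync"}'
```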
If anyone has suggestions, I'm all ears.

Zawar khan
12/27/2021, 2:24 PM
dbt run --select <model>+
• Optional: Have the dbt command trigger via a hook into your dbt Cloud deployment rather than the GitHub repo (so that dbt run history is in a unified view and dbt Cloud alerting benefits are realized)
docker cp
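One way to keep that unified view, as a hedged sketch: the trigger call above returns the run id in its response, and the same API can be polled for that run's status; ACCOUNT_ID, RUN_ID, and DBT_CLOUD_TOKEN are placeholders:
```
# Check the status of the run returned by the trigger call above,
# so run history and alerting stay in dbt Cloud.
# ACCOUNT_ID, RUN_ID, and DBT_CLOUD_TOKEN are placeholders.
curl -s \
  "https://cloud.getdbt.com/api/v2/accounts/$ACCOUNT_ID/runs/$RUN_ID/" \
  -H "Authorization: Token $DBT_CLOUD_TOKEN"
```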