# connector-development
What is required for my custom destination to be able to support dbt transformations?
My destination writes records with the Hasura GraphQL Mutations API, which hits an underlying postgres instance. To enable dbt transformations for this destination, I assume the destination needs to have the underlying postgres credentials as well. How do I pass these credentials to a dbt project? (profiles.yml and dbt_project.yml files?)
Hi @Christopher Wu, if your destination is supported by a dbt adapter, it will support custom transformations.
You have to set `supportsDBT: true` in your connector spec.
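As a minimal sketch, the flag sits at the top level of the connector's `spec.json` (the URL and connection properties below are placeholders, not real values for any particular destination):

```json
{
  "documentationUrl": "https://docs.airbyte.io/integrations/destinations/example",
  "supportsDBT": true,
  "connectionSpecification": {
    "type": "object",
    "properties": {}
  }
}
```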
And yes, the connection configuration is stored in the dbt `profiles.yml` file.
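For reference, a dbt `profiles.yml` pointing at an underlying postgres instance might look like the sketch below. The profile name, host, and credential values are placeholders, not anything Airbyte generates:

```yaml
# Hypothetical profiles.yml for a postgres target; all values are placeholders.
my_profile:
  target: prod
  outputs:
    prod:
      type: postgres
      host: your-hasura-postgres-host
      port: 5432
      user: airbyte_user
      # Reading the password from an environment variable avoids committing secrets.
      pass: "{{ env_var('DBT_PASSWORD') }}"
      dbname: hasura
      schema: public
```

The profile name (`my_profile` here) must match the `profile:` key in the project's `dbt_project.yml`.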
What do I have to do to get my destination included in the adapter? My destination isn’t specifically a postgres destination, but I can update it to include the underlying Hasura postgres credentials as required params in its spec, and any other parameters required for dbt to run
Or do I need to build a new dbt adapter?
I would like to use the postgres adapter if possible
For custom transformation I don't think you need to do anything on the connector side, as it's the custom dbt project that will hold the logic of connecting to the destination database. @Chris Duong [Airbyte] do you confirm?
Who is responsible for writing the `profiles.yml` file? If I understand Airbyte correctly, the normalization process writes it, but what if normalization is disabled? I only want to integrate dbt, not Airbyte's built-in normalization.
Custom transformation operations use some code written in normalization to generate `profiles.yml` for dbt. This is achieved by the flag `supportsDBT: true`.
Yes, but there is also a piece in the NormalizationRunnerFactory that determines the right params to send to normalization for doing that… so it's not just a matter of enabling the flag in the spec.
Ok so I would need to update the NormalizationRunnerFactory to support my new destination, correct? Since the process for that may take a while and we need something more immediate, can I include the profiles.yml in the dbt project github repo instead, or in a custom dbt docker image? There won’t be any production credentials in it for now. Will Airbyte fail to launch my dbt project if it cannot run the normalization runner on my destination?
Yes, you can include the `profiles.yml` in your git repo; then, in the arguments of the dbt CLI in your custom transformation, you can specify where to get the `profiles.yml` from with the `--profiles-dir` flag.

But I'm not sure whether not finding the normalization image for your custom destination would be a problem or not… you could give it a try and file a GitHub issue otherwise, as I'm not sure about custom transformations with custom destinations.

Another good issue to create is to be able to specify the adequate normalization image to use for your destination in the destination's spec (instead of hard-coded in NormalizationRunnerFactory). That way your Hasura GraphQL Mutations API destination could tell that it would be using normalization-postgres in order to generate the `profiles.yml`.
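As a sketch, assuming the `profiles.yml` is committed at the root of the dbt project repo, the custom-transformation dbt arguments could point at it like this (the directory layout is hypothetical):

```shell
# Use the profiles.yml committed alongside the dbt project,
# instead of the one Airbyte's normalization would generate.
dbt run --profiles-dir . --project-dir .
```

`--profiles-dir` takes the directory containing `profiles.yml`, not the file itself.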
```
java.lang.IllegalStateException: Requested normalization for farosai/airbyte-faros-destination:0.1.37, but it is not included in the normalization mappings.
```
Looks like the destination has to be included in the normalization mapping. This is somewhat inconvenient, as now the usability of the destination is tied to the Airbyte release version. It would be great if I could specify the normalization-postgres image like you said^.
In your opinion, which option would have a faster (or safer) pathway to release? Updating the normalization mapping to support my destination, or adding an option to specify an existing normalization mapping name?
If I update the normalization mapping myself, how soon would it be included in a subsequent Airbyte release?
Can you add a comment describing your use case to this ticket? https://github.com/airbytehq/airbyte/issues/7229 Maybe @Marcos Marx (Airbyte), do you have a better answer on the timing/release process of contributions?