I need help locating the appropriate dbt file used ...
# ask-ai
q
I need help locating the appropriate dbt file used for normalization by Airbyte. For now, I want to ensure that the dbt Airbyte runs works for my use case. As it stands, if I turn Basic Normalization on, everything works except that I get errors on particular rows in a particular table; the errors occur because a value in one column is not a valid timestamp. What I want to achieve is to customize the dbt SQL for that particular model by adding CASE conditions. The challenge I face is that the workspace/number/number path changes for each run of the connection in the Airbyte UI, so I need guidance on how to access the appropriate file used by dbt. Thanks.

Airbyte version: 0.44.3

Error log from Airbyte:
Invalid timestamp: '+12699-01-07T18:00:26.673000Z'
  compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/airbyte_bigquery/mytablename_scd.sql
40 of 63 ERROR creating incremental model airbyte_bigquery.mytablename_scd....................................... [ERROR in 6.40s]
Database Error in model tbl_purchase_order_scd (models/generated/airbyte_incremental/scd/airbyte_bigquery/mytablename_scd.sql)
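The failing value is outside the range BigQuery's TIMESTAMP type supports (years 0001 through 9999), so a plain CAST raises an error for that row. As a minimal illustration of the kind of CASE/SAFE_CAST guard the question describes (this is not part of Airbyte's generated SQL), SAFE_CAST turns an unparseable value into NULL instead of failing:

```sql
-- Illustration only: BigQuery TIMESTAMP covers years 0001-9999, so a value like
-- '+12699-01-07T18:00:26.673000Z' cannot be cast directly and the model errors out.
-- SAFE_CAST returns NULL instead of raising an error, which is one way to guard such rows.
SELECT
  CAST('2023-01-07T18:00:26.673000Z' AS TIMESTAMP)         AS valid_ts,
  SAFE_CAST('+12699-01-07T18:00:26.673000Z' AS TIMESTAMP)  AS out_of_range_ts;  -- NULL
```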
k
A message from kapa.ai
To locate and customize the appropriate dbt file used for normalization by Airbyte, you can follow these steps:

1. Export the dbt normalization project outside Airbyte by following the instructions in this [guide](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-dbt#exporting-dbt-normalization-project-outside-airbyte).
2. Once you have exported the project, you can find the generated dbt files in the `normalize` directory. The path should look like this: `tutorial/normalization-files/normalize/`.
3. Locate the specific dbt file for the model you want to customize (e.g., `mytablename_scd.sql`).
4. Edit the dbt file to add your custom CASE conditions to handle the invalid timestamp issue (a sketch of what such an edit might look like follows below).
5. Push your custom dbt transformation project back to Airbyte by following the instructions in this [guide](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-airbyte).

Please note that if you make changes to the connection (e.g., adding or removing tables), you'll need to repeat the "Export the generated dbt code from Airbyte" step to get the updated code for your new tables, as mentioned in this [tutorial](https://airbyte.com/tutorials/bigquery-partition-cluster).

Keep in mind that this information is based on Airbyte version 0.23.0-alpha from May 2021, and there might be some differences in the latest version (0.44.3). However, the general process should still be applicable.
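As a hedged sketch of step 4: the exported model's SELECT could wrap the failing column in a CASE/SAFE_CAST guard so out-of-range values become NULL instead of aborting the run. The `ref()` target and column names below are placeholders for illustration, not the names Airbyte actually generates; copy the real ones from the exported `mytablename_scd.sql` and change only the expression for the failing column:

```sql
-- Hypothetical excerpt of an edited mytablename_scd.sql; all names are placeholders.
select
    id,
    case
        -- keep values BigQuery can represent, NULL out anything it cannot parse
        when safe_cast(updated_at as timestamp) is not null
            then cast(updated_at as timestamp)
        else cast(null as timestamp)
    end as updated_at
from {{ ref('my_intermediate_model') }}
```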
q
I followed the first part of the guide, exporting the dbt model, but for the past 5 minutes there has been no output. See the attached image.
a
Hey Qamarudeen! I can't help with this exact question, but I wanted to bring your attention to this upcoming release, https://github.com/airbytehq/airbyte/issues/26028, which may help with your specific issue of normalization failing on particular rows with bad data. Feel free to sign up for the upcoming beta if you want to learn more.