# plugins-dbt
  • p

    pablo_azofeifa

    01/26/2024, 3:36 PM
    Hello everybody! I tried to use a model with dbt-postgres (it's a pipeline that uses tap-csv and target-postgres), but I get a syntax error in VS Code. I used the incremental models from the dbt docs (https://docs.getdbt.com/docs/build/incremental-models) as an example, but it doesn't work. What could be the problem? This is the model that I made:
    Copy code
    {{
        config(
            materialized = 'incremental',
            unique_key = 'id',
            incremental_strategy = 'delete+insert',
            merge_update_columns = ['firstname', 'lastname', 'email', 'profession']
        )
    }}
    
    
    with source as (
        select * from Employees
    )
    
    select * from source
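    For reference, a minimal sketch of the incremental pattern from the dbt docs, adapted to this model. The source name, the updated_at column, and the source declaration are assumptions; also note that merge_update_columns is documented for the merge strategy rather than delete+insert.
    {{
        config(
            materialized = 'incremental',
            unique_key = 'id',
            incremental_strategy = 'delete+insert'
        )
    }}

    -- 'employees' is assumed to be the table loaded by tap-csv / target-postgres,
    -- declared as a dbt source in a schema .yml so the reference is explicit
    with source as (
        select * from {{ source('public', 'employees') }}
    )

    select * from source

    {% if is_incremental() %}
      -- only process rows newer than what is already in this model
      -- (assumes an updated_at column exists)
      where updated_at > (select max(updated_at) from {{ this }})
    {% endif %}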
    e
    • 2
    • 2
  • m

    Mindaugas Nižauskas

    01/30/2024, 5:44 AM
    Hi, I would like to contribute and create dbt-athena. Created a PR, but not sure if that is enough 🤷. Could someone from Meltano / Arch 👀? Thank you
    ✅ 1
    e
    • 2
    • 1
  • p

    pablo_azofeifa

    02/08/2024, 9:50 PM
    Hello everybody! I'm trying to use dbt-bigquery, but I get this error (attached image). This is the model I'm trying to use:
    Copy code
    WITH updated_data AS (
      SELECT
        *
      FROM Employees
      
    )
    
    SELECT
      *
    FROM "projectid.dataset.Employees"
    UNION ALL
    SELECT * FROM updated_data
    I searched for this problem and found that a possible fix is to change the "" quoting to `` backticks, but that doesn't work either. What can I change, or what could be the problem?
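    A hedged sketch of how this is usually written on BigQuery: backticks instead of double quotes, or better, a declared dbt source so the project and dataset come from the profile. The source and table names below are assumptions.
    with updated_data as (
      -- assumes a source named 'dataset' with table 'Employees' declared in a schema .yml
      select * from {{ source('dataset', 'Employees') }}
    )

    -- if a literal reference is needed instead, BigQuery expects backticks,
    -- e.g. `projectid.dataset.Employees`, not double quotes
    select * from {{ source('dataset', 'Employees') }}
    union all
    select * from updated_data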
    e
    • 2
    • 1
  • i

    Ian

    02/20/2024, 8:27 PM
    When you're developing a model in Snowflake or any other warehouse, do you start by getting your query logic fully figured out in your data warehouse interface (SSMS/DBeaver/whatever) and then just copy that code over to your dbt model? Or do you do everything through the dbt command line?
    m
    • 2
    • 1
  • i

    Ian OLeary

    03/01/2024, 6:33 PM
    Is there a way to specify the
    target
    directory location for dbt-snowflake to use for the manifest.json it creates? It's making it in .meltano/transformers/dbt/target/manifest.json right now. Should I be using that dir? Or make one in $MELTANO_PROJECT_ROOT/transform/target along with my dbt_project.yml, models, etc...
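    A hedged sketch of the dbt-side setting, assuming your dbt-ext version doesn't override it: target-path in dbt_project.yml is resolved relative to the dbt project directory, and newer dbt releases also accept a --target-path flag / DBT_TARGET_PATH env var.
    # transform/dbt_project.yml (excerpt)
    target-path: "target"   # resolves relative to the project dir, i.e. transform/target here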
    j
    • 2
    • 10
  • i

    Ian OLeary

    04/23/2024, 7:57 PM
    Copy code
    AttributeError: 'str' object has no attribute 'items'
      File "/project/.meltano/utilities/dagster/venv/lib/python3.10/site-packages/dagster/_core/execution/plan/utils.py", line 54, in op_execution_error_boundary
        yield
      File "/project/.meltano/utilities/dagster/venv/lib/python3.10/site-packages/dagster/_utils/__init__.py", line 443, in iterate_with_context
        next_output = next(iterator)
      File "/project/.meltano/utilities/dagster/venv/lib/python3.10/site-packages/dagster_dbt/asset_defs.py", line 290, in _dbt_op
        dbt_resource.remove_run_results_json()
      File "/project/.meltano/utilities/dagster/venv/lib/python3.10/site-packages/dagster_dbt/cli/resources.py", line 462, in remove_run_results_json
        project_dir = kwargs.get("project_dir", self.default_flags["project-dir"])
      File "/project/.meltano/utilities/dagster/venv/lib/python3.10/site-packages/dagster_dbt/cli/resources.py", line 176, in default_flags
        return self._format_params(self._default_flags, replace_underscores=True)
      File "/project/.meltano/utilities/dagster/venv/lib/python3.10/site-packages/dagster_dbt/dbt_resource.py", line 33, in _format_params
        flags = {k.replace("_", "-"): v for k, v in flags.items() if v is not None}
    Has anyone run into this error before with dagster-dbt?
    a
    • 2
    • 15
  • s

    sean_han

    05/09/2024, 10:32 PM
    I am trying to use a second dbt-postgres utility to transform the data in a different database, but I get the following error:
    Copy code
    2024-05-09T22:24:59.363374Z [info     ] Environment 'local' is active
    Extension executing `dbt clean`...
    Traceback (most recent call last):
      File "/Users/seanhan/Workspaces/data_platform/meltano/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/meltano/edk/extension.py", line 122, in pass_through_invoker
        self.pre_invoke(None, *command_args)
      File "/Users/seanhan/Workspaces/data_platform/meltano/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/dbt_ext/extension.py", line 71, in pre_invoke
        self.dbt_invoker.run_and_log("clean")
      File "/Users/seanhan/Workspaces/data_platform/meltano/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/meltano/edk/process.py", line 172, in run_and_log
        result = asyncio.run(self._exec(sub_command, *args))
      File "/Users/seanhan/.asdf/installs/python/3.9.15/lib/python3.9/asyncio/runners.py", line 44, in run
        return loop.run_until_complete(main)
      File "/Users/seanhan/.asdf/installs/python/3.9.15/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
        return future.result()
      File "/Users/seanhan/Workspaces/data_platform/meltano/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/meltano/edk/process.py", line 126, in _exec
        p = await asyncio.create_subprocess_exec(
      File "/Users/seanhan/.asdf/installs/python/3.9.15/lib/python3.9/asyncio/subprocess.py", line 236, in create_subprocess_exec
        transport, protocol = await loop.subprocess_exec(
      File "/Users/seanhan/.asdf/installs/python/3.9.15/lib/python3.9/asyncio/base_events.py", line 1676, in subprocess_exec
        transport = await self._make_subprocess_transport(
      File "/Users/seanhan/.asdf/installs/python/3.9.15/lib/python3.9/asyncio/unix_events.py", line 197, in _make_subprocess_transport
        transp = _UnixSubprocessTransport(self, protocol, args, shell,
      File "/Users/seanhan/.asdf/installs/python/3.9.15/lib/python3.9/asyncio/base_subprocess.py", line 36, in __init__
        self._start(args=args, shell=shell, stdin=stdin, stdout=stdout,
      File "/Users/seanhan/.asdf/installs/python/3.9.15/lib/python3.9/asyncio/unix_events.py", line 789, in _start
        self._proc = subprocess.Popen(
      File "/Users/seanhan/.asdf/installs/python/3.9.15/lib/python3.9/subprocess.py", line 951, in __init__
        self._execute_child(args, executable, preexec_fn, close_fds,
      File "/Users/seanhan/.asdf/installs/python/3.9.15/lib/python3.9/subprocess.py", line 1821, in _execute_child
        raise child_exception_type(errno_num, err_msg, err_filename)
    FileNotFoundError: [Errno 2] No such file or directory: 'dbt'
    pre_invoke failed with uncaught exception, please report to maintainer
    Here is my meltano.yml:
    Copy code
    utilities:
      - name: dbt-postgres
        variant: dbt-labs
    pip_url: dbt-core~=1.3.0 dbt-postgres~=1.3.0 git+https://github.com/meltano/dbt-ext.git@main
        config:
          host: ${APP_DB_HOST}
          user: ${APP_DB_USER_NAME}
          port: 5432
          dbname: ${APP_DB_NAME}
          schema: public
      - name: dbt-postgres-second
        inherit_from: dbt-postgres
        config:
          host: ${SECOND_DB_HOST}
          user: ${SECOND_DB_USER_NAME}
          port: 5432
          dbname: ${SECOND_DB_NAME}
          schema: analytics
    The first one is working fine, but the second one fails. Does anyone know how to fix the issue?
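    A hedged first check: FileNotFoundError: 'dbt' means the extension could not find a dbt executable in the virtualenv it runs from, so it is worth confirming the inherited plugin's install actually completed and that dbt landed in the venv, e.g.:
    meltano install utility dbt-postgres-second
    ls .meltano/utilities/dbt-postgres/venv/bin/ | grep dbt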
    a
    • 2
    • 6
  • i

    Ian OLeary

    05/23/2024, 2:35 PM
    Copy code
    dagster_dbt.errors.DagsterDbtCliRuntimeError: The dbt CLI process with command
    
    `dbt build --select fqn:*`
    
    failed with exit code `2`. Check the stdout in the Dagster compute logs for the full information about the error, or view the dbt debug log: /project/transform/target/dbt_project_assets-4a67f79-32186f0/dbt.log.
    
    Errors parsed from dbt logs:
    
    Encountered an error:
    'dbt_snowflake://macros/apply_grants.sql'
    I'm getting this error trying to materialize my dbt assets within Dagster. Has anyone run into this issue before? I can post the stack trace in the thread if needed. Side note: I'm able to run the
    meltano invoke dbt-snowflake:run
    command successfully inside the container after doing a
    docker exec -it <container> /bin/bash
    a
    • 2
    • 4
  • j

    Jean-Philippe

    06/18/2024, 8:29 AM
    Hello, I can't install the dbt-postgres utility:
    Copy code
    Utility 'dbt-postgres' could not be installed: Failed to install plugin 'dbt-postgres'.
    Collecting git+https://github.com/meltano/dbt-ext.git@main
      Cloning https://github.com/meltano/dbt-ext.git (to revision main) to /tmp/pip-req-build-bxr10h5s
      Running command git clone --filter=blob:none --quiet https://github.com/meltano/dbt-ext.git /tmp/pip-req-build-bxr10h5s
      Resolved https://github.com/meltano/dbt-ext.git to commit 2e55a1e00bf3611978a5253fef92ef62b85788fe
      Installing build dependencies: started
      Installing build dependencies: finished with status 'done'
      Getting requirements to build wheel: started
      Getting requirements to build wheel: finished with status 'done'
      Preparing metadata (pyproject.toml): started
      Preparing metadata (pyproject.toml): finished with status 'done'
    Collecting dbt-core
      Downloading dbt_core-1.8.2-py3-none-any.whl.metadata (3.9 kB)
    Collecting dbt-postgres
      Downloading dbt_postgres-1.8.1-py3-none-any.whl.metadata (3.4 kB)
    Collecting agate<1.10,>=1.7.0 (from dbt-core)
      Downloading agate-1.9.1-py2.py3-none-any.whl.metadata (3.2 kB)
    Collecting Jinja2<4,>=3.1.3 (from dbt-core)
      Downloading jinja2-3.1.4-py3-none-any.whl.metadata (2.6 kB)
    Collecting mashumaro<4.0,>=3.9 (from mashumaro[msgpack]<4.0,>=3.9->dbt-core)
      Downloading mashumaro-3.13.1-py3-none-any.whl.metadata (114 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 114.3/114.3 kB 6.1 MB/s eta 0:00:00
    Collecting logbook<1.6,>=1.5 (from dbt-core)
      Downloading Logbook-1.5.3.tar.gz (85 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 85.8/85.8 kB 98.3 MB/s eta 0:00:00
      Preparing metadata (setup.py): started
      Preparing metadata (setup.py): finished with status 'done'
    Collecting click<9.0,>=8.0.2 (from dbt-core)
      Downloading click-8.1.7-py3-none-any.whl.metadata (3.0 kB)
    Collecting networkx<4.0,>=2.3 (from dbt-core)
      Downloading networkx-3.2.1-py3-none-any.whl.metadata (5.2 kB)
    Collecting protobuf<5,>=4.0.0 (from dbt-core)
      Downloading protobuf-4.25.3-cp37-abi3-manylinux2014_x86_64.whl.metadata (541 bytes)
    Collecting requests<3.0.0 (from dbt-core)
      Downloading requests-2.32.3-py3-none-any.whl.metadata (4.6 kB)
    Collecting pathspec<0.13,>=0.9 (from dbt-core)
      Downloading pathspec-0.12.1-py3-none-any.whl.metadata (21 kB)
    Collecting sqlparse<0.6.0,>=0.5.0 (from dbt-core)
      Downloading sqlparse-0.5.0-py3-none-any.whl.metadata (3.9 kB)
    Collecting dbt-extractor<=0.6,>=0.5.0 (from dbt-core)
      Downloading dbt_extractor-0.5.1-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.2 kB)
    Collecting minimal-snowplow-tracker<0.1,>=0.0.2 (from dbt-core)
      Downloading minimal-snowplow-tracker-0.0.2.tar.gz (12 kB)
      Preparing metadata (setup.py): started
      Preparing metadata (setup.py): finished with status 'done'
    Collecting dbt-semantic-interfaces<0.6,>=0.5.1 (from dbt-core)
      Downloading dbt_semantic_interfaces-0.5.1-py3-none-any.whl.metadata (2.6 kB)
    Collecting dbt-common<2.0,>=1.0.4 (from dbt-core)
      Downloading dbt_common-1.3.0-py3-none-any.whl.metadata (5.3 kB)
    Collecting dbt-adapters<2.0,>=1.1.1 (from dbt-core)
      Downloading dbt_adapters-1.2.1-py3-none-any.whl.metadata (2.5 kB)
    Collecting packaging>20.9 (from dbt-core)
      Downloading packaging-24.1-py3-none-any.whl.metadata (3.2 kB)
    Collecting pytz>=2015.7 (from dbt-core)
      Downloading pytz-2024.1-py2.py3-none-any.whl.metadata (22 kB)
    Collecting pyyaml>=6.0 (from dbt-core)
      Downloading PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)
    Collecting daff>=1.3.46 (from dbt-core)
      Downloading daff-1.3.46.tar.gz (149 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 149.8/149.8 kB 61.3 MB/s eta 0:00:00
      Preparing metadata (setup.py): started
      Preparing metadata (setup.py): finished with status 'done'
    Collecting typing-extensions>=4.4 (from dbt-core)
      Downloading typing_extensions-4.12.2-py3-none-any.whl.metadata (3.0 kB)
    Collecting psycopg2<3.0,>=2.9 (from dbt-postgres)
      Downloading psycopg2-2.9.9.tar.gz (384 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 384.9/384.9 kB 49.7 MB/s eta 0:00:00
      Preparing metadata (setup.py): started
      error: subprocess-exited-with-error
      
      × python setup.py egg_info did not run successfully.
      │ exit code: 1
      ╰─> See above for output.
      
      note: This error originates from a subprocess, and is likely not a problem with pip.
      Preparing metadata (setup.py): finished with status 'error'
    error: metadata-generation-failed
    
    × Encountered error while generating package metadata.
    ╰─> See above for output.
    
    note: This is an issue with the package mentioned above, not pip.
    hint: See above for details.
    
    Need help fixing this problem? Visit http://melta.no/ for troubleshooting steps, or to
    join our friendly Slack community.
    
    Failed to install plugin(s)
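    A hedged sketch of the usual fix: a psycopg2 egg_info failure typically means pg_config and a C toolchain are missing, because psycopg2 builds from source. The package names below are for Debian/Ubuntu images and may differ on other systems.
    apt-get update && apt-get install -y build-essential libpq-dev python3-dev
    meltano install utility dbt-postgres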
    r
    e
    • 3
    • 15
  • s

    Sandhje

    06/19/2024, 1:41 PM
    Hi, I have a small question... I'm new to Meltano and trying to set up an ELT pipeline to extract data from Snowflake, load it into an Azure-hosted MSSQL database, and then do some transformations on it using dbt. How can I best run my dbt transformations from Meltano, since there is no dbt-mssql utility?
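    A hedged sketch of one option: wrap the community dbt-sqlserver adapter as a custom utility in meltano.yml and invoke dbt directly. The plugin name, namespace, and package list are assumptions to verify.
    utilities:
      - name: dbt-mssql            # hypothetical name
        namespace: dbt_mssql
        pip_url: dbt-core dbt-sqlserver
        executable: dbt
    It could then be invoked with something like meltano invoke dbt-mssql run, pointing dbt at your project and profiles directories.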
  • e

    Edgar Ramírez (Arch.dev)

    07/29/2024, 7:00 PM
    This is probably more applicable for the dbt Slack community but I'm going to try here anyways. I'm confused about the models not needing to be declared in the
    .yml
    with schemas etc. Seems that if I just drop my
    .sql
    files in the
    models
    folder the views are created. So far so good. I want to integrate it with Metabase, and it seems
    dbt-metabase
    needs the models to be properly declared in the YAML files... But I wonder: why introduce this source of inconsistency? My `SELECT`s may very well not have the same schema as whatever I declare in
    models.yml
    . To me, the query should be enough of a declaration in itself, shouldn't it?
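    For reference, a hedged, minimal example of what dbt-metabase-style tooling tends to expect: the .sql alone is enough for dbt to build the view, and the YAML only layers metadata (and optional tests) on top. The model and column names below are placeholders.
    # models/schema.yml
    version: 2
    models:
      - name: my_model          # must match my_model.sql
        description: "What this view contains"
        columns:
          - name: id
            description: "Primary key"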
  • m

    Mindaugas Nižauskas

    08/21/2024, 11:57 AM
    Hi, help me understand this better. If https://github.com/potloc/elementary-ext has been archived, does that mean it is no longer actively maintained, and therefore that we should update its status in https://github.com/meltano/hub/blob/main/_data/meltano/utilities/elementary/elementary.yml?
    e
    • 2
    • 1
  • i

    Ian OLeary

    09/09/2024, 2:04 PM
    Hello, I seem to be getting this error:
    Encountered an error: Parsing Error Env var required but not provided: 'DBT_SNOWFLAKE_ROLE'
    only when running in Docker. I don't have the env var set in my .env, but I do have the role set in my dbt-snowflake config. Do I still need the env var set anyway? I'm able to invoke dbt-snowflake:run locally, so I'm confused why it fails once I run it in Docker.
    Copy code
    environments:
    - name: dev
      config:
        plugins:
          loaders:
          - name: target-snowflake
            config:
              database: DEV
          utilities:
          - name: dbt-snowflake
            config:
              database: DEV
    plugins:
      utilities:
      - name: dbt-snowflake
        variant: dbt-labs
    pip_url: dbt-core==1.8.0 dbt-snowflake==1.8.0 git+https://github.com/meltano/dbt-ext.git@v0.1.0
        config:
          role: ACCOUNTADMIN
          schema: DBO
          warehouse: MELTANO_WH
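    A hedged workaround sketch: make the variable the generated profiles.yml expects explicit so it is also present inside the container, for example via the environment's env block (value taken from the config above):
    environments:
    - name: dev
      env:
        DBT_SNOWFLAKE_ROLE: ACCOUNTADMIN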
    e
    • 2
    • 2
  • j

    James Stratford

    09/18/2024, 2:54 AM
    Can a Meltano project have multiple dbt projects in /transform? I ask because the current structure I'm going with is multiple taps per container, as we keep the code and config together on a per-client basis. I'm not very experienced with dbt, but my idea is to have separate repos for the models for each of our taps and clone those into /transform.
    v
    • 2
    • 1
  • h

    Hoang Minh Bui

    09/18/2024, 4:28 PM
    Hi team, when syncing my table from MySQL to Snowflake, I got this error,
    sqlalchemy.exc.ProgrammingError: (snowflake.connector.errors.ProgrammingError) 100069 (22P02): 01b71bf6-0002-e150-0000-026162eef2b6: Error parsing JSON: document is too large, max size 16777216 bytes
    . Is there a way for me to skip this record and move on to the other rows? Thanks in advance for the help.
    e
    • 2
    • 4
  • d

    dylan_just

    10/17/2024, 11:56 AM
    Hi all. I noticed Meltano Hub doesn't have a transformer or utility for dbt with clickhouse. How easy is it to add this?
    v
    e
    h
    • 4
    • 5
  • j

    Johan Vrolix

    10/19/2024, 11:11 AM
    Can anyone help with this?
    Copy code
    meltano invoke dbt-postgres deps                            
    2024-10-19T11:07:41.326402Z [info     ] Environment 'dev' is active   
    Extension executing `dbt clean`...
    11:07:42  Running with dbt=1.8.7
    11:07:42  Encountered an error:
    Runtime Error
      dbt will not clean the following directories outside the project: ['/meltano/demo-dwh/.meltano/transformers/dbt/target']
    
    
    error invoking dbt clean       error_message=pre invoke step of `dbt clean` failed returncode=2
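    A hedged sketch of the dbt-side setting involved, assuming dbt >= 1.6/1.8: clean_project_files_only controls whether dbt clean may touch paths outside the project root (such as the .meltano/transformers/dbt/target path above); alternatively, drop that path from clean-targets in dbt_project.yml.
    # dbt_project.yml (excerpt)
    flags:
      clean_project_files_only: false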
    e
    • 2
    • 1
  • m

    Marc Panzirsch

    10/22/2024, 2:20 PM
    Hi all, with the help of the Slack channel I stumbled upon some multi-purpose taps: https://hub.meltano.com/extractors/tap-rest-api-msdk https://hub.meltano.com/extractors/tap-spreadsheets-anywhere Are there any other multi-purpose plugins one should know about?
    r
    • 2
    • 2
  • a

    Alexander Shabunevich

    10/24/2024, 11:29 AM
    Hi, I'm wondering if someone is going to add a dbt-fabric utility to MeltanoHub?
  • j

    Johan Vrolix

    10/24/2024, 11:42 AM
    Can you configure the dbt transformer somehow so that it doesn't prepend the schema set in meltano.yml to every schema?
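    A hedged sketch of the standard dbt-side override, assuming the prefix comes from dbt's default generate_schema_name (which concatenates the target schema configured in meltano.yml with any custom schema) rather than from Meltano itself: overriding the macro makes dbt use the custom schema as-is.
    -- macros/generate_schema_name.sql
    {% macro generate_schema_name(custom_schema_name, node) -%}
        {%- if custom_schema_name is none -%}
            {{ target.schema }}
        {%- else -%}
            {{ custom_schema_name | trim }}
        {%- endif -%}
    {%- endmacro %}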
    e
    m
    • 3
    • 12
  • a

    Andres Felipe Huertas Suarez

    11/27/2024, 7:26 AM
    Hi all, I have a question about `dbt-duckdb`: I have my Meltano environment where I pull data from a tap and use the
    target-parquet
    to save it as parquet files, and all is well there. Then I want to bring dbt into the process, so I add it to the project (I'm new to dbt as well) and create my sources file:
    Copy code
    sources:
      - name: awin-local
        meta:
          external_location: "{{ env_var('OUTPUT_PARQUET', 'data') }}/{name}-20241126_145148-0-0.gz.parquet"
        tables:
          - name: transactions
    The OUTPUT_PARQUET value comes from meltano.yml (I hope), so I expect this external source to point to my parquet file; I want to run queries against it using dbt:
    Copy code
    environments:
    - name: dev
      env:
        OUTPUT_PARQUET: $MELTANO_PROJECT_ROOT/output/awin-parquet/transactions/
    - name: staging
    - name: prod
    Now I try a simple model/all_data.sql
    Copy code
    with trns as (
        select networkFee__amount from {{ source('awin-local', 'transactions') }}
    )
    
    select * from trns
    and then I compile (
    uvx meltano invoke dbt-duckdb compile
    ) to see the SQL and I get the following query
    Copy code
    with trns as (
        select networkFee__amount from "main"."awin-local"."transactions"
    )
    
    select * from trns
    and this fails to run (
    invoke dbt-duckdb run
    ). What I'm expecting to see is something more like
    Copy code
    with trns as (
        select networkFee__amount from "$MELTANO_PROJECT_ROOT/output/awin-parquet/transactions/{name}-20241126_145148-0-0.gz.parquet"
    )
    
    select * from trns
    Based on my definition of the source, that's what should let me run queries against my parquet file. But I can't get it to work, and I've been unable to figure it out. Any ideas? Thank you very much for your help 🙂
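    One hedged way to sanity-check the path outside the source machinery: query the files directly with DuckDB's read_parquet from the model itself. The glob below is an assumption based on the OUTPUT_PARQUET value above.
    with trns as (
        -- bypasses the source definition; path/glob is illustrative
        select networkFee__amount
        from read_parquet('output/awin-parquet/transactions/*.parquet')
    )

    select * from trns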
    e
    • 2
    • 5
  • r

    Ruddy Gunawan

    12/02/2024, 11:36 PM
    Hello everybody, I am struggling to run
    tap-outbrain
    . According to the documentation on the Meltano Hub web page, I'm supposed to configure it with this command:
    Copy code
    meltano config tap-outbrain set --interactive
    I set up everything, except access_token (since it's optional). But then, when I try to test it with this command:
    Copy code
    meltano config tap-outbrain test
    I got this error message:
    Copy code
    Plugin configuration is invalid
    Catalog discovery failed: command ['/app/meltano/outbrain-1/.meltano/extractors/tap-outbrain/venv/bin/tap-outbrain', '--config', '/app/meltano/outbrain-1/.meltano/run/tap-outbrain/tap.bed32650-3905-46c1-ae64-8172c5783a5f.config.json', '--discover'] returned 2 with stderr:
     usage: tap-outbrain [-h] -c CONFIG [-s STATE]
    tap-outbrain: error: unrecognized arguments: --discover
    This is the generated meltano.yml file after setting up the configuration (project id and username below are modified, everything else is just copy-pasted directly):
    Copy code
    version: 1
    default_environment: dev
    project_id: [generated project id here]
    environments:
    - name: dev
    - name: staging
    - name: prod
    plugins:
      extractors:
      - name: tap-outbrain
        variant: dbt-labs
    pip_url: git+https://github.com/dbt-labs/tap-outbrain.git
        config:
          start_date: '2024-08-08'
          username: [i set my username via the above command earlier]
    I wonder what went wrong. I used the same steps for
    tap-taboola
    and they worked normally, but I'm struggling to get past this test for
    tap-outbrain
    . Also, I had to change my Dockerfile to use Python 3.8; I can't use Python 3.10 with
    tap-outbrain
    , otherwise I get some library issues.
    e
    • 2
    • 2
  • i

    Ian OLeary

    12/10/2024, 5:50 PM
    My local dbt-snowflake install keeps getting hung up, but it installs fine in my container. What should I do?
    e
    • 2
    • 46
  • j

    jacob_mulligan

    12/17/2024, 1:16 PM
    Is there a "target" folder somewhere for dbt models, where you can see the actual compiled code being run against Snowflake? Not sure if it exists or if Meltano handles things differently than a standard dbt deployment.
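    A hedged place to look, assuming the dbt-ext defaults mentioned earlier in this channel: dbt writes compiled SQL under its target directory, so something like the following should show it.
    ls .meltano/transformers/dbt/target/compiled/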
    ✅ 1
    v
    • 2
    • 2
  • a

    Abhishek Singh

    12/30/2024, 4:22 PM
    Is there a plugin or utility that can be integrated for dbt-mysql and dbt-mssql? Or do we have to use them as a separate pipeline step?
    v
    • 2
    • 1
  • a

    Abhishek Singh

    12/30/2024, 4:22 PM
    since there is no option for it in the Meltano Hub
  • a

    Abhishek Singh

    01/19/2025, 2:53 AM
    dbt-bigquery has a bug: when running meltano invoke dbt-bigquery:initialize, it creates a profiles.yaml file that is not detected by default when using meltano invoke dbt-bigquery:run. Only after renaming it to profiles.yml does it start to work, and even then the profiles.yml contains the code below:
    Copy code
    config:
      send_anonymous_usage_stats: False
      use_colors: True
    This is no longer supported in the latest version of dbt, and the same settings are already available in the dbt_project.yml file; only after removing the section above does dbt-bigquery start to work.
    ✅ 1
    e
    • 2
    • 2
  • d

    dbname

    03/04/2025, 2:41 AM
    Hello, I am running into an issue when trying to inherit the dbt-postgres utility. My goal is to inherit it so I can change the target database set in the config. Maybe there is a more elegant way to do this, but I feel like this should work:
    Copy code
    plugins:
      utilities:
        - name: dbt-postgres
          variant: dbt-labs
          pip_url: dbt-core dbt-postgres meltano-dbt-ext~=0.3.0
        - name: test-dbt-postgres
          inherit_from: dbt-postgres
          config:
            database: test
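    A hedged first check, given the FileNotFoundError in the follow-up message: confirm that the inherited plugin has been installed and that a dbt executable actually exists in the virtualenv the extension runs from, e.g.:
    meltano install utility test-dbt-postgres
    ls .meltano/utilities/dbt-postgres/venv/bin/ | grep dbt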
    e
    • 2
    • 1
  • d

    dbname

    03/04/2025, 2:41 AM
    Copy code
    meltano invoke test-dbt-postgres:test
    2025-03-04T02:36:04.900102Z [info     ] Environment 'dev' is active
    Extension executing `dbt test`...
    Traceback (most recent call last):
      File "/project/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/meltano/edk/extension.py", line 137, in pass_through_invoker
        self.invoke(None, *command_args)
      File "/project/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/dbt_ext/extension.py", line 102, in invoke
        self.dbt_invoker.run_and_log(command_name, *command_args)
      File "/project/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/meltano/edk/process.py", line 184, in run_and_log
        result = asyncio.run(self._exec(sub_command, *args))
      File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run
        return loop.run_until_complete(main)
      File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
        return future.result()
      File "/project/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/meltano/edk/process.py", line 129, in _exec
        p = await asyncio.create_subprocess_exec(
      File "/usr/local/lib/python3.9/asyncio/subprocess.py", line 236, in create_subprocess_exec
        transport, protocol = await loop.subprocess_exec(
      File "/usr/local/lib/python3.9/asyncio/base_events.py", line 1676, in subprocess_exec
        transport = await self._make_subprocess_transport(
      File "/usr/local/lib/python3.9/asyncio/unix_events.py", line 197, in _make_subprocess_transport
        transp = _UnixSubprocessTransport(self, protocol, args, shell,
      File "/usr/local/lib/python3.9/asyncio/base_subprocess.py", line 36, in __init__
        self._start(args=args, shell=shell, stdin=stdin, stdout=stdout,
      File "/usr/local/lib/python3.9/asyncio/unix_events.py", line 789, in _start
        self._proc = subprocess.Popen(
      File "/usr/local/lib/python3.9/subprocess.py", line 951, in __init__
        self._execute_child(args, executable, preexec_fn, close_fds,
      File "/usr/local/lib/python3.9/subprocess.py", line 1837, in _execute_child
        raise child_exception_type(errno_num, err_msg, err_filename)
    FileNotFoundError: [Errno 2] No such file or directory: 'dbt'
    invoke failed with uncaught exception, please report to maintainer
    Traceback (most recent call last):
      File "/project/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/meltano/edk/extension.py", line 137, in pass_through_invoker
        self.invoke(None, *command_args)
      File "/project/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/dbt_ext/extension.py", line 102, in invoke
        self.dbt_invoker.run_and_log(command_name, *command_args)
      File "/project/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/meltano/edk/process.py", line 184, in run_and_log
        result = asyncio.run(self._exec(sub_command, *args))
      File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run
        return loop.run_until_complete(main)
      File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
        return future.result()
      File "/project/.meltano/utilities/dbt-postgres/venv/lib/python3.9/site-packages/meltano/edk/process.py", line 129, in _exec
        p = await asyncio.create_subprocess_exec(
      File "/usr/local/lib/python3.9/asyncio/subprocess.py", line 236, in create_subprocess_exec
        transport, protocol = await loop.subprocess_exec(
      File "/usr/local/lib/python3.9/asyncio/base_events.py", line 1676, in subprocess_exec
        transport = await self._make_subprocess_transport(
      File "/usr/local/lib/python3.9/asyncio/unix_events.py", line 197, in _make_subprocess_transport
        transp = _UnixSubprocessTransport(self, protocol, args, shell,
      File "/usr/local/lib/python3.9/asyncio/base_subprocess.py", line 36, in __init__
        self._start(args=args, shell=shell, stdin=stdin, stdout=stdout,
      File "/usr/local/lib/python3.9/asyncio/unix_events.py", line 789, in _start
        self._proc = subprocess.Popen(
      File "/usr/local/lib/python3.9/subprocess.py", line 951, in __init__
        self._execute_child(args, executable, preexec_fn, close_fds,
      File "/usr/local/lib/python3.9/subprocess.py", line 1837, in _execute_child
        raise child_exception_type(errno_num, err_msg, err_filename)
    FileNotFoundError: [Errno 2] No such file or directory: 'dbt'
  • s

    Steven Searcy

    05/09/2025, 3:05 PM
    Running into some issues using the
    dbt-postgres
    utility when attempting incremental updates on a table. Maybe this is standard behavior, as I am fairly new here, but it seems to be changing the schema of the table. I am attempting to load data into a table managed outside of Meltano; is this not a good idea?
    ✅ 1
    e
    • 2
    • 9