# troubleshooting

    sumit_singh

    04/02/2025, 5:42 PM
Installing forked versions of a tap and a target... I've recently started getting the warnings below in the log while running
meltano install --clean
Previously I didn't get this error.
    2025-04-02T17:35:09.960498Z [info     ] Environment 'dev' is active
2025-04-02T17:35:10.699018Z [warning  ] Certificate did not match expected hostname: sp.meltano.com. Certificate: {'subject': ((('commonName', '*.ops.snowcatcloud.com'),),), 'issuer': ((('countryName', 'US'),), (('organizationName', 'Amazon'),), (('commonName', 'Amazon RSA 2048 M02'),)), 'version': 3, 'serialNumber': 'xxxxxxxxxxx', 'notBefore': 'Mar  7 00:00:00 2025 GMT', 'notAfter': 'Apr  5 23:59:59 2026 GMT', 'subjectAltName': (('DNS', '*.ops.snowcatcloud.com'),), 'OCSP': ('http://ocsp.r2m02.amazontrust.com',), 'caIssuers': ('http://crt.r2m02.amazontrust.com/r2m02.cer',), 'crlDistributionPoints': ('http://crl.r2m02.amazontrust.com/r2m02.crl',)}
    Installing 4 plugins...
2025-04-02T17:35:11.553489Z [warning  ] Certificate did not match expected hostname: sp.meltano.com. Certificate: {'subject': ((('commonName', '*.ops.snowcatcloud.com'),),), 'issuer': ((('countryName', 'US'),), (('organizationName', 'Amazon'),), (('commonName', 'Amazon RSA 2048 M02'),)), 'version': 3, 'serialNumber': 'xxxxxxxxxxxxxx', 'notBefore': 'Mar  7 00:00:00 2025 GMT', 'notAfter': 'Apr  5 23:59:59 2026 GMT', 'subjectAltName': (('DNS', '*.ops.snowcatcloud.com'),), 'OCSP': ('http://ocsp.r2m02.amazontrust.com',), 'caIssuers': ('http://crt.r2m02.amazontrust.com/r2m02.cer',), 'crlDistributionPoints': ('http://crl.r2m02.amazontrust.com/r2m02.crl',)}
    Installing extractor 'tap-mysql'...
    Installing loader 'target-snowflake'...
    Installing orchestrator 'airflow'...
    Installing file bundle 'files-airflow'...

    Michal Ondras

    04/08/2025, 10:09 PM
Hi, what is this?
    "progress_markers": {"Note": "Progress is not resumable if interrupted.",...

    Chandana S

    04/09/2025, 10:03 AM
Hi all, I added tap-googleads as an extractor and I see that it needs OAuth credentials for authentication. I have used the service-account approach, which provides a credentials.json file. I wanted to know if there is a way to use this for authentication instead of OAuth. I know JSON-file authentication is available for Google Analytics, where you can specify the location of the file. Is something similar available for Google Ads too?

    Buddy Ruddy

    04/10/2025, 5:55 PM
Hello, I'm struggling to understand how to incorporate the Meltano DAG into an existing Airflow installation. It feels as though Meltano doesn't give much thought to integrating with Airflow (or other orchestrators) in a managed or k8s environment. I'm trying to accomplish this without writing a custom DAG, since Meltano can generate an Airflow one, but that seems to come with the condition that your Airflow and Meltano binaries live on the same host. Is there something I'm missing about the generator, or is my hunch correct that this isn't really intended to run as separate components? For context, we are running this:
meltano add files files-airflow
Then when I try to add the DAG to Airflow, it fails with a DAG import error:
    Traceback (most recent call last):
      File "/usr/local/lib/python3.11/subprocess.py", line 1026, in __init__
        self._execute_child(args, executable, preexec_fn, close_fds,
      File "/usr/local/lib/python3.11/subprocess.py", line 1955, in _execute_child
        raise child_exception_type(errno_num, err_msg, err_filename)
    FileNotFoundError: [Errno 2] No such file or directory: 'meltano'

    Buddy Ruddy

    04/10/2025, 6:02 PM
I think this is because Meltano uses Python code to generate DAGs directly inside Airflow? Is that correct?

    Buddy Ruddy

    04/10/2025, 6:04 PM
However, you can also create your own Airflow DAGs for any pipeline you fancy by using `BashOperator` with the `meltano elt` command, or `DockerOperator` with a project-specific Docker image.
I think THIS is the part I'm missing, but I still don't quite understand it.
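For what it's worth, a minimal sketch of that documented approach, assuming Airflow 2.4+ and a worker where the meltano binary is on PATH; the project path, DAG id, and plugin names here are hypothetical. The generated DAG from files-airflow shells out to the meltano CLI (which is what produces the FileNotFoundError above), whereas a hand-written DAG like this keeps the scheduler/webserver free of that dependency:
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical location of the Meltano project on the Airflow worker.
PROJECT_DIR = "/opt/meltano/project"

with DAG(
    dag_id="meltano_example",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Shells out to the meltano CLI on the worker; in a k8s setup this
    # would become a DockerOperator / KubernetesPodOperator pointing at
    # a project-specific Meltano image instead.
    BashOperator(
        task_id="tap_mysql_to_target_snowflake",
        bash_command="meltano run tap-mysql target-snowflake",
        cwd=PROJECT_DIR,
        env={"MELTANO_ENVIRONMENT": "dev"},
        append_env=True,
    )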

    Buddy Ruddy

    04/10/2025, 6:22 PM
I feel like I'm very close, but it's mainly the DAG import that is the issue. The Airflow web server won't have the meltano binary, so when it imports the Meltano DAG generator it simply fails. I'm expecting my setup to schedule my DAG on a pod that leverages a Meltano image to run the task, so the fact that this setup uses the meltano CLI to create the schedules feels a bit short-sighted. Too much dependency on the meltano CLI, imo.

    Buddy Ruddy

    04/10/2025, 8:05 PM
At this point I'm trying to add my Meltano image as a sidecar to the Airflow deployment, hoping to be able to map the binary into it.

    Buddy Ruddy

    04/10/2025, 9:10 PM
Yeah, that becomes far more complicated than I want or need this to be. Why is Meltano so self-centered?

    Suyash Muley

    04/14/2025, 5:49 AM
Hello everyone, I've been trying out a Meltano extractor for MSSQL by forking an existing one, and I seem to be facing an issue while connecting to the DB:
    2025-04-14T05:40:34.883063Z [debug    ] Meltano 3.6.0, Python 3.10.9, Windows (AMD64)
    2025-04-14T05:40:34.887124Z [debug    ] Looking up time zone info from registry
    2025-04-14T05:40:34.892488Z [info     ] Environment 'test' is active  
    2025-04-14T05:40:34.910936Z [debug    ] Creating DB engine for project at 'C:\\Users\\Lenovo\\Desktop\\GDC\\Tap\\POC' with DB URI 'sqlite:/C:\\Users\\Lenovo\\Desktop\\GDC\\Tap\\POC\\.meltano/meltano.db'
    2025-04-14T05:40:35.020965Z [debug    ] Variable '$P' is not set in the provided env dictionary.
    2025-04-14T05:40:35.020965Z [debug    ] Variable '$G' is not set in the provided env dictionary.
    2025-04-14T05:40:35.020965Z [debug    ] Variable '$P' is not set in the provided env dictionary.
    2025-04-14T05:40:35.021974Z [debug    ] Variable '$G' is not set in the provided env dictionary.
    2025-04-14T05:40:35.021974Z [debug    ] Variable '$P' is not set in the provided env dictionary.
    2025-04-14T05:40:35.021974Z [debug    ] Variable '$G' is not set in the provided env dictionary.
    2025-04-14T05:40:35.022965Z [debug    ] Variable '$P' is not set in the provided env dictionary.
    2025-04-14T05:40:35.022965Z [debug    ] Variable '$G' is not set in the provided env dictionary.
    2025-04-14T05:40:35.103032Z [debug    ] Skipped installing extractor 'poc'
    2025-04-14T05:40:35.104023Z [debug    ] Skipped installing 1/1 plugins
    2025-04-14T05:40:35.199362Z [debug    ] Created configuration at C:\Users\Lenovo\Desktop\GDC\Tap\POC\.meltano\run\poc\tap.67806009-f627-42b6-af6d-488d63f7a914.config.json
    2025-04-14T05:40:35.200261Z [debug    ] Could not find tap.properties.json in C:\Users\Lenovo\Desktop\GDC\POC\.meltano\extractors\poc\tap.properties.json, skipping.
    2025-04-14T05:40:35.201309Z [debug    ] Could not find tap.properties.cache_key in C:\Users\Lenovo\Desktop\GDC\POC\.meltano\extractors\poc\tap.properties.cache_key, skipping.
    2025-04-14T05:40:35.201309Z [debug    ] Could not find state.json in C:\Users\Lenovo\Desktop\GDC\Tap\POC\.meltano\extractors\poc\state.json, skipping.
    2025-04-14T05:40:35.203359Z [debug    ] Variable '$P' is not set in the provided env dictionary.
    2025-04-14T05:40:35.203359Z [debug    ] Variable '$G' is not set in the provided env dictionary.
    2025-04-14T05:40:35.203359Z [debug    ] Variable '$P' is not set in the provided env dictionary.
    2025-04-14T05:40:35.203359Z [debug    ] Variable '$G' is not set in the provided env dictionary.
    2025-04-14T05:40:35.204261Z [debug    ] Variable '$P' is not set in the provided env dictionary.
    2025-04-14T05:40:35.204261Z [debug    ] Variable '$G' is not set in the provided env dictionary.
    2025-04-14T05:40:35.204261Z [debug    ] Variable '$P' is not set in the provided env dictionary.
    2025-04-14T05:40:35.204261Z [debug    ] Variable '$G' is not set in the provided env dictionary.
    I have added Host, User, Password, DB, Port to the meltano.yml file.

    Suyash Musale

    04/14/2025, 3:59 PM
Hello everyone! I have built a tap for a REST-type stream with API authentication. I created a job with the default MSSQL target and added the credentials for it. When the job runs, I can see the data streams on the console and the connection is established, but no records are transferred. Here is what the console displays:
2025-04-14 09:23:39,166 - hotglue - DEBUG - Export hierarchy not set. Using original order
2025-04-14 09:23:39,166 - hotglue - INFO - Fetching available entities
2025-04-14 09:23:39,682 - hotglue - DEBUG - Available entity found: None
2025-04-14 09:23:39,683 - hotglue - DEBUG - Listing folders [hotglue-prod.admin] /env_zips/tap-csv
2025-04-14 09:23:39,803 - hotglue - DEBUG - Downloading: hotglue-prod.admin:env_zips/tap-csv/tap_csv-v0_1_7.py3_10.zip
2025-04-14 09:23:39,949 - hotglue - INFO - Unzipping file tap-csv.zip
2025-04-14 09:23:40,936 - hotglue - INFO - Running Subprocess: [ /home/envs/tap-csv/bin/tap-csv --config csv-config.json ] with path [ ['/home', '/usr/local/lib/python310.zip', '/usr/local/lib/python3.10', '/usr/local/lib/python3.10/lib-dynload', '/usr/local/lib/python3.10/site-packages'] ]
2025-04-14 09:23:41,305 - hotglue.export - DEBUG - INFO Starting sync
2025-04-14 09:23:41,305 - hotglue.export - DEBUG - INFO Sync completed
2025-04-14 09:23:41,326 - hotglue - INFO - Running Subprocess: [ source /home/envs/target/bin/activate && cat streams.json | target-mssqltarget --config config.json > target_state.json ] with path [ ['/home', '/usr/local/lib/python310.zip', '/usr/local/lib/python3.10', '/usr/local/lib/python3.10/lib-dynload', '/usr/local/lib/python3.10/site-packages'] ]
2025-04-14 09:23:42,938 - hotglue - INFO - (update_metadata) Started
2025-04-14 09:23:43,028 - hotglue - INFO - (update_resources_usage_script) Started
2025-04-14 09:23:43,311 - hotglue.export - DEBUG - 2025-04-14 09:23:43,310 | INFO     | target-mssqltarget   | Target 'target-mssqltarget' is listening for input from tap.
2025-04-14 09:23:43,311 - hotglue.export - DEBUG - 2025-04-14 09:23:43,311 | INFO     | target-mssqltarget   | Target 'target-mssqltarget' completed reading 0 lines of input (0 schemas, 0 records, 0 batch manifests, 0 state messages).
2025-04-14 09:23:43,312 - hotglue.export - DEBUG - 2025-04-14 09:23:43,312 | INFO     | target-mssqltarget   | Emitting completed target state {}

    Emre Üstündağ

    04/17/2025, 12:11 PM
Hi everyone, a simple question: can I add a new field to a stream before loading it? I already tried the stream_maps config in the tap, as in the screenshot, but I keep getting "...raise MapExpressionError(singer_sdk.exceptions.MapExpressionError: Failed to evaluate simpleeval expressions datetime.datetime.now()." I also tested the example from the SDK documentation (https://sdk.meltano.com/en/latest/stream_maps.html#add-a-property-with-a-string-literal-value), but the issue persists: "... Failed to evaluate simpleeval expressions client-123". There are also other errors in the terminal, like "...raise NameNotDefined(node.id, self.expr) singer_sdk.helpers._simpleeval.NameNotDefined: 'datetime' is not defined for expression 'datetime.datetime.now()'". Did you ever face an issue like this? Is what I'm trying even applicable?
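For reference, a minimal sketch of this kind of stream_maps config in meltano.yml; the tap, stream, and property names below are placeholders. Note the nested quotes that make a value a string literal, and that the datetime module is only exposed to stream-map expressions on reasonably recent singer-sdk versions — on older SDKs it raises exactly the NameNotDefined error quoted above:
plugins:
  extractors:
    - name: tap-example                # placeholder tap name
      config:
        stream_maps:
          customers:                   # placeholder stream name
            customer_type: "'client-123'"                  # nested quotes => string literal
            loaded_at: datetime.datetime.now().isoformat() # needs an SDK that exposes datetime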

    Gordon Klundt

    04/22/2025, 10:27 PM
Having an issue with bookmark comparison on a SQL tap/target.
singer_sdk.exceptions.InvalidStreamSortException: Unsorted data detected in stream. Latest value '2025-04-16T11:54:35.060000+00:00' is smaller than previous max '2025-04-16T11:54:35.063000+00:00'
It looks like an issue with the constraint or the stream sorter not honoring microseconds in the MSSQL datetime datatype.
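If it's useful while debugging: singer-sdk streams expose a check_sorted flag that can opt a stream out of this validation. A minimal sketch, with a hypothetical stream class in the forked tap:
from singer_sdk import SQLStream


class OrdersStream(SQLStream):  # hypothetical stream in the forked tap
    name = "orders"
    replication_key = "updated_at"
    # MSSQL's datetime type has ~3.33 ms precision, so microseconds can
    # come back truncated and break strict ordering; disabling the SDK's
    # sort check avoids InvalidStreamSortException, at the cost of the
    # resumability guarantees that sorted streams provide.
    check_sorted = False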

    alex

    04/24/2025, 4:10 PM
Hey all, I am wondering whether Meltano parallelizes the requests in my tap stream (or makes them async), and if so, whether it is possible to disable this. Context for my question: I am pulling metadata from the REST API of my Tableau server. When accessing some of the (bigger) Tableau sites, my tap fails within 1 or 2 seconds. The logs show that it successfully authenticates and signs in, but shortly afterwards it raises "401002: Unauthorized Access; Invalid authentication credentials were provided". Why do I think this issue might be related to Meltano? Because 1. the same code, credentials, and package versions work outside of Meltano, and 2. when looking into the Tableau API error code I stumbled upon this issue discussing problems with async requests: https://github.com/tableau/server-client-python/issues/1342. This is my code:
    """Stream type classes for tap-tableau-metadata."""
    
    from typing import Any, Dict, Optional, Union, List, Iterable
    from singer_sdk import typing as th
    from singer_sdk.streams import RESTStream
    
    #from tap_tableau.client import TableauStream
    
    import tableauserverclient as TSC
    
    class TableauStream(RESTStream):
        """Tableau stream class."""
    
        url_base = None
    
        def get_server_client(self):
    
            tableau_auth = TSC.PersonalAccessTokenAuth(self.config['personal_access_token_name'], self.config['personal_access_token_secret'], self.config.get('site_url_id'))
            server_client = TSC.Server(self.config['server_url'], self.config['api_version']) if self.config.get('api_version') else TSC.Server(self.config['server_url'], use_server_version=True)
    
            return tableau_auth, server_client
    
    
    class DatasourcesStream(TableauStream):
        name = "datasources"
        primary_keys = ["id"]
        replication_key = None
        schema = th.PropertiesList(
            th.Property("id", th.StringType),
            # ...
        ).to_dict()
    
        def get_records(self, context: Optional[dict]) -> Iterable[dict]:
            """Return a generator of row-type dictionary objects."""
    
            tableau_auth, server_client = self.get_server_client()
    
            with server_client.auth.sign_in(tableau_auth):
                for datasource in TSC.Pager(server_client.datasources):
                    server_client.datasources.populate_connections(datasource)
                    server_client.datasources.populate_permissions(datasource)
                    row = {
                        'id': datasource.id,
                        #...
                    }
                    yield row
Note: the same issue occurs when relying on the Tableau tap from GtheSheep on the Meltano Hub (https://hub.meltano.com/extractors/tap-tableau/). I am running:
    Meltano = 3.6.0 
    Python = 3.11
    tableauserverclient = 0.25 (the latest version 0.37 does not resolve this issue)
    singer-sdk = "~=0.34.0"

    Chad Bell

    04/25/2025, 5:59 PM
Hey there! This is probably a simple fix, but what is the recommended approach for propagating tap schema changes to the target? We are extracting from Cloud SQL and loading into BigQuery. Our tap recognizes new fields added in Cloud SQL, but no new columns are added in BigQuery.
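One hedged suggestion: Meltano caches the discovered catalog between runs, so new source columns may not reach the target until that cache is refreshed. Recent Meltano versions accept a --refresh-catalog flag on meltano run (the plugin names below are placeholders), and the loader also has to support schema evolution for new columns to be added on the destination side:
meltano run --refresh-catalog tap-cloudsql target-bigquery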

    Michal Ondras

    04/26/2025, 10:43 PM
Anybody else getting this certs error on target-snowflake?

    Andy Carter

    04/28/2025, 9:53 AM
Any suggestion on how to handle an API that has date-based pagination using `since` and `until`? Those exist as params to my query, but they are not present in the API response. Is there any way to reliably access the params that were sent from a post-query method like `post_process` or `get_records`? Or would I need to persist those values to instance variables (self.since, self.until, etc.) on each iteration?
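Persisting them on the instance is one workable pattern. A minimal sketch assuming a singer-sdk RESTStream (the stream name, URL, and one-day window are hypothetical), which relies on the SDK issuing a stream's requests sequentially, so the stashed window always matches the response being post-processed:
from datetime import datetime, timedelta

from singer_sdk.streams import RESTStream


class EventsStream(RESTStream):
    """Hypothetical stream for an API paged by ?since= / ?until= dates."""

    name = "events"
    path = "/events"
    url_base = "https://api.example.com"  # placeholder

    def get_url_params(self, context, next_page_token):
        # next_page_token carries the next window's start (a datetime,
        # however your paginator yields it); fall back to start_date.
        since = next_page_token or datetime.fromisoformat(self.config["start_date"])
        until = since + timedelta(days=1)  # window size is illustrative
        # The API never echoes these params back, so stash them on the
        # instance where post-request hooks can read them.
        self._window = {"since": since.isoformat(), "until": until.isoformat()}
        return dict(self._window)

    def post_process(self, row, context=None):
        # Attach the request window that produced this record.
        row.update(self._window)
        return row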

    Michal Ondras

    04/28/2025, 3:30 PM
Hi, as a result of the certs issue over the weekend, I thought I would also upgrade target-snowflake from `v0.12.0` to `v0.16.3`. What unfortunately resulted is the target trying to change VARIANT columns to VARCHAR. Any idea why?

    Ian OLeary

    04/29/2025, 1:40 PM
I randomly started getting this yesterday with Dagster. I didn't change anything in my Dagster `repository.py` or my `meltano.yml`. Was AssetExecutionContext deprecated or something? I tried switching to OpExecutionContext, but then started getting the same recursive error message for `DagsterDbtTranslator`.

    Matt Menzenski

    04/29/2025, 5:22 PM
The Meltano logging documentation includes some examples of "custom" / non-standard loggers, but they both just use standard formatters with no additional dependencies. Is it possible to give Meltano a custom log handler that uses its own additional Python dependency? (And if so, would this be done by just adding an extra dependency to the virtual environment Meltano uses, or is it something that could be packaged as a Meltano "utility" plugin?) (Context: I am trying to update the dagster-meltano library to support Dagster 1.10; more details here.)

    Kevin Phan

    04/29/2025, 7:50 PM
Quick question: I have a custom tap that pulls balance data from Talos, set up as incremental based on an updated-at field from the API itself. When I do the incremental run to Snowflake, I am seeing
    2025-04-29T19:27:10.966335Z [info     ] time=2025-04-29 15:27:10 name=target_snowflake level=INFO message=Loading into talos_meltano_dev."BALANCES": {"inserts": 0, "updates": 8005, "size_bytes": 115101} cmd_type=elb consumer=True job_name=dev:tap-talos-to-target-snowflake-talos name=target-snowflake-talos producer=False run_id=a49d7e97-c5f6-4f9c-8c97-f7e60c7f9d66 stdio=stderr string_id=target-snowflake-talos
The inserts count is 0, which is correct because nothing changed since the last run, but updates is 8005 = the number of rows. It is updating `_sdc_batched_at` and the other _sdc metadata fields. My question is: does Meltano automatically update these metadata columns regardless of whether a record changed or needs to be updated? Technically, if the run is incremental and no data changed in a record, it shouldn't touch it, right?

    Matt Menzenski

    04/29/2025, 11:02 PM
How do I improve Meltano performance in a "fan-out" scenario? I'm specifically interested in tap-mongodb + target-snowflake performance, and I am hoping there is a way to 10x my current throughput. I am using tap-mongodb (private fork) to open a MongoDB change stream against the cluster; this change stream includes data from ~150 databases and ~800 collections, and I need to get this data into one Snowflake table per collection. The single MongoDB change stream (scoped to the entire cluster) is great because it limits impact on the source cluster, keeps the Meltano operations simple, and keeps Meltano state management simple, but it makes this sort of fan-out hard.
Today, we land all cluster data into a single big Snowflake table during the Meltano job run. Once the `meltano run` command completes, we call a Snowflake stored procedure that routes the newly loaded records from this big landing table into the collection-specific tables in database-specific schemas (a record from the payment_service database's `Transaction` collection lands in a payment_service.transaction table in Snowflake), and then truncates the big table. This works, but it means data written during a run doesn't surface in the collection-specific tables (which is where dbt sources are defined) until the job completes. During periods of high platform activity our Meltano jobs take longer, and this delay is problematic. We have already increased our target-snowflake max batch size, changed from MERGE to COPY (append-only writes), and changed from JSONL to Parquet format for stage files. These changes have helped, but not by a ton.
Options we have thought of:
• Split our one change stream into more than one to gain some parallelization:
◦ "database names starting with A-M / database names starting with N-Z" or similar, for a naive two-way split
◦ Key off the first character of a UUID field in the document itself (all our documents have a UUID id) to split the change-stream records into sixteen streams of (approximately) equal size
◦ Define Meltano stream maps for all the database+collection names we know of, so those records can be split into per-collection streams and landed directly in the correct destination table rather than going through the routing procedure, and use the current routing procedure for unhandled (novel) records only. (I think we could do this by splitting records into streams with names like <schema>-<table>, per this; see the sketch after this message.) We wouldn't be able to handle everything dynamically because it's hard, and I'm concerned about the overhead of having hundreds of streams in one job if we went this route.
• Move away from target-snowflake: perhaps write to S3 ourselves (target-s3?) and use Snowpipe to load the data into Snowflake from S3. (We run in AWS GovCloud but our Snowflake account is in AWS Commercial, so I'm wondering if it would be more performant to move the "write to stage" step into GovCloud.)
Any other ideas we should be thinking about? It doesn't seem like there are any good ways to do this 😕
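On the UUID-split idea, a hedged sketch of what that could look like with singer-sdk inline stream maps, where __source__ duplicates a stream and __filter__ prunes it; the stream and field names are hypothetical, and Meltano still runs one target process per job, so real parallelism would mean running these splits as separate jobs:
plugins:
  extractors:
    - name: tap-mongodb
      config:
        stream_maps:
          changes_0_7:                      # hypothetical derived stream
            __source__: changes             # the original change-stream
            __filter__: record['id'][0] in '01234567'
          changes_8_f:
            __source__: changes
            __filter__: record['id'][0] in '89abcdef'
          changes: __NULL__                 # drop the unsplit stream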

    Ashish Bhat

    04/29/2025, 11:28 PM
Is target-s3 installation broken for anyone else? There's some circular dependency issue between these packages: aiobotocore, s3fs, idna.

    Nathan Sooter

    05/02/2025, 5:40 PM
I'm building a pretty large job pulling all records out of various tables in SFDC, doing a full refresh for now. I've looked for this answer in a few places but haven't figured it out.
Expected behavior:
• Batches of rows from Salesforce.Account get appended, so that at the end of the day I get one CSV with every row from the SFDC Account object.
Actual behavior:
• Each batch from Account overwrites the file, so my final .csv output is simply the final batch, and I lose all the other records.
What is the method to have
meltano run tap-salesforce target-csv --full-refresh
give me a CSV with every row for a given object when there are multiple batches?

    Chandana S

    05/07/2025, 8:20 AM
Hi all, I have been using the Shopify extractor for almost 3 months now; I run the tap and target from my Python code based on some triggers and other factors. This code is hosted on AKS (Azure Kubernetes Service), and for the past 2-3 days I have been getting this error:
    2025-05-06T11:49:43.936442Z [info     ] Backing off 8.99 seconds after 3 tries calling function <bound method RESTStream._request of <tap_shopify.streams.AbandonedCheckouts object at 0x7f17392cfb50>> with args (<PreparedRequest [GET]>, None) and kwargs {} cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:49:52.951378Z [info     ] Backing off _request(...) for 16.6s (requests.exceptions.SSLError: HTTPSConnectionPool(host='triumph-sg.myshopify.com', port=443): Max retries exceeded with url: /admin/api/2023-10/checkouts.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)')))) cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:49:52.951885Z [info     ] Backing off 16.64 seconds after 4 tries calling function <bound method RESTStream._request of <tap_shopify.streams.AbandonedCheckouts object at 0x7f17392cfb50>> with args (<PreparedRequest [GET]>, None) and kwargs {} cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.658595Z [info     ] Giving up _request(...) after 5 tries (requests.exceptions.SSLError: HTTPSConnectionPool(host='triumph-sg.myshopify.com', port=443): Max retries exceeded with url: /admin/api/2023-10/checkouts.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)')))) cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.660324Z [info     ] METRIC: {"type": "counter", "metric": "http_request_count", "value": 0, "tags": {"stream": "abandoned_checkouts", "endpoint": "/checkouts.json"}} cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.661527Z [info     ] METRIC: {"type": "timer", "metric": "sync_duration", "value": 33.077449560165405, "tags": {"stream": "abandoned_checkouts", "context": {}, "status": "failed"}} cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.662679Z [info     ] METRIC: {"type": "counter", "metric": "record_count", "value": 0, "tags": {"stream": "abandoned_checkouts", "context": {}}} cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.663754Z [info     ] An unhandled error occurred while syncing 'abandoned_checkouts' cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.664790Z [info     ] Traceback (most recent call last): cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.665754Z [info     ]   File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 716, in urlopen cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.666726Z [info     ]     httplib_response = self._make_request( cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.667769Z [info     ]   File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 404, in _make_request cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.668799Z [info     ]     self._validate_conn(conn)  cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.669838Z [info     ]   File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 1061, in _validate_conn cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.673645Z [info     ]     conn.connect()             cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.674254Z [info     ]   File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/connection.py", line 419, in connect cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.674991Z [info     ]     self.sock = ssl_wrap_socket( cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.675908Z [info     ]   File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/util/ssl_.py", line 458, in ssl_wrap_socket cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.676620Z [info     ]     ssl_sock = _ssl_wrap_socket_impl( cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.677561Z [info     ]   File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/util/ssl_.py", line 502, in _ssl_wrap_socket_impl cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.678446Z [info     ]     return ssl_context.wrap_socket(sock, server_hostname=server_hostname) cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.679288Z [info     ]   File "/usr/local/lib/python3.10/ssl.py", line 513, in wrap_socket cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.679617Z [info     ]     return self.sslsocket_class._create( cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.679994Z [info     ]   File "/usr/local/lib/python3.10/ssl.py", line 1071, in _create cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.680335Z [info     ]     self.do_handshake()        cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.680723Z [info     ]   File "/usr/local/lib/python3.10/ssl.py", line 1342, in do_handshake cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.681206Z [info     ]     self._sslobj.do_handshake() cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
    2025-05-06T11:50:09.681679Z [info     ] ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007) cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
There is a certificate on the HTTPS URL we are using, and I am not sure why this SSL certificate issue is appearing now, since it worked fine for about a month and a half.

    Michal Ondras

    05/08/2025, 1:52 AM
Anybody else getting this error on tap-github?
    singer_sdk.exceptions.FatalAPIError: ("Graphql error: [{'message': 'Something went wrong while executing your query on 2025-05-07T22:26:39Z. Please include `xx....` when reporting this issue.'}]", <Response [200]>)
    thanks

    Kevin Phan

    05/08/2025, 12:29 PM
Hey folks, I'm trying to run a Python script to decrypt columns from Postgres via a mapper, like so:
- name: decrypt_metadata_mapper
  config:
    stream_maps:
      public-user_details:
        decrypted_metadata: >
          __import__('transform.utils.decrypt_metadata', fromlist=['decrypt_metadata']).decrypt_metadata(
            record['metadata_encrypted'], record['salt']
          )
        salt: __NULL__
    Running
    tap-core-db-source decrypt_metadata_mapper target-snowflake-core-db-source
I am getting some import errors. Is this the correct way to do this, or am I doing an antipattern? What is the pattern for running a Python script in flight? I am trying to decrypt some columns between the tap and the target. Is a custom utility a better way?
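Stream-map expressions are evaluated with simpleeval, which generally blocks underscore-prefixed names like __import__, so arbitrary Python usually has to live in its own plugin. A minimal hedged sketch of a standalone Singer mapper (which could be wired up as a custom mapper plugin): it pipes messages stdin-to-stdout and rewrites just the one stream; the import path and stream/field names are taken from the snippet above, and decrypt_metadata is your own function:
#!/usr/bin/env python3
"""Minimal Singer mapper: decrypt one column in flight between tap and target."""
import json
import sys

from transform.utils.decrypt_metadata import decrypt_metadata  # your helper


def main() -> None:
    for line in sys.stdin:
        msg = json.loads(line)
        if msg.get("type") == "RECORD" and msg.get("stream") == "public-user_details":
            rec = msg["record"]
            rec["decrypted_metadata"] = decrypt_metadata(
                rec.get("metadata_encrypted"), rec.pop("salt", None)
            )
        # Everything else (SCHEMA, STATE, ACTIVATE_VERSION) passes through as-is.
        sys.stdout.write(json.dumps(msg) + "\n")


if __name__ == "__main__":
    main()
Caveat: for strict targets, the stream's SCHEMA message would also need the added/removed properties reflected, not just the RECORD messages.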

    Justin Yang

    05/08/2025, 2:12 PM
Hello all, I'm trying to set up a Meltano ELT pipeline tapping Postgres and targeting Redshift. When I run
    meltano run tap-postgres target-redshift
    I get the error:
    redshift_connector.error.InterfaceError: {'S': 'FATAL', 'C': '28000', 'M': 'no pg_hba.conf entry for host "???", user "wave", database "scylla", SSL off', 'F': '/opt/brazil-pkg-cache/packages/RedshiftPADB/RedshiftPADB-1.0.12086.0/AL2_x86_64/generic-flavor/src/src/pg/src/backend/libpq/auth.c', 'L': '477', 'R': 'ClientAuthentication'} cmd_type=elb consumer=True job_name=scylla:tap-postgres-to-target-redshift name=target-redshift producer=False run_id=d990a55c-30b3-41b2-8bc0-2e20a200ee56 stdio=stderr string_id=target-redshift
I was able to confirm that I can run psql commands and query Redshift, so the network connection is OK. I wonder if there is something specific to the loader that I need to configure. The last relevant log line is:
    Found credentials from IAM Role: Wave__DataEC2Role
so I wonder if it's an IAM role issue.

    joshua_janicas

    05/08/2025, 4:13 PM
Hello, this is a follow-up to a problem I asked about back in Dec 2024, about a "schema only" flag where Meltano just creates an empty table with column definitions and doesn't update state: https://github.com/meltano/sdk/issues/2810. I've found a bit of a workaround (I think) based on the comments in the link from visch. The thing is, I can't use the * catch-all for stream maps, because it applies to all the tables the tap found instead of just the ones I'm filtering on. That means I have to duplicate the stream_maps entry for every single table I need. A nuisance, but I can live with it, unless there's another way to handle the * properly?

    Tanner Wilcox

    05/09/2025, 10:10 PM
I prefer my YAML list items indented. Every time I run a meltano command that modifies a YAML file, all of my YAML files get un-indented. Can I turn that off somehow?