sumit_singh
04/02/2025, 5:42 PM
Running meltano install --clean, previously didn't get this error:
2025-04-02T17:35:09.960498Z [info ] Environment 'dev' is active
2025-04-02T17:35:10.699018Z [warning ] Certificate did not match expected hostname: sp.meltano.com. Certificate: {'subject': ((('commonName', '*.ops.snowcatcloud.com'),),), 'issuer': ((('countryName', 'US'),), (('organizationName', 'Amazon'),), (('commonName', 'Amazon RSA 2048 M02'),)), 'version': 3, 'serialNumber': 'xxxxxxxxxxx', 'notBefore': 'Mar 7 00:00:00 2025 GMT', 'notAfter': 'Apr 5 23:59:59 2026 GMT', 'subjectAltName': (('DNS', '*.ops.snowcatcloud.com'),), 'OCSP': ('http://ocsp.r2m02.amazontrust.com',), 'caIssuers': ('http://crt.r2m02.amazontrust.com/r2m02.cer',), 'crlDistributionPoints': ('http://crl.r2m02.amazontrust.com/r2m02.crl',)}
Installing 4 plugins...
2025-04-02T17:35:11.553489Z [warning ] Certificate did not match expected hostname: sp.meltano.com. Certificate: {'subject': ((('commonName', '*.ops.snowcatcloud.com'),),), 'issuer': ((('countryName', 'US'),), (('organizationName', 'Amazon'),), (('commonName', 'Amazon RSA 2048 M02'),)), 'version': 3, 'serialNumber': 'xxxxxxxxxxxxxx', 'notBefore': 'Mar 7 00:00:00 2025 GMT', 'notAfter': 'Apr 5 23:59:59 2026 GMT', 'subjectAltName': (('DNS', '*.ops.snowcatcloud.com'),), 'OCSP': ('http://ocsp.r2m02.amazontrust.com',), 'caIssuers': ('http://crt.r2m02.amazontrust.com/r2m02.cer',), 'crlDistributionPoints': ('http://crl.r2m02.amazontrust.com/r2m02.crl',)}
Installing extractor 'tap-mysql'...
Installing loader 'target-snowflake'...
Installing orchestrator 'airflow'...
Installing file bundle 'files-airflow'...
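sp.meltano.com is Meltano's anonymous usage-stats (Snowplow) collector, and the certificate being served for it belongs to *.ops.snowcatcloud.com, so something on the network path (a proxy or DNS override) appears to be intercepting that telemetry call; the plugin installs themselves still proceed. If the warning is just noise, a minimal sketch for silencing it by turning telemetry off:

```yaml
# meltano.yml (project root) -- a minimal sketch: stop the calls to
# sp.meltano.com that trigger the certificate warning.
send_anonymous_usage_stats: false
```

Setting the MELTANO_SEND_ANONYMOUS_USAGE_STATS=false environment variable has the same effect.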
Michal Ondras
04/08/2025, 10:09 PM
"progress_markers": {"Note": "Progress is not resumable if interrupted.",...
Chandana S
04/09/2025, 10:03 AM
Buddy Ruddy
04/10/2025, 5:55 PM
meltano add files files-airflow
Then when I try to add the DAG to Airflow, it complains about a DAG import error with:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/subprocess.py", line 1026, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/usr/local/lib/python3.11/subprocess.py", line 1955, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'meltano'
Buddy Ruddy
04/10/2025, 6:02 PM
Buddy Ruddy
04/10/2025, 6:04 PM
From the docs: "However, you can also create your own Airflow DAGs for any pipeline you fancy by using BashOperator with the `meltano elt` command, or DockerOperator with a project-specific Docker image."
I think THIS is the part I'm missing, but I still don't quite understand it.
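For what it's worth, the FileNotFoundError above usually just means the Airflow scheduler/worker can't find a `meltano` executable on its PATH when the generated DAG shells out to it. A minimal sketch of the hand-written-DAG approach the docs describe, assuming Airflow 2.x, that meltano is installed at /usr/local/bin/meltano, and that the project lives at /project (both paths are placeholders to adjust):

```python
# A minimal sketch of a hand-written DAG using BashOperator to run a Meltano
# pipeline. The meltano executable path and project directory below are
# assumptions -- point them at wherever they actually live on the workers.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="meltano_mysql_to_snowflake",
    start_date=datetime(2025, 4, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    BashOperator(
        task_id="tap_mysql_to_target_snowflake",
        # Absolute path avoids the PATH lookup that raised FileNotFoundError.
        bash_command="cd /project && /usr/local/bin/meltano run tap-mysql target-snowflake",
    )
```

The DAG generator that files-airflow drops into orchestrate/dags also shells out to `meltano`, so making sure the executable resolves in the Airflow environment (same venv, same container, or an absolute path) should clear the bundled DAG's import error as well.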
Buddy Ruddy
04/10/2025, 6:22 PM
Buddy Ruddy
04/10/2025, 8:05 PM
Buddy Ruddy
04/10/2025, 9:10 PM
Suyash Muley
04/14/2025, 5:49 AM
2025-04-14T05:40:34.883063Z [debug ] Meltano 3.6.0, Python 3.10.9, Windows (AMD64)
2025-04-14T05:40:34.887124Z [debug ] Looking up time zone info from registry
2025-04-14T05:40:34.892488Z [info ] Environment 'test' is active
2025-04-14T05:40:34.910936Z [debug ] Creating DB engine for project at 'C:\\Users\\Lenovo\\Desktop\\GDC\\Tap\\POC' with DB URI 'sqlite:/C:\\Users\\Lenovo\\Desktop\\GDC\\Tap\\POC\\.meltano/meltano.db'
2025-04-14T05:40:35.020965Z [debug ] Variable '$P' is not set in the provided env dictionary.
2025-04-14T05:40:35.020965Z [debug ] Variable '$G' is not set in the provided env dictionary.
2025-04-14T05:40:35.020965Z [debug ] Variable '$P' is not set in the provided env dictionary.
2025-04-14T05:40:35.021974Z [debug ] Variable '$G' is not set in the provided env dictionary.
2025-04-14T05:40:35.021974Z [debug ] Variable '$P' is not set in the provided env dictionary.
2025-04-14T05:40:35.021974Z [debug ] Variable '$G' is not set in the provided env dictionary.
2025-04-14T05:40:35.022965Z [debug ] Variable '$P' is not set in the provided env dictionary.
2025-04-14T05:40:35.022965Z [debug ] Variable '$G' is not set in the provided env dictionary.
2025-04-14T05:40:35.103032Z [debug ] Skipped installing extractor 'poc'
2025-04-14T05:40:35.104023Z [debug ] Skipped installing 1/1 plugins
2025-04-14T05:40:35.199362Z [debug ] Created configuration at C:\Users\Lenovo\Desktop\GDC\Tap\POC\.meltano\run\poc\tap.67806009-f627-42b6-af6d-488d63f7a914.config.json
2025-04-14T05:40:35.200261Z [debug ] Could not find tap.properties.json in C:\Users\Lenovo\Desktop\GDC\POC\.meltano\extractors\poc\tap.properties.json, skipping.
2025-04-14T05:40:35.201309Z [debug ] Could not find tap.properties.cache_key in C:\Users\Lenovo\Desktop\GDC\POC\.meltano\extractors\poc\tap.properties.cache_key, skipping.
2025-04-14T05:40:35.201309Z [debug ] Could not find state.json in C:\Users\Lenovo\Desktop\GDC\Tap\POC\.meltano\extractors\poc\state.json, skipping.
2025-04-14T05:40:35.203359Z [debug ] Variable '$P' is not set in the provided env dictionary.
2025-04-14T05:40:35.203359Z [debug ] Variable '$G' is not set in the provided env dictionary.
2025-04-14T05:40:35.203359Z [debug ] Variable '$P' is not set in the provided env dictionary.
2025-04-14T05:40:35.203359Z [debug ] Variable '$G' is not set in the provided env dictionary.
2025-04-14T05:40:35.204261Z [debug ] Variable '$P' is not set in the provided env dictionary.
2025-04-14T05:40:35.204261Z [debug ] Variable '$G' is not set in the provided env dictionary.
2025-04-14T05:40:35.204261Z [debug ] Variable '$P' is not set in the provided env dictionary.
2025-04-14T05:40:35.204261Z [debug ] Variable '$G' is not set in the provided env dictionary.
I have added Host, User, Password, DB, Port to the meltano.yml file.
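For reference, a minimal sketch of how those settings usually sit on a custom extractor in meltano.yml, assuming the plugin is the custom tap named 'poc' from the log above and that it declares matching settings (values are placeholders; the password is better supplied through .env than committed):

```yaml
# meltano.yml -- a minimal sketch for a custom extractor named 'poc'.
plugins:
  extractors:
    - name: poc
      namespace: poc
      # pip_url / executable as already defined for the custom tap
      settings:
        - name: host
        - name: user
        - name: password
          kind: password
        - name: database
        - name: port
          kind: integer
      config:
        host: localhost      # placeholder
        user: my_user        # placeholder
        database: my_db      # placeholder
        port: 3306           # placeholder
```

If the tap still reports missing config, `meltano config poc list` shows which settings Meltano actually knows about and where each value comes from.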
Suyash Musale
04/14/2025, 3:59 PM
2025-04-14 09:23:39,166 - hotglue - DEBUG - Export hierarchy not set. Using original order
2025-04-14 09:23:39,166 - hotglue - INFO - Fetching available entities
2025-04-14 09:23:39,682 - hotglue - DEBUG - Available entity found: None
2025-04-14 09:23:39,683 - hotglue - DEBUG - Listing folders [hotglue-prod.admin] /env_zips/tap-csv
2025-04-14 09:23:39,803 - hotglue - DEBUG - Downloading: hotglue-prod.admin:env_zips/tap-csv/tap_csv-v0_1_7.py3_10.zip
2025-04-14 09:23:39,949 - hotglue - INFO - Unzipping file tap-csv.zip
2025-04-14 09:23:40,936 - hotglue - INFO - Running Subprocess: [ /home/envs/tap-csv/bin/tap-csv --config csv-config.json ] with path [ ['/home', '/usr/local/lib/python310.zip', '/usr/local/lib/python3.10', '/usr/local/lib/python3.10/lib-dynload', '/usr/local/lib/python3.10/site-packages'] ]
2025-04-14 09:23:41,305 - hotglue.export - DEBUG - INFO Starting sync
2025-04-14 09:23:41,305 - hotglue.export - DEBUG - INFO Sync completed
2025-04-14 09:23:41,326 - hotglue - INFO - Running Subprocess: [ source /home/envs/target/bin/activate && cat streams.json | target-mssqltarget --config config.json > target_state.json ] with path [ ['/home', '/usr/local/lib/python310.zip', '/usr/local/lib/python3.10', '/usr/local/lib/python3.10/lib-dynload', '/usr/local/lib/python3.10/site-packages'] ]
2025-04-14 09:23:42,938 - hotglue - INFO - (update_metadata) Started
2025-04-14 09:23:43,028 - hotglue - INFO - (update_resources_usage_script) Started
2025-04-14 09:23:43,311 - hotglue.export - DEBUG - 2025-04-14 09:23:43,310 | INFO | target-mssqltarget | Target 'target-mssqltarget' is listening for input from tap.
2025-04-14 09:23:43,311 - hotglue.export - DEBUG - 2025-04-14 09:23:43,311 | INFO | target-mssqltarget | Target 'target-mssqltarget' completed reading 0 lines of input (0 schemas, 0 records, 0 batch manifests, 0 state messages).
2025-04-14 09:23:43,312 - hotglue.export - DEBUG - 2025-04-14 09:23:43,312 | INFO | target-mssqltarget | Emitting completed target state {}
Emre Üstündağ
04/17/2025, 12:11 PM
Gordon Klundt
04/22/2025, 10:27 PM
singer_sdk.exceptions.InvalidStreamSortException: Unsorted data detected in stream. Latest value '2025-04-16T11:54:35.060000+00:00' is smaller than previous max '2025-04-16T11:54:35.063000+00:00'
It looks like an issue with the sort constraint, or the stream sorter not honoring microseconds in the mssql datetime datatype.
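If the out-of-order values really are only sub-millisecond jitter from the mssql datetime rounding, the SDK lets a stream opt out of that hard failure. A minimal sketch, assuming a singer-sdk based stream class you can modify (setting check_sorted = False stops InvalidStreamSortException from being raised; leaving is_sorted at False is the other knob, at the cost of progress not being treated as resumable):

```python
# A minimal sketch, assuming a singer-sdk stream you control: relax the
# sort check so tiny ordering jitter in the replication key is not fatal.
from singer_sdk import typing as th
from singer_sdk.streams import Stream


class OrdersStream(Stream):          # hypothetical stream
    name = "orders"
    replication_key = "updated_at"
    check_sorted = False             # don't raise InvalidStreamSortException
    schema = th.PropertiesList(
        th.Property("id", th.StringType),
        th.Property("updated_at", th.DateTimeType),
    ).to_dict()

    def get_records(self, context):
        # Placeholder for the real extraction logic.
        yield {"id": "1", "updated_at": "2025-04-16T11:54:35.060000+00:00"}
```

The trade-off is that genuine ordering problems are then silently ignored, so it is worth confirming the source really is ordered apart from the truncated microseconds.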
alex
04/24/2025, 4:10 PM
"""Stream type classes for tap-tableau-metadata."""
from typing import Any, Dict, Optional, Union, List, Iterable

from singer_sdk import typing as th
from singer_sdk.streams import RESTStream
# from tap_tableau.client import TableauStream
import tableauserverclient as TSC


class TableauStream(RESTStream):
    """Tableau stream class."""

    url_base = None

    def get_server_client(self):
        tableau_auth = TSC.PersonalAccessTokenAuth(
            self.config['personal_access_token_name'],
            self.config['personal_access_token_secret'],
            self.config.get('site_url_id'),
        )
        server_client = (
            TSC.Server(self.config['server_url'], self.config['api_version'])
            if self.config.get('api_version')
            else TSC.Server(self.config['server_url'], use_server_version=True)
        )
        return tableau_auth, server_client


class DatasourcesStream(TableauStream):
    name = "datasources"
    primary_keys = ["id"]
    replication_key = None
    schema = th.PropertiesList(
        th.Property("id", th.StringType),
        # ...
    ).to_dict()

    def get_records(self, context: Optional[dict]) -> Iterable[dict]:
        """Return a generator of row-type dictionary objects."""
        tableau_auth, server_client = self.get_server_client()
        with server_client.auth.sign_in(tableau_auth):
            for datasource in TSC.Pager(server_client.datasources):
                server_client.datasources.populate_connections(datasource)
                server_client.datasources.populate_permissions(datasource)
                row = {
                    'id': datasource.id,
                    # ...
                }
                yield row
Note: The same issue occurs when relying on the tableau tap in the Meltano Hub from GtheSheep (https://hub.meltano.com/extractors/tap-tableau/)
I am running:
Meltano = 3.6.0
Python = 3.11
tableauserverclient = 0.25 (the latest version 0.37 does not resolve this issue)
singer-sdk = "~=0.34.0"
Chad Bell
04/25/2025, 5:59 PM
Michal Ondras
04/26/2025, 10:43 PM
Andy Carter
04/28/2025, 9:53 AM
since and until, so those exist as params to my query, but they are not present in the API response.
Is there any way to reliably access the params sent in a post-query method like post_process or get_records? Or would I need to persist those values to instance variables (self.since, self.until, etc.) on each iteration?
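One pattern (essentially the instance-variable idea) is to keep a copy of whatever get_url_params actually sent, since requests for a stream run sequentially within a sync, and read it back in post_process. A minimal sketch, assuming a singer-sdk RESTStream; the stream, path, and field names here are placeholders:

```python
# A minimal sketch: remember the params that were sent so post_process can
# attach them to each record, since the API response doesn't echo them back.
from typing import Any, Optional

from singer_sdk import typing as th
from singer_sdk.streams import RESTStream


class ThingsStream(RESTStream):                 # hypothetical stream
    url_base = "https://api.example.com"        # hypothetical
    path = "/things"                            # hypothetical
    name = "things"
    schema = th.PropertiesList(
        th.Property("id", th.StringType),
        th.Property("since", th.StringType),
        th.Property("until", th.StringType),
    ).to_dict()

    def get_url_params(self, context: Optional[dict], next_page_token: Any) -> dict:
        params = {
            "since": "2025-04-01T00:00:00Z",    # however since/until are derived
            "until": "2025-04-28T00:00:00Z",
        }
        self._last_params = params              # stash what was actually sent
        return params

    def post_process(self, row: dict, context: Optional[dict] = None) -> Optional[dict]:
        row["since"] = self._last_params["since"]
        row["until"] = self._last_params["until"]
        return row
```

With partitioned or child streams an instance attribute only ever holds the most recent partition's params, so this works best for simple, unpartitioned streams.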
Michal Ondras
04/28/2025, 3:30 PM
v0.12.0 to v0.16.3 -> what unfortunately resulted is the target trying to change variants to varchars. Any idea why?
Ian OLeary
04/29/2025, 1:40 PM
repository.py or my meltano.yml. Was AssetExecutionContext deprecated or something? I tried switching to OpExecutionContext but then started getting the same recursive error message for DagsterDbtTranslator.
Matt Menzenski
04/29/2025, 5:22 PM
Kevin Phan
04/29/2025, 7:50 PM
2025-04-29T19:27:10.966335Z [info ] time=2025-04-29 15:27:10 name=target_snowflake level=INFO message=Loading into talos_meltano_dev."BALANCES": {"inserts": 0, "updates": 8005, "size_bytes": 115101} cmd_type=elb consumer=True job_name=dev:tap-talos-to-target-snowflake-talos name=target-snowflake-talos producer=False run_id=a49d7e97-c5f6-4f9c-8c97-f7e60c7f9d66 stdio=stderr string_id=target-snowflake-talos
The inserts count is 0, which is correct because nothing changed since the last run, but I see updates is 8005 = the number of rows. It is updating the _sdc_batched_at and other _sdc metadata fields. My question is: does Meltano automatically update these metadata columns regardless of whether a record changes or needs to be updated? Technically, if it is incremental and no data changed in the record, then it shouldn't touch it, right?
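The _sdc_* columns are populated by the target for every record it receives, and the MERGE counts each matched row as an update whether or not the underlying values differ, so updates = 8005 mostly reflects how many records reached the target rather than what actually changed. If the metadata columns aren't wanted, they can be switched off; the setting name depends on the variant (the pipelinewise-style target, which this log format resembles, calls it add_metadata_columns; SDK-based variants call it add_record_metadata). A minimal sketch:

```yaml
# meltano.yml -- a minimal sketch; use add_record_metadata instead if the
# plugin inherits from an SDK-based target-snowflake variant.
loaders:
  - name: target-snowflake-talos
    inherit_from: target-snowflake
    config:
      add_metadata_columns: false
```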
Matt Menzenski
04/29/2025, 11:02 PM
run command completes, we call a Snowflake stored procedure that routes the newly loaded records from this big landing table into the collection-specific tables (in database-specific schemas: a record from the payment_service database's `Transaction` collection lands in a payment_service.transaction table in Snowflake), and then truncates the big table. This works, but it means data written during a run doesn't surface in the collection-specific tables (which is where dbt sources are defined) until the job completes. During periods of high platform activity, our meltano jobs take longer and this delay is problematic.
We have already increased our target-snowflake max batch size, changed from MERGE to COPY (append-only writes), and changed from JSONL to Parquet format for stage files. These changes have helped but not by a ton.
Options we have thought of:
• split our one change stream into more than one to gain some parallelization
◦ “database names starting with A-M / database names starting with N-Z” or similar for a naive two-way split
◦ Key off the first character in a UUID field in the document itself (all our documents have a UUID id
) to split the change stream records into sixteen streams of (approximately) equal size
◦ Define Meltano stream maps for all the database+collection names we know of (so that those records can be split into per-collection streams and landed directly into the correct destination table rather than having to go through the routing process). Use the current routing procedure for unhandled (novel) records only. (I think we could do this by splitting records into streams with names like <schema>-<table>
per this). We wouldn’t be able to handle everything dynamically because it’s hard. I’m concerned about the overhead of having hundreds of streams in one job if we went this route.
• Move away from target-snowflake - perhaps write to S3 ourselves (target-s3?) and use Snowpipe to load data into Snowflake from S3 (we run in AWS GovCloud but our Snowflake account is in AWS Commercial - wondering if it’d be more performant to move the “write to stage” step into GovCloud)
Any other ideas we should be thinking about? It doesn't seem like there are any good ways to do this 😕
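On the naive two-way split: one fairly low-effort version is two inherited copies of the tap running as separate jobs, each keeping half the records with an inline stream-map __filter__ on the first hex character of the document UUID. A minimal sketch, assuming an SDK-based tap (otherwise the same stream_maps block can live on a meltano-map-transformer mapper); the plugin name, stream name, and id field are placeholders:

```yaml
# meltano.yml -- a minimal sketch: split one change stream across two jobs
# by filtering on the first character of the document UUID. Plugin name,
# stream name, and field name are assumptions.
plugins:
  extractors:
    - name: tap-mongodb-split-a
      inherit_from: tap-mongodb
      config:
        stream_maps:
          public-changes:
            __filter__: "record['id'][0] in '01234567'"
    - name: tap-mongodb-split-b
      inherit_from: tap-mongodb
      config:
        stream_maps:
          public-changes:
            __filter__: "record['id'][0] in '89abcdef'"
```

The same __filter__/__alias__ mechanics are what the per-collection stream-map idea would lean on too; the main cost in both cases is that every job still reads the full change stream and discards whatever it filters out.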
Ashish Bhat
04/29/2025, 11:28 PM
Nathan Sooter
05/02/2025, 5:40 PM
meltano run tap-salesforce target-csv --full-refresh
give me a CSV with every row for a given object when there are multiple batches?
Chandana S
05/07/2025, 8:20 AM
2025-05-06T11:49:43.936442Z [info ] Backing off 8.99 seconds after 3 tries calling function <bound method RESTStream._request of <tap_shopify.streams.AbandonedCheckouts object at 0x7f17392cfb50>> with args (<PreparedRequest [GET]>, None) and kwargs {} cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:49:52.951378Z [info ] Backing off _request(...) for 16.6s (requests.exceptions.SSLError: HTTPSConnectionPool(host='triumph-sg.myshopify.com', port=443): Max retries exceeded with url: /admin/api/2023-10/checkouts.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)')))) cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:49:52.951885Z [info ] Backing off 16.64 seconds after 4 tries calling function <bound method RESTStream._request of <tap_shopify.streams.AbandonedCheckouts object at 0x7f17392cfb50>> with args (<PreparedRequest [GET]>, None) and kwargs {} cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.658595Z [info ] Giving up _request(...) after 5 tries (requests.exceptions.SSLError: HTTPSConnectionPool(host='triumph-sg.myshopify.com', port=443): Max retries exceeded with url: /admin/api/2023-10/checkouts.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)')))) cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.660324Z [info ] METRIC: {"type": "counter", "metric": "http_request_count", "value": 0, "tags": {"stream": "abandoned_checkouts", "endpoint": "/checkouts.json"}} cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.661527Z [info ] METRIC: {"type": "timer", "metric": "sync_duration", "value": 33.077449560165405, "tags": {"stream": "abandoned_checkouts", "context": {}, "status": "failed"}} cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.662679Z [info ] METRIC: {"type": "counter", "metric": "record_count", "value": 0, "tags": {"stream": "abandoned_checkouts", "context": {}}} cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.663754Z [info ] An unhandled error occurred while syncing 'abandoned_checkouts' cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.664790Z [info ] Traceback (most recent call last): cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.665754Z [info ] File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 716, in urlopen cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.666726Z [info ] httplib_response = self._make_request( cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.667769Z [info ] File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 404, in _make_request cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.668799Z [info ] self._validate_conn(conn) cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.669838Z [info ] File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 1061, in _validate_conn cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.673645Z [info ] conn.connect() cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.674254Z [info ] File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/connection.py", line 419, in connect cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.674991Z [info ] self.sock = ssl_wrap_socket( cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.675908Z [info ] File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/util/ssl_.py", line 458, in ssl_wrap_socket cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.676620Z [info ] ssl_sock = _ssl_wrap_socket_impl( cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.677561Z [info ] File "/app/.meltano/extractors/tap-shopify/venv/lib/python3.10/site-packages/urllib3/util/ssl_.py", line 502, in _ssl_wrap_socket_impl cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.678446Z [info ] return ssl_context.wrap_socket(sock, server_hostname=server_hostname) cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.679288Z [info ] File "/usr/local/lib/python3.10/ssl.py", line 513, in wrap_socket cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.679617Z [info ] return self.sslsocket_class._create( cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.679994Z [info ] File "/usr/local/lib/python3.10/ssl.py", line 1071, in _create cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.680335Z [info ] self.do_handshake() cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.680723Z [info ] File "/usr/local/lib/python3.10/ssl.py", line 1342, in do_handshake cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.681206Z [info ] self._sslobj.do_handshake() cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
2025-05-06T11:50:09.681679Z [info ] ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007) cmd_type=elb consumer=False job_name=dev:tap-shopify-to-target-jsonl name=tap-shopify producer=True run_id=e28fde8d-bbf2-44d3-a8ac-088b2049364c stdio=stderr string_id=tap-shopify
There is a certificate added to the HTTPS URL we are using, and I am not sure why this SSL certificate issue is appearing now; it had been working fine for probably a month and a half.
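"unable to get local issuer certificate" means the Python environment the tap runs in can't build the chain to the issuing CA (an incomplete chain on the endpoint, a corporate proxy re-signing TLS, or a stale/missing CA bundle in the image), rather than anything Meltano-specific, which would explain why it appeared without any pipeline changes. A minimal sketch for pointing the tap at a known CA bundle via meltano.yml (the bundle path is an assumption; upgrading certifi inside the tap's virtualenv is the other common fix):

```yaml
# meltano.yml -- a minimal sketch: make requests/ssl use an explicit CA
# bundle that actually contains the issuer (path is an assumption).
env:
  REQUESTS_CA_BUNDLE: /etc/ssl/certs/ca-certificates.crt
  SSL_CERT_FILE: /etc/ssl/certs/ca-certificates.crt
```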
Michal Ondras
05/08/2025, 1:52 AM
singer_sdk.exceptions.FatalAPIError: ("Graphql error: [{'message': 'Something went wrong while executing your query on 2025-05-07T22:26:39Z. Please include `xx....` when reporting this issue.'}]", <Response [200]>)
thanks
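If that GraphQL "Something went wrong" response is transient on the API side, one option (assuming an SDK-based tap whose stream class can be patched, and depending on where exactly the tap raises the FatalAPIError) is to classify it as retriable in validate_response so the SDK's built-in backoff retries instead of killing the sync. A minimal sketch:

```python
# A minimal sketch: on a GraphQL stream, treat the "Something went wrong
# while executing your query" payload as retriable rather than fatal.
import requests
from singer_sdk.exceptions import FatalAPIError, RetriableAPIError
from singer_sdk.streams import GraphQLStream


class MyGraphQLStream(GraphQLStream):   # stand-in for the tap's stream class
    def validate_response(self, response: requests.Response) -> None:
        super().validate_response(response)   # handles HTTP-level errors
        errors = (response.json() or {}).get("errors") or []
        if errors:
            msg = f"GraphQL errors: {errors}"
            if "Something went wrong while executing your query" in msg:
                raise RetriableAPIError(msg, response)
            raise FatalAPIError(msg)
```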
Kevin Phan
05/08/2025, 12:29 PM
- name: decrypt_metadata_mapper
  config:
    stream_maps:
      public-user_details:
        decrypted_metadata: >
          __import__('transform.utils.decrypt_metadata', fromlist=['decrypt_metadata']).decrypt_metadata(
            record['metadata_encrypted'], record['salt']
          )
        salt: __NULL__
Running tap-core-db-source decrypt_metadata_mapper target-snowflake-core-db-source
I am getting some import errors. Is this the correct way to do this, or am I doing an antipattern? What is the pattern for trying to run a Python script in flight? I am trying to decrypt some columns between the tap and target. Is a custom utility a better way?
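The inline stream-map expressions are evaluated with a restricted evaluator (simpleeval), so __import__(...) generally isn't available there, which would line up with the import errors. If the goal is arbitrary Python between tap and target, a small standalone mapper (a custom utility that reads Singer messages on stdin and writes them to stdout) is a reasonable pattern, and Meltano can register it as a custom mapper/utility plugin. A minimal sketch, with the stream and field names taken from the config above and the decryption itself left as a placeholder:

```python
#!/usr/bin/env python3
# A minimal sketch of a standalone Singer mapper: decrypt one column of
# RECORD messages for the public-user_details stream, drop the salt, and
# pass every other message through untouched. decrypt_metadata() is a
# placeholder for whatever transform.utils.decrypt_metadata really does.
import json
import sys


def decrypt_metadata(ciphertext: str, salt: str) -> str:
    raise NotImplementedError("plug in the real decryption here")


def main() -> None:
    for line in sys.stdin:
        msg = json.loads(line)
        if msg.get("type") == "RECORD" and msg.get("stream") == "public-user_details":
            record = msg["record"]
            record["decrypted_metadata"] = decrypt_metadata(
                record.get("metadata_encrypted"), record.pop("salt", None)
            )
        sys.stdout.write(json.dumps(msg) + "\n")


if __name__ == "__main__":
    main()
```

One caveat either way: the stream's SCHEMA message also has to be adjusted (add decrypted_metadata, drop salt), or strict targets will complain about properties that don't match the schema.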
Justin Yang
05/08/2025, 2:12 PM
meltano run tap-postgres target-redshift
I get the error:
redshift_connector.error.InterfaceError: {'S': 'FATAL', 'C': '28000', 'M': 'no pg_hba.conf entry for host "???", user "wave", database "scylla", SSL off', 'F': '/opt/brazil-pkg-cache/packages/RedshiftPADB/RedshiftPADB-1.0.12086.0/AL2_x86_64/generic-flavor/src/src/pg/src/backend/libpq/auth.c', 'L': '477', 'R': 'ClientAuthentication'} cmd_type=elb consumer=True job_name=scylla:tap-postgres-to-target-redshift name=target-redshift producer=False run_id=d990a55c-30b3-41b2-8bc0-2e20a200ee56 stdio=stderr string_id=target-redshift
I was able to confirm that I can run psql commands and query redshift so the network connection is ok. I wonder if there is something specific to the loader that I need to configure. The last relevant logging is:
Found credentials from IAM Role: Wave__DataEC2Role
so I wonder if it's an IAM role issue.
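The "SSL off" in that message is usually the clue: the cluster only accepts SSL connections for that host/user/database combination (require_ssl), and the loader connected without SSL, so it is rejected before IAM even matters. The usual fix is enabling SSL in the target-redshift config; the exact setting name differs between variants, so the key below is a placeholder and `meltano config target-redshift list` will show what the installed variant actually calls it. A minimal sketch:

```yaml
# meltano.yml -- a minimal sketch; 'ssl_enable' is a placeholder for whatever
# SSL setting the installed target-redshift variant exposes.
plugins:
  loaders:
    - name: target-redshift
      config:
        ssl_enable: true
```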
joshua_janicas
05/08/2025, 4:13 PM
Tanner Wilcox
05/09/2025, 10:10 PM