Tanner Wilcox
05/09/2025, 10:36 PM
{%- macro drop_schema() -%}
{%- set drop_query -%}
drop schema {{ target.schema }}
{%- endset -%}
{% do run_query(drop_query) %}
{%- endmacro -%}
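As context for the question below: dbt itself discovers macros through the macro-paths setting in dbt_project.yml rather than through anything in meltano.yml. A hedged sketch of the conventional default (the path is an assumption; check your actual project layout):

```yaml
# dbt_project.yml (sketch): dbt only loads macros from directories listed here.
# "macros" is the usual default, so this is mostly a sanity check.
macro-paths: ["macros"]
```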
I'm assuming I should be able to do something like this: mel run dbt:run-operation:drop_schema sonar warehouse, but I get an error saying it can't find drop_schema. I have it in ./macros/. I'm assuming I need to put it in my meltano.yml under my dbt transformer section. Maybe it should go under utilities? I'm at a loss.

Siba Prasad Nayak
05/14/2025, 6:21 AM
(venv) PS C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend>
meltano --log-level=debug invoke tap-sftp
2025-05-14T06:09:15.617690Z [warning ] Failed to create symlink to 'meltano.exe': administrator privilege required
2025-05-14T06:09:15.624021Z [debug ] Meltano 3.6.0, Python 3.12.3, Windows (AMD64)
2025-05-14T06:09:15.632565Z [debug ] Looking up time zone info from registry
2025-05-14T06:09:15.651188Z [info ] Environment 'dev' is active
2025-05-14T06:09:15.707195Z [debug ] Creating DB engine for project at 'C:\\Siba_\\Work\\POC_ConnectorFactory\\Gerrit\\Connector_Factory_Development\\meltano-backend' with DB URI 'sqlite:/C:\\Siba_\\Work\\POC_ConnectorFactory\\Gerrit\\Connector_Factory_Development\\meltano-backend\\.meltano/meltano.db'
2025-05-14T06:09:16.045430Z [debug ] Skipped installing extractor 'tap-sftp'
2025-05-14T06:09:16.046429Z [debug ] Skipped installing 1/1 plugins
2025-05-14T06:09:16.231095Z [debug ] Created configuration at C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend\.meltano\run\tap-sftp\tap.31d90bf6-66b2-4bc2-9d17-9305905bbcdf.config.json
2025-05-14T06:09:16.235112Z [debug ] Could not find tap.properties.json in C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend\.meltano\extractors\tap-sftp\tap.properties.json, skipping.
2025-05-14T06:09:16.238126Z [debug ] Could not find tap.properties.cache_key in C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend\.meltano\extractors\tap-sftp\tap.properties.cache_key, skipping.
2025-05-14T06:09:16.240124Z [debug ] Could not find state.json in C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend\.meltano\extractors\tap-sftp\state.json, skipping.
2025-05-14T06:09:16.248129Z [debug ] Invoking: ['C:\\Siba_\\Work\\POC_ConnectorFactory\\Gerrit\\Connector_Factory_Development\\meltano-backend\\.meltano\\extractors\\tap-sftp\\venv\\Scripts\\tap-sftp.exe', '--config', 'C:\\Siba_\\Work\\POC_ConnectorFactory\\Gerrit\\Connector_Factory_Development\\meltano-backend\\.meltano\\run\\tap-sftp\\tap.31d90bf6-66b2-4bc2-9d17-9305905bbcdf.config.json']
C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend\.meltano\extractors\tap-sftp\venv\Lib\site-packages\paramiko\pkey.py:59: CryptographyDeprecationWarning: TripleDES has been moved to cryptography.hazmat.decrepit.ciphers.algorithms.TripleDES and will be removed from cryptography.hazmat.primitives.ciphers.algorithms in 48.0.0.
"cipher": algorithms.TripleDES,
C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend\.meltano\extractors\tap-sftp\venv\Lib\site-packages\paramiko\transport.py:219: CryptographyDeprecationWarning: Blowfish has been moved to cryptography.hazmat.decrepit.ciphers.algorithms.Blowfish and will be removed from cryptography.hazmat.primitives.ciphers.algorithms in 45.0.0.
"class": algorithms.Blowfish,
C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend\.meltano\extractors\tap-sftp\venv\Lib\site-packages\paramiko\transport.py:243: CryptographyDeprecationWarning: TripleDES has been moved to cryptography.hazmat.decrepit.ciphers.algorithms.TripleDES and will be removed from cryptography.hazmat.primitives.ciphers.algorithms in 48.0.0.
"class": algorithms.TripleDES,
2025-05-14T06:09:17.479363Z [debug ] Deleted configuration at C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend\.meltano\run\tap-sftp\tap.31d90bf6-66b2-4bc2-9d17-9305905bbcdf.config.json
(venv) PS C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend>
I have created a basic configuration in the meltano.yml file:
- name: tap-sftp
  namespace: tap_sftp
  pip_url: ./connectors/tap-sftp
  executable: tap-sftp
  config:
    host: 10.148.155.30
    port: 22
    username: ubuntu
    start_date: 2025-05-13
    private_key_file: bridgex.pem
    tables:
      - table_name: single_file_test
        search_prefix: /home/ubuntu
        search_pattern: 'wget-log'
(venv) PS C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend>
meltano config tap-sftp list
2025-05-14T06:16:25.653934Z [warning ] Failed to create symlink to 'meltano.exe': administrator privilege required
2025-05-14T06:16:25.677427Z [info ] The default environment 'dev' will be ignored for `meltano config`. To configure a specific environment, please use the option `--environment=<environment name>`.
Custom, possibly unsupported by the plugin:
host [env: TAP_SFTP_HOST] current value: '10.148.155.30' (from `meltano.yml`)
port [env: TAP_SFTP_PORT] current value: 22 (from `meltano.yml`)
username [env: TAP_SFTP_USERNAME] current value: 'ubuntu' (from `meltano.yml`)
start_date [env: TAP_SFTP_START_DATE] current value: '2025-05-13' (from `meltano.yml`)
private_key_file [env: TAP_SFTP_PRIVATE_KEY_FILE] current value: 'bridgex.pem' (from `meltano.yml`)
tables [env: TAP_SFTP_TABLES] current value: [{'table_name': 'single_file_test', 'search_prefix': '/home/ubuntu', 'search_pattern': 'wget-log'}] (from `meltano.yml`)
(venv) PS C:\Siba_\Work\POC_ConnectorFactory\Gerrit\Connector_Factory_Development\meltano-backend>
Can anyone please help?

Andy Carter
05/21/2025, 12:13 PM

Krisna Aditya
05/22/2025, 4:03 AM
Can tap-postgres be named something else?
I might run into a scenario where I must replicate the same source to two separate destinations while a data migration is happening.
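One option for replicating the same source under two names is Meltano's plugin inheritance, where a second extractor reuses the first one's configuration. A hedged meltano.yml sketch (the names and variant are assumptions):

```yaml
plugins:
  extractors:
    - name: tap-postgres
      variant: meltanolabs
    # Inherits all config from tap-postgres; settings (and state) can be
    # overridden per copy, so each destination pipeline keeps its own bookmark.
    - name: tap-postgres--dest-b
      inherit_from: tap-postgres
```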
Thank you!

Siba Prasad Nayak
05/23/2025, 10:38 AM
paramiko.ssh_exception.SSHException: Incompatible ssh peer (no acceptable host key)
To address this, I made a change in client.py, adding this line:
self.transport._preferred_keys = ('ssh-rsa', 'ecdsa-sha2-nistp256', 'ecdsa-sha2-nistp384', 'ecdsa-sha2-nistp521', 'ssh-ed25519', 'ssh-dss')
The method now looks like:
def __try_connect(self):
    if not self.__active_connection:
        try:
            self.transport = paramiko.Transport((self.host, self.port))
            self.transport.use_compression(True)
            self.transport._preferred_keys = ('ssh-rsa', 'ecdsa-sha2-nistp256', 'ecdsa-sha2-nistp384', 'ecdsa-sha2-nistp521', 'ssh-ed25519', 'ssh-dss')
            self.transport.connect(username=self.username, pkey=self.key)
            self.sftp = paramiko.SFTPClient.from_transport(self.transport)
        except (AuthenticationException, SSHException) as ex:
            self.transport.close()
            self.transport = paramiko.Transport((self.host, self.port))
            self.transport.use_compression(True)
            self.transport._preferred_keys = ('ssh-rsa', 'ecdsa-sha2-nistp256', 'ecdsa-sha2-nistp384', 'ecdsa-sha2-nistp521', 'ssh-ed25519', 'ssh-dss')
            self.transport.connect(username=self.username, pkey=None)
            self.sftp = paramiko.SFTPClient.from_transport(self.transport)
        self.__active_connection = True
        # get 'socket' to set the timeout
        socket = self.sftp.get_channel()
        # set request timeout
        socket.settimeout(self.request_timeout)
Even after making this change, it's not resolving the issue.

Steven Searcy
05/30/2025, 8:58 PM
Are pkl modules still considered a good practice here? I know there are some limitations to include_paths.

Siba Prasad Nayak
06/06/2025, 5:14 AM

Bruno Arnabar
06/06/2025, 3:35 PM

Siba Prasad Nayak
06/06/2025, 5:47 PM

Tanner Wilcox
06/10/2025, 9:07 PM

Sac
06/25/2025, 3:05 PM

Steven Searcy
06/25/2025, 6:26 PM

Siba Prasad Nayak
06/27/2025, 8:11 AM
I currently use a switch statement to handle each connector individually, but that doesn't seem scalable.
Has anyone approached this problem before? I'd love to hear any recommendations or design patterns you'd suggest instead of a massive switch case. Does Meltano offer something that could solve my issue?
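One common alternative to a large switch case is a registry keyed by connector name, where each handler registers itself and dispatch is a dictionary lookup. A minimal stdlib sketch (connector names and handler signatures are made up for illustration, not tied to Meltano's API):

```python
from typing import Callable, Dict

# Registry mapping connector names to handler functions.
_HANDLERS: Dict[str, Callable[[dict], str]] = {}


def register(name: str) -> Callable:
    """Decorator that adds a handler to the registry under `name`."""
    def wrap(fn: Callable[[dict], str]) -> Callable[[dict], str]:
        _HANDLERS[name] = fn
        return fn
    return wrap


@register("tap-sftp")
def handle_sftp(config: dict) -> str:
    return f"sftp -> {config.get('host')}"


@register("tap-salesforce")
def handle_salesforce(config: dict) -> str:
    return f"salesforce -> {config.get('instance')}"


def dispatch(name: str, config: dict) -> str:
    """Look up the handler; unknown connectors fail fast instead of hitting a default branch."""
    try:
        return _HANDLERS[name](config)
    except KeyError:
        raise ValueError(f"no handler registered for {name!r}") from None
```

Adding a new connector then means adding one decorated function, rather than editing a central switch; the registry could also be populated from entry points or a plugins directory.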
Thanks!

Ellis Valentiner
07/11/2025, 2:34 PM
Does anyone have tips for managing the meltano.yml file? Specifically, ours is very verbose and contains a lot of duplication. For instance, we have an inline stream map on every table to add a source database identifier. If we try to define that with a * to apply to all tables, it errors because it doesn't respect the contents of select. This means for each extractor we have the same table identified in the stream_maps, select, and metadata blocks. So we're constantly jumping around the YAML to make updates, and it's very easy for devs to miss one of the three places that need to be updated.

Siba Prasad Nayak
07/15/2025, 4:18 PM
- name: salesforce-discover
  namespace: salesforce_discover
  commands:
    hello:
      executable: echo
      args: ["Hello World"]
    verify:
      executable: python
      args: ["bin/salesforce_discover.py", "$CONFIG_PATH", "verify"]
The requirement: I want to override the discovery output for Salesforce instead of using the existing "--discover" functionality. But when I execute these commands, I get this error:
(sibaVenv) PS C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304> meltano invoke salesforce-discover:hello
2025-07-15T16:09:16.508600Z [warning ] Failed to create symlink to 'meltano.exe': administrator privilege required
2025-07-15T16:09:16.536909Z [info ] Environment 'dev' is active
Need help fixing this problem? Visit <http://melta.no/> for troubleshooting steps, or to
join our friendly Slack community.
'CommentedSeq' object has no attribute 'read'
(sibaVenv) PS C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304> meltano invoke salesforce-discover:verify
2025-07-15T16:17:59.700345Z [warning ] Failed to create symlink to 'meltano.exe': administrator privilege required
2025-07-15T16:17:59.743028Z [info ] Environment 'dev' is active
Need help fixing this problem? Visit <http://melta.no/> for troubleshooting steps, or to
join our friendly Slack community.
'CommentedSeq' object has no attribute 'read'
(sibaVenv) PS C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304>
From a quick search on the internet, it seems there is some issue with parsing the YAML file, but I'm not sure where exactly.
If anyone has any idea on this, can you please help?

Siba Prasad Nayak
07/17/2025, 8:11 PM
singer_sdk.exceptions.RecordsWithoutSchemaException: A record for stream 'accounts' was encountered before a corresponding schema. Check that the Tap correctly implements the Singer spec.
Full Error Log:
$ dos2unix out_cleaned.jsonl
dos2unix: converting file out_cleaned.jsonl to Unix format...
(sibaVenv)
SiNayak@INBAWN172239 MINGW64 /c/Siba_/Work/LocalSetup/Backend/backend_05062025/backend_2304
$ cat out_cleaned.jsonl | meltano invoke target-jira
2025-07-17T20:07:23.746313Z [warning ] Failed to create symlink to 'meltano.exe': administrator privilege required
2025-07-17T20:07:23.761502Z [info ] Environment 'dev' is active
INFO:target-jira:target-jira v0.0.1, Meltano SDK v0.42.1
INFO:target-jira:Skipping parse of env var settings...
2025-07-18 01:37:28,712 | INFO | target-jira | Target 'target-jira' is listening for input from tap.
2025-07-18 01:37:28,713 | INFO | target-jira | Adding sink for stream: accounts with schema keys: ['Id', 'Name']
2025-07-18 01:37:28,713 | INFO | target-jira.accounts | Initializing target sink for stream 'accounts'...
2025-07-18 01:37:28,716 | INFO | target-jira | Sink added: <target_jira.sinks.JiraSink object at 0x000001B75A72D070>
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Scripts\target-jira.exe\__main__.py", line 7, in <module>
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\click\core.py", line 1442, in __call__
return self.main(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\click\core.py", line 1363, in main
rv = self.invoke(ctx)
^^^^^^^^^^^^^^^^
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\singer_sdk\plugin_base.py", line 84, in invoke
return super().invoke(ctx)
^^^^^^^^^^^^^^^^^^^
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\click\core.py", line 1226, in invoke
return ctx.invoke(self.callback, **ctx.params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\click\core.py", line 794, in invoke
return callback(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\singer_sdk\target_base.py", line 572, in invoke
target.listen(file_input)
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\singer_sdk\_singerlib\encoding\_base.py", line 48, in listen
self._process_lines(file_input or self.default_input)
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\singer_sdk\target_base.py", line 304, in _process_lines
counter = super()._process_lines(file_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\singer_sdk\_singerlib\encoding\_base.py", line 70, in _process_lines
self._process_record_message(line_dict)
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\singer_sdk\target_base.py", line 344, in _process_record_message
self._assert_sink_exists(stream_map.stream_alias)
File "C:\Siba_\Work\LocalSetup\Backend\backend_05062025\backend_2304\.meltano\loaders\target-jira\venv\Lib\site-packages\singer_sdk\target_base.py", line 280, in _assert_sink_exists
raise RecordsWithoutSchemaException(msg)
singer_sdk.exceptions.RecordsWithoutSchemaException: A record for stream 'accounts' was encountered before a corresponding schema. Check that the Tap correctly implements the Singer spec.
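The exception above enforces a basic Singer-spec ordering rule: a SCHEMA message for a stream must appear before any RECORD for that stream. A small stdlib sketch of that check, useful for pre-validating a captured .jsonl file such as out_cleaned.jsonl before piping it to the target (the function name is mine, not part of the SDK):

```python
import json
from typing import Iterable, List


def find_orphan_records(lines: Iterable[str]) -> List[str]:
    """Return stream names that emit a RECORD before any SCHEMA was seen for them."""
    seen_schemas = set()
    orphans = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        msg = json.loads(line)
        if msg.get("type") == "SCHEMA":
            seen_schemas.add(msg["stream"])
        elif msg.get("type") == "RECORD" and msg["stream"] not in seen_schemas:
            orphans.append(msg["stream"])
    return orphans
```

Running this over the file should show whether the 'accounts' SCHEMA line was dropped or reordered (for example by the cleaning / dos2unix step) before the first RECORD reached the target.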
I have attached a few files for reference.

David Dobrinskiy
07/18/2025, 12:25 PM
Does anyone have experience with target-clickhouse?
I'm having some difficulties in setting mine up, e.g. I want new tables to be deduplicated on id with sorting by a non-nullable updated_at.
But the target-clickhouse module always casts dates as Nullable: https://github.com/shaped-ai/target-clickhouse/blob/a04758cff46b429bab6615a15d662e25c5a96db9/target_clickhouse/connectors.py#L120
Since this is my first Meltano ETL pipeline, I'm getting confused between the layers of abstraction here and how to properly configure my target ClickHouse tables.

Tanner Wilcox
07/22/2025, 4:52 PM

Ellis Valentiner
07/25/2025, 1:19 PM

Steven Searcy
08/04/2025, 5:01 PM

Adam Wegscheid
08/05/2025, 7:41 PM

Rob Norman
08/07/2025, 7:02 AM
SELECT <list of columns> FROM <table> ORDER BY `created_at` ASC;
My tap configuration is pretty basic; there's nothing weird going on:
plugins:
  extractors:
    - name: tap-mysql
      variant: transferwise
      pip_url: git+https://github.com/transferwise/pipelinewise.git#subdirectory=singer-connectors/tap-mysql
      select:
        - ${TAP_MYSQL_DATABASE}-<table>.*
      settings:
        - name: engine
          value: mysql
      config:
        session_sqls:
          - SET @@session.wait_timeout=28800
          - SET @@session.net_read_timeout=3600
          - SET @@session.innodb_lock_wait_timeout=3600
      metadata:
        '*-<table>':
          replication-method: INCREMENTAL
          replication-key: created_at
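For what it's worth, INCREMENTAL replication in Singer taps is essentially a bookmark filter on the replication key: the first run has no saved bookmark, so the tap selects the whole table ordered by that key, and later runs resume from the saved maximum. A stdlib sketch of that logic (an illustration of the mechanism, not the tap's actual code; whether the comparison is > or >= varies by tap, this uses >=):

```python
from typing import List, Optional, Tuple

Row = Tuple[int, str]  # (id, created_at as ISO string)


def incremental_select(rows: List[Row], bookmark: Optional[str]) -> Tuple[List[Row], Optional[str]]:
    """Return rows at/after the bookmark plus the new bookmark value.

    With bookmark=None (the first run) everything is selected, which is why
    the initial sync necessarily reads the entire table.
    """
    selected = sorted(
        (r for r in rows if bookmark is None or r[1] >= bookmark),
        key=lambda r: r[1],
    )
    new_bookmark = selected[-1][1] if selected else bookmark
    return selected, new_bookmark
```

So a full read on the first run is expected behavior, not a misconfiguration; the state saved at the end of that run is what makes subsequent runs cheap.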
Am I just fundamentally missing something? Trying to read the entire table in one go seems insane to me. Do I need to set the replication-method to FULL_TABLE for the first load and manually fiddle the state, or something?

Tanner Wilcox
08/11/2025, 10:04 PM

Tanner Wilcox
08/13/2025, 6:55 PM
Does uv handle this?

Quoc Nguyen
08/19/2025, 5:37 AM
Did something change in logging between 3.6 and 3.7? Back then, with 3.6, when running something like meltano run tap-postgres target-redshift, the console would show both the internal logs from Meltano and the external logs from the Meltano plugins (`tap-postgres`/`target-redshift`). But after upgrading to 3.7, it only shows the internal logs from Meltano.
In our use case, the logs from plugins are really important because they're inputs for our monitoring systems. Is there any way to make it work like version 3.6 did? Not sure if this is a dumb question; if so, is there any doc I can read more about this, since I'm new to Meltano? 😄 Thank you all in advance.

Tanner Wilcox
09/15/2025, 9:17 PM

Tanner Wilcox
09/25/2025, 5:00 PM
api_url) but I wanted to check here. Otherwise I'll have to have two almost identical .meltano.yml files.

Tanner Wilcox
10/02/2025, 2:59 PM

Ellis Valentiner
10/24/2025, 2:12 PM
There's start_date, but I believe end_date is not a thing? Does anyone have a recommendation for conditionally replicating data?

Eric Mills
10/24/2025, 3:31 PM
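On the end_date question above: conceptually this is just an upper-bound filter on the replication key, which can be applied downstream (for example in a stream map or a small post-extract step) when the tap itself has no end_date setting. A minimal stdlib sketch (the field name and function are assumptions for illustration):

```python
from datetime import date
from typing import Dict, Iterable, Iterator


def filter_before(records: Iterable[Dict], key: str, end_date: date) -> Iterator[Dict]:
    """Yield only records whose `key` falls strictly before end_date.

    Assumes the replication key is an ISO date string (e.g. '2025-01-01').
    """
    for rec in records:
        if date.fromisoformat(rec[key]) < end_date:
            yield rec
```

The trade-off of filtering downstream is that the tap still extracts everything up to the present; only the loaded data is bounded.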