jeremiah ishaya
12/29/2022, 8:17 PM

Uzair Ahmad
12/30/2022, 7:53 AM

Sébastien Haentjens
12/30/2022, 12:30 PM
...
2022-12-30 11:44:28 INFO i.a.b.BootloaderApp(runFlywayMigration):398 - Migrating jobs database
2022-12-30 11:44:28 INFO o.f.c.i.l.s.Slf4jLog(info):49 - Flyway Community Edition 7.14.0 by Redgate
2022-12-30 11:44:28 INFO o.f.c.i.l.s.Slf4jLog(info):49 - Successfully validated 11 migrations (execution time 00:00.007s)
2022-12-30 11:44:28 INFO o.f.c.i.l.s.Slf4jLog(info):49 - Current version of schema "public": 0.40.18.002
2022-12-30 11:44:28 INFO o.f.c.i.l.s.Slf4jLog(info):49 - Schema "public" is up to date. No migration necessary.
2022-12-30 11:44:28 INFO i.a.b.BootloaderApp(load):216 - Ran Flyway migrations.
2022-12-30 11:44:28 INFO i.a.b.BootloaderApp(createWorkspaceIfNoneExists):317 - workspace already exists for the deployment.
2022-12-30 11:44:28 INFO i.a.b.BootloaderApp(load):219 - Default workspace created.
2022-12-30 11:44:28 INFO i.a.b.BootloaderApp(createDeploymentIfNoneExists):307 - running deployment: 5c1e816f-dada-4e28-bcf2-fd430ae877c1
2022-12-30 11:44:28 INFO i.a.b.BootloaderApp(load):222 - Default deployment created.
2022-12-30 11:44:28 INFO i.a.b.BootloaderApp(load):225 - Set version to AirbyteVersion{version='0.40.26', major='0', minor='40', patch='26'}
2022-12-30 11:44:28 INFO i.a.c.p.ActorDefinitionMigrator(updateConfigsFromSeed):72 - Updating connector definitions from the seed if necessary...
2022-12-30 11:44:28 INFO i.a.c.p.ActorDefinitionMigrator(updateConfigsFromSeed):75 - Connectors in use: [airbyte/destination-snowflake, 464622532012.dkr.ecr.us-east-1.amazonaws.com/dd-airbyte]
Exception in thread "main" java.lang.IllegalArgumentException: Invalid version string: source-s3-0.1.18
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:145)
at io.airbyte.commons.version.Version.<init>(Version.java:37)
...
I’m trying to deploy on k8s.
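For context on the exception above: the string being rejected, source-s3-0.1.18, does not look like a plain major.minor.patch version (compare the AirbyteVersion line earlier in the log), which appears to be what the bootloader expects. A minimal Python sketch of that kind of check, illustrative only and not Airbyte's actual Java Version class:

import re

# Illustrative only -- not Airbyte's real Version implementation.
# A version string is assumed to look like "major.minor.patch", e.g. "0.1.18".
VERSION_PATTERN = re.compile(r"^\d+\.\d+\.\d+$")

def parse_version(raw: str) -> tuple:
    if not VERSION_PATTERN.match(raw):
        raise ValueError(f"Invalid version string: {raw}")
    major, minor, patch = raw.split(".")
    return int(major), int(minor), int(patch)

print(parse_version("0.1.18"))  # fine: (0, 1, 18)
try:
    parse_version("source-s3-0.1.18")  # looks like an image tag, not a bare version
except ValueError as exc:
    print(exc)  # Invalid version string: source-s3-0.1.18

The rejected string looks like a custom connector image tag rather than a bare version number, which may be what the bootloader is tripping over here.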
Ilkka Peltola
12/30/2022, 1:06 PM
UserLocationPerformanceReport is not included in the Bing Ads connector?
I can get that through Fivetran (it's pretty large), but also Meltano doesn't support it.

ewan lottering
01/02/2023, 9:16 AM

RK
01/02/2023, 10:08 AM

RK
01/02/2023, 10:08 AM

RK
01/02/2023, 10:09 AM

RK
01/02/2023, 10:09 AM

Naren Kadiri
01/02/2023, 11:08 AM

Krzysztof
01/02/2023, 11:47 AM

Krzysztof
01/02/2023, 11:48 AM

Krzysztof
01/02/2023, 11:48 AM

Krzysztof
01/02/2023, 11:48 AM

Temidayo Azeez
01/02/2023, 1:26 PM
State code: 08S01; Message: Communications link failure. The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
I am using Ubuntu 20.04 and trying to create a MySQL source connector, and I keep getting the error above. I don't know how to resolve it. Any help would be much appreciated.
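A "Communications link failure" from the MySQL JDBC driver generally means the connector never reached the server at all. A quick reachability check from the machine running Airbyte can narrow it down; the host and port below are placeholders for whatever is configured in the source:

import socket

# Placeholders: use the host/port configured in the MySQL source connector.
MYSQL_HOST = "192.168.1.10"
MYSQL_PORT = 3306

try:
    with socket.create_connection((MYSQL_HOST, MYSQL_PORT), timeout=5):
        print(f"TCP connection to {MYSQL_HOST}:{MYSQL_PORT} succeeded")
except OSError as exc:
    # Failure here points at networking (bind-address, firewall, wrong host)
    # rather than the Airbyte connector configuration itself.
    print(f"Could not reach {MYSQL_HOST}:{MYSQL_PORT}: {exc}")

One common cause when Airbyte runs in Docker is pointing the source at localhost, which resolves to the connector container rather than the machine where MySQL is actually listening.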
Svatopluk Chalupa
01/02/2023, 1:27 PM

Svatopluk Chalupa
01/02/2023, 1:36 PM

Md. Mizanur Rahman Iftee
12/28/2022, 7:05 AM

Abdi Darmawan
12/28/2022, 7:26 AM

Marcos Marx (Airbyte)
01/02/2023, 3:19 PM

Sean Zicari
01/02/2023, 8:34 PM

Vats Vana
01/03/2023, 2:34 AM

Vats Vana
01/03/2023, 2:38 AM

Arjunsingh Yadav
01/03/2023, 8:52 AM
[2023-01-03, 07:08:39 UTC] {base.py:73} INFO - Using connection ID 'airbyte_conn' for task execution.
[2023-01-03, 07:08:39 UTC] {http.py:150} INFO - Sending 'POST' to url: http://localhost:8001/api/v1/connections/sync
[2023-01-03, 07:08:39 UTC] {local_task_job.py:159} INFO - Task exited with return code Negsignal.SIGSEGV
The Airbyte connection is S3 (cloud) -> Postgres (local) and works fine when synced manually. I’m not able to sync it via Airflow.
Moreover, the API call to the 8001 (webapp) server of Airbyte requires basic auth, but I didn’t find any config to put those creds in Airflow.
The DAG is as follows -
from airflow import DAG
from airflow.utils.dates import days_ago
from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

with DAG(dag_id='airbyte_job',
         default_args={'owner': 'admin'},
         schedule_interval='@daily',
         start_date=days_ago(1)
         ) as dag:
    airbyte_trigger = AirbyteTriggerSyncOperator(
        task_id='airbyte_airflow',
        airbyte_conn_id='airbyte_conn',
        connection_id='****',
        asynchronous=False,
        timeout=3600,
        wait_seconds=3)
Another possible way to sync it is making an API call from Airflow -> Airbyte without using Airflow's Airbyte provider (https://pypi.org/project/apache-airflow-providers-airbyte/).
Can somebody give a curl for running the sync on Airbyte?
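For reference, the same call as a Python sketch rather than a curl: the URL matches the Airflow log above, while the connection ID and basic auth credentials are placeholders for your own deployment.

import requests

# Placeholders: use your own connection ID and the basic auth credentials
# configured for the Airbyte proxy.
AIRBYTE_SYNC_URL = "http://localhost:8001/api/v1/connections/sync"
CONNECTION_ID = "00000000-0000-0000-0000-000000000000"

response = requests.post(
    AIRBYTE_SYNC_URL,
    json={"connectionId": CONNECTION_ID},
    auth=("airbyte-user", "airbyte-password"),  # basic auth, if enabled on the proxy
    timeout=30,
)
response.raise_for_status()
print(response.json())  # details of the job that was triggered

For the provider route, the Airbyte hook is HTTP-based, so setting a login and password on the airbyte_conn Airflow connection should make it send those credentials as basic auth.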
Mario Beteta
01/03/2023, 9:29 AM
The provided configuration does not fulfill the specification. Errors: invalid cron expression
What could be wrong?

Mario Beteta
01/03/2023, 10:48 AM

jan-hendrik Hoon
01/03/2023, 11:46 AM
The fullname var is being used, which includes a trunc 63 to prevent creating anything with names longer than allowed by k8s, though this is not being used for service names, in this case {{ .Release.Name }}-airbyte-connector-builder-server-svc (which is already really long without the release name 😅). I only really see it being used in the env-configmap and would like to know if and how we could possibly work on a solution. More context to follow in thread:
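For a sense of the scale of the problem above: Kubernetes names must fit in a 63-character DNS label, and the connector-builder-server service suffix alone eats most of that budget. A quick Python illustration; the release name below is hypothetical:

# Quick illustration of the name-length issue described above.
# Kubernetes names must fit in a 63-character DNS label.
K8S_NAME_LIMIT = 63
SUFFIX = "-airbyte-connector-builder-server-svc"  # service suffix from the message above

release_name = "my-team-production-airbyte-eu"  # hypothetical release name
service_name = release_name + SUFFIX

print(len(SUFFIX), "characters used by the suffix alone")
print(len(service_name), service_name)
print("too long for k8s" if len(service_name) > K8S_NAME_LIMIT else "ok")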
Stewart Fohlo
01/03/2023, 12:19 PM

Renato Todorov
01/03/2023, 12:29 PM
check_connection endpoint times out after 3 seconds. I've logged the issue here: https://github.com/airbytehq/airbyte/issues/20963. Any help is appreciated.

Santiago Stachuk
01/03/2023, 1:18 PM
raise ValueError(f"Unexpected type for data_or_message: {type(data_or_message)}: {data_or_message}")
ValueError: Unexpected type for data_or_message: <class 'collections.ChainMap'>: ChainMap({...},{...},{...})
Inside this collections.ChainMap is all the data I need to be emitted, with correct values.
I just modified a line in the ga4 connector: in the stream_slices method I added start_date = start_date - datetime.timedelta(days=2) to make sure the data is golden, as per this article
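For reference, a small sketch of the two pieces mentioned above. The code is illustrative only, not the actual ga4 connector, and flattening the ChainMap into a plain dict before emitting it is a guess at what the type check expects rather than a confirmed fix:

import datetime
from collections import ChainMap

# Illustrative tweak, mirroring the change described above: pull the slice
# start date back two days so still-settling GA4 data gets re-read.
start_date = datetime.date(2023, 1, 3)
start_date = start_date - datetime.timedelta(days=2)
print(start_date)  # 2023-01-01

# A ChainMap holds the right values but is not a plain dict; dict(...)
# flattens it, which avoids handing a ChainMap to code that only accepts dicts.
record = ChainMap({"date": str(start_date)}, {"sessions": 42})
print(dict(record))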