Octavia Squidington III
07/10/2023, 7:45 PM
Mark McKelvey
07/10/2023, 10:12 PM
P W
07/10/2023, 11:42 PM
Chidambara Ganapathy
07/11/2023, 12:43 AM
Nazif Ishrak
07/11/2023, 5:44 AM
Ekansh Verma
07/11/2023, 7:34 AM
Nisha Biradar
07/11/2023, 8:05 AM
Ekansh Verma
07/11/2023, 9:27 AM
Rutger Weemhoff
07/11/2023, 9:40 AM
Webhook test failed. Please verify that the webhook URL is valid.
On my previous version (0.43.2) I had exactly the same issue, so it does not seem to be related to my recent version upgrade.
I am running Airbyte on Docker.
The webhook URLs work correctly if I test them with curl from the virtual machine where Docker runs, or even from inside the Docker container.
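For reference, the curl check described above can be reproduced with plain Java inside the container. This is only a sketch with a hypothetical webhook URL; a 2xx response here while Airbyte's built-in test still fails would suggest the problem lies in how Airbyte itself resolves or proxies the URL, not in raw network access.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class WebhookProbe {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL: substitute the webhook configured in Airbyte.
        String webhookUrl = "https://hooks.example.com/services/T000/B000/XXX";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(webhookUrl))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"text\":\"probe\"}"))
                .build();

        // Send the same POST the curl test performs and print the result.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}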
Moti Zamir
07/11/2023, 11:21 AM
Dipankar Kumar
07/11/2023, 11:35 AM
Nadav Amami
07/11/2023, 11:48 AM
Stan Gorch
07/11/2023, 12:14 PM
lisandro maselli
07/11/2023, 12:57 PM
Octavia Squidington III
07/11/2023, 1:45 PM
Richard Anthony Hein (Auxon)
07/11/2023, 2:41 PM
Šimon Appelt
07/11/2023, 3:17 PM
I only get the _airbyte_raw tables and nothing else; is this expected behaviour for the new version?
Joey Hernandez
07/11/2023, 3:23 PM
Stan Gorch
07/11/2023, 3:31 PM
Joey Hernandez
07/11/2023, 3:34 PM
Slackbot
07/11/2023, 5:31 PM
Brian Mullins
07/11/2023, 7:38 PM
Ekansh Verma
07/11/2023, 8:27 PM
Tony Cookey
07/11/2023, 9:02 PM
Is there a way to add an id (either a connectionId or a randomId) when streaming data from multiple sources to a single destination (Github <<>> Postgres), so as to identify what data came from what connection?
I want to stream multiple Github sources to one Postgres, tag each row, and know which connection (stream) owns that row of data.
What are the possible ways of doing this? Open to solutions.
Thanks
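One pattern that would give the per-row tag asked for here (a sketch, not a built-in Airbyte feature; the schema and table names are hypothetical): point each GitHub connection at its own destination schema via the connection's namespace setting, then union the per-connection tables in a view that stamps a literal connection_id on every row.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class TagByConnection {
    public static void main(String[] args) throws Exception {
        // Hypothetical Postgres coordinates; adjust to the real destination.
        try (Connection db = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/warehouse", "user", "pass");
             Statement st = db.createStatement()) {
            // Each Airbyte connection writes into its own schema (github_repo_a,
            // github_repo_b); the view tags every row with its origin.
            st.execute("CREATE OR REPLACE VIEW public.issues_all AS "
                    + "SELECT 'github_repo_a' AS connection_id, i.* FROM github_repo_a.issues i "
                    + "UNION ALL "
                    + "SELECT 'github_repo_b' AS connection_id, i.* FROM github_repo_b.issues i");
        }
    }
}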
mangole
07/11/2023, 9:20 PM
Rhys Davies
07/11/2023, 9:40 PM
Full Refresh | Overwrite mode DROPs the tables and writes them fresh, which also drops any views I have that depend on those tables, and I am at the stage of this project where I need to start transforming the warehoused data.
I can't currently do this, though, because in testing the connection fails every time with CDC enabled, with this error: Caused by: java.time.format.DateTimeParseException: Text '-2208988800000' could not be parsed at index 11
To be clear, these are the same tables and the same data as the other source I have set up for this same database. It seemingly fails when the date is set to 1900-01-01 00:00:00.00, but I also note that Airbyte sees all date/time fields in the SQL Server as String.
Is there any way I can fix this? It's a real blocker. I am happy to roll up my sleeves, write some Python, and deploy my own syncing service, but it seems like such a huge fall at the last hurdle for Airbyte and this project I'm working on…
Thanks in advance!
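Worth noting: the value the parser chokes on is epoch milliseconds, and it decodes to exactly the 1900-01-01 timestamp mentioned above, which supports that reading. A two-line check:

import java.time.Instant;

public class EpochCheck {
    public static void main(String[] args) {
        // -2208988800000 ms relative to the Unix epoch is 1900-01-01T00:00:00Z,
        // matching the 1900-01-01 00:00:00.00 rows the CDC sync trips over.
        System.out.println(Instant.ofEpochMilli(-2208988800000L));
    }
}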
Sai Charan
07/12/2023, 6:00 AM
Slackbot
07/12/2023, 6:24 AM
Ekansh Verma
07/12/2023, 6:46 AM
logs:
  accessKey:
    password: "ACCESS_KEY"
    existingSecret: ""
    existingSecretKey: ""
  secretKey:
    password: "SECRET_KEY"
    existingSecret: ""
    existingSecretKey: ""
storage:
  type: "S3"
minio:
  enabled: false
externalMinio:
  enabled: false
  host: localhost
  port: 9000
s3:
  enabled: true
  bucket: "bucket_name"
  bucketRegion: "us-east-1"
The bucket exists, and the credentials are valid and have the correct access to update files in the bucket as well.
Now, after the installation, the worker and server fail in
static void validateBase(final S3ApiWorkerStorageConfig s3BaseConfig) {
  Preconditions.checkArgument(!s3BaseConfig.getBucketName().isBlank());
}
This validation check failing essentially means my bucket name is blank.
Any reason why I am encountering this, as the bucket name is clearly set?
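The failing call is Guava's Preconditions.checkArgument, which throws IllegalArgumentException when its condition is false. A minimal sketch of the failure mode, assuming (not confirmed in the thread) that the worker reads the bucket name from an environment variable such as S3_LOG_BUCKET that the Helm value never reached:

import com.google.common.base.Preconditions;

public class BucketNameCheck {
    public static void main(String[] args) {
        // S3_LOG_BUCKET is an assumption about the variable the chart should set;
        // if the value from values.yaml never reaches the pod, this is empty.
        String bucketName = System.getenv().getOrDefault("S3_LOG_BUCKET", "");

        // Throws IllegalArgumentException on a blank name, which would match
        // the startup crash the worker and server pods report.
        Preconditions.checkArgument(!bucketName.isBlank(), "bucket name is blank");
        System.out.println("bucket: " + bucketName);
    }
}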
Caio César P. Ricciuti
07/12/2023, 7:09 AM
I used to get a table airbyte_raw_mydata and a table named mydata; now I just get the airbyte_raw_ tables... anyone had this issue?
mysql source v2.1.0
bigquery destination v1.5.1