# troubleshooting

Aditya Rane

03/15/2022, 1:17 AM
Is this your first time deploying Airbyte: Yes
OS Version / Instance: Amazon Linux
Memory / Disk: 8 GB
Deployment: Docker
Airbyte Version: 0.35.46-alpha
Source name/version: MSSQL 0.3.17
Destination name/version: Snowflake 0.4.20
Description:
2022-03-15 011306 ERROR i.a.w.DefaultReplicationWorker(run):168 - Sync worker failed. java.util.concurrent.ExecutionException: io.airbyte.workers.DefaultReplicationWorker$DestinationException: Destination process exited with non-zero exit code 1
• Can someone please help me with the internal staging connection to Snowflake? Is there a script I am supposed to run that I am missing? I have also granted ownership and usage on all future stages to the role AIRBYTE_ROLE, but it still fails.
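[Editor's note: for readers hitting the same error, Snowflake's internal staging generally requires the connector's role to hold usage on the target database plus create privileges on the schema, not only ownership of stages. A minimal sketch of the kind of grants involved; AIRBYTE_ROLE is from the thread, while AIRBYTE_DATABASE and AIRBYTE_SCHEMA are placeholder names:]

```sql
-- Placeholder object names: substitute your actual database/schema.
grant usage on database AIRBYTE_DATABASE to role AIRBYTE_ROLE;
grant usage, create table, create stage
  on schema AIRBYTE_DATABASE.AIRBYTE_SCHEMA to role AIRBYTE_ROLE;
```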

Harshith (Airbyte)

03/15/2022, 8:45 AM

Aditya Rane

03/15/2022, 9:18 AM
@Harshith (Airbyte) I am still stuck on this issue. As per the documentation, I have already granted the Airbyte role access to the table and schema, but I still get the same error and no data is loaded into my Snowflake tables. It creates the stage and the tmp and raw tables along with the permanent table, but does not emit any data into them.

Marcos Marx (Airbyte)

03/15/2022, 8:20 PM
@Aditya Rane are you using the GCS or S3 staging method, or internal Snowflake staging? If it's one of the former, please follow the docs on how to set it up: https://docs.snowflake.com/en/sql-reference/sql/create-stage.html

Aditya Rane

03/15/2022, 8:52 PM
@Marcos Marx (Airbyte) I am using the internal staging method, since Airbyte creates its own stage in Snowflake, and I only granted the usage privilege on the parent database and schema to the role I created, but it kept failing. I also tried the S3 bucket for staging, where it creates 4 small files in S3 but only takes the latest file and uploads it to Snowflake. Should I use an external stage in Snowflake, something like this:
create stage my_ext_stage1
  url='s3://load/files/'
  credentials=(aws_key_id='1a2b3c' aws_secret_key='4x5y6z');
Will this work?
@Harshith (Airbyte) @Marcos Marx (Airbyte) We gave Airbyte AccountAdmin access and internal staging still does not work, and when we try S3 staging it still does not copy all the data. Is there a quick fix? Can you please help us with this issue?
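[Editor's note: as an aside on the external-stage snippet above, Snowflake also supports referencing a storage integration instead of embedding AWS keys directly in the stage, which avoids putting credentials in SQL. A hedged sketch; the integration name and IAM role ARN are illustrative, not from this thread:]

```sql
-- Illustrative names; see Snowflake's CREATE STORAGE INTEGRATION docs.
create storage integration my_s3_int
  type = external_stage
  storage_provider = 'S3'
  enabled = true
  storage_aws_role_arn = 'arn:aws:iam::123456789012:role/my-snowflake-role'
  storage_allowed_locations = ('s3://load/files/');

create stage my_ext_stage2
  url = 's3://load/files/'
  storage_integration = my_s3_int;
```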

Marcos Marx (Airbyte)

03/17/2022, 1:33 AM
I also used the S3 bucket for staging where it creates 4 small files in S3 and only takes the latest file and upload it to snowflake. Should I use external staging in snowflake something like this:
So it is "working" in the sense that data is landing in Snowflake, and the problem now is that it only loads data from one of the S3 files? That sounds similar to: https://github.com/airbytehq/airbyte/issues/11052

Aditya Rane

03/17/2022, 3:05 AM
@Marcos Marx (Airbyte) correct

Marcos Marx (Airbyte)

03/18/2022, 1:57 AM
The team is working on a solution.