Thomas Pedot
08/15/2022, 7:27 PM

Slackbot
08/16/2022, 4:47 AM

suman
08/16/2022, 8:26 AM

Vitalie CALMÎC
08/16/2022, 9:56 AM

Mark McKelvey
08/16/2022, 3:37 PM

Frederik Hagelund
08/16/2022, 7:15 PM

look R
08/16/2022, 8:01 PM

Arun
08/17/2022, 3:32 AM

Kevin Y
08/17/2022, 3:47 AM
The sync works with Full refresh | Overwrite. However, the moment I try Incremental | Deduped + history, I get an error. What am I missing?

Shubham Mahajan
08/17/2022, 5:26 AM

Gaurav Borse
08/17/2022, 6:24 AM

Aditi Bhalawat
08/17/2022, 6:35 AM
2022-08-15 10:33:37 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):325 - Records read: 225000 (203 MB)
2022-08-15 10:42:39 destination > 2022-08-15 10:42:39 INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(lambda$flushAll$1):86 - Flushing <stream_name>: 7435 records (24 MB)
2022-08-15 10:52:16 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$3):191 - Running sync worker cancellation...
2022-08-15 10:52:16 INFO i.a.w.g.DefaultReplicationWorker(cancel):444 - Cancelling replication worker...
2022-08-15 10:52:26 INFO i.a.w.g.DefaultReplicationWorker(cancel):452 - Cancelling destination...
2022-08-15 10:52:26 INFO i.a.w.i.DefaultAirbyteDestination(cancel):125 - Attempting to cancel destination process...
2022-08-15 10:52:26 INFO i.a.w.i.DefaultAirbyteDestination(cancel):130 - Destination process exists, cancelling...
2022-08-15 10:52:26 INFO i.a.w.g.DefaultReplicationWorker(run):175 - One of source or destination thread complete. Waiting on the other.
2022-08-15 10:52:26 WARN i.a.c.i.LineGobbler(voidCall):86 - airbyte-destination gobbler IOException: Stream closed. Typically happens when cancelling a job.
2022-08-15 10:52:26 INFO i.a.w.i.DefaultAirbyteDestination(cancel):132 - Cancelled destination process!
2022-08-15 10:52:26 INFO i.a.w.g.DefaultReplicationWorker(cancel):459 - Cancelling source...
2022-08-15 10:52:26 INFO i.a.w.i.DefaultAirbyteSource(cancel):142 - Attempting to cancel source process...
2022-08-15 10:52:26 INFO i.a.w.i.DefaultAirbyteSource(cancel):147 - Source process exists, cancelling...
2022-08-15 10:52:26 INFO i.a.w.g.DefaultReplicationWorker(run):177 - Source and destination threads complete.
2022-08-15 10:52:26 INFO i.a.w.i.DefaultAirbyteSource(cancel):149 - Cancelled source process!
2022-08-15 10:52:26 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$3):195 - Interrupting worker thread...
2022-08-15 10:52:26 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$3):198 - Cancelling completable future...
2022-08-15 10:52:26 WARN i.a.w.t.CancellationHandler$TemporalCancellationHandler(checkAndHandleCancellation):53 - Job either timed out or was cancelled.
2022-08-15 10:52:26 WARN i.a.w.t.CancellationHandler$TemporalCancellationHandler(checkAndHandleCancellation):53 - Job either timed out or was cancelled.
What could be the issue here?
I am assuming it is because my data is more than 200 MB. If so, how can I resolve it?

Gaurav Borse
08/17/2022, 6:36 AM

Abdullah Alsaleh
08/17/2022, 10:26 AM

Web Cloud
08/17/2022, 7:05 PM
FailureReason@4d70ea94[failureOrigin=<null>,failureType=system_error,internalMessage=java.lang.RuntimeException: org.postgresql.util.PSQLException: ERROR: unimplemented: multiple active portals not supported
airbyte-server | Detail: cannot perform operation sql.PrepareStmt while a different portal is open
airbyte-server | Hint: You have attempted to use a feature that is not yet implemented.
airbyte-server | See: <https://go.crdb.dev/issue-v/40195/v22.1>,externalMessage=Something went wrong in the connector. See the logs for more details.,metadata=io.airbyte.config.Metadata@7d4ee3a3[additionalProperties={attemptNumber=null, jobId=null, from_trace_message=true}],stacktrace=java.lang.RuntimeException: org.postgresql.util.PSQLException: ERROR: unimplemented: multiple active portals not supported
airbyte-server | Detail: cannot perform operation sql.PrepareStmt while a different portal is open
airbyte-server | Hint: You have attempted to use a feature that is not yet implemented.
airbyte-server | See: <https://go.crdb.dev/issue-v/40195/v22.1>
airbyte-server | at io.airbyte.db.jdbc.StreamingJdbcDatabase$1.tryAdvance(StreamingJdbcDatabase.java:100)
airbyte-server | at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332)
airbyte-server | at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
airbyte-server | at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
airbyte-server | at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:575)
airbyte-server | at java.base/java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
airbyte-server | at java.base/java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:616)
airbyte-server | at java.base/java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:622)
airbyte-server | at java.base/java.util.stream.ReferencePipeline.toList(ReferencePipeline.java:627)
airbyte-server | at io.airbyte.db.jdbc.JdbcDatabase.queryJsons(JdbcDatabase.java:170)
airbyte-server | at io.airbyte.integrations.source.postgres.PostgresSource.getPrivilegesTableForCurrentUser(PostgresSource.java:383)
airbyte-server | at io.airbyte.integrations.source.jdbc.AbstractJdbcSource.discoverInternal(AbstractJdbcSource.java:124)
airbyte-server | at io.airbyte.integrations.source.postgres.PostgresSource.discoverRawTables(PostgresSource.java:201)
airbyte-server | at io.airbyte.integrations.source.postgres.PostgresSource.discoverInternal(PostgresSource.java:183)
airbyte-server | at io.airbyte.integrations.source.postgres.PostgresSource.discoverInternal(PostgresSource.java:73)
airbyte-server | at io.airbyte.integrations.source.relationaldb.AbstractDbSource.discoverWithoutSystemTables(AbstractDbSource.java:147)
airbyte-server | at io.airbyte.integrations.source.relationaldb.AbstractDbSource.getTables(AbstractDbSource.java:337)
airbyte-server | at io.airbyte.integrations.source.relationaldb.AbstractDbSource.discover(AbstractDbSource.java:97)
airbyte-server | at io.airbyte.integrations.source.postgres.PostgresSource.discover(PostgresSource.java:166)
airbyte-server | at io.airbyte.integrations.base.ssh.SshTunnel.sshWrap(SshTunnel.java:205)
airbyte-server | at io.airbyte.integrations.base.ssh.SshWrappedSource.discover(SshWrappedSource.java:45)
airbyte-server | at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:127)
airbyte-server | at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:97)
airbyte-server | at io.airbyte.integrations.base.adaptive.AdaptiveSourceRunner$Runner.run(AdaptiveSourceRunner.java:86)
airbyte-server | at io.airbyte.integrations.source.postgres.PostgresSourceRunner.main(PostgresSourceRunner.java:15)
airbyte-server | Caused by: org.postgresql.util.PSQLException: ERROR: unimplemented: multiple active portals not supported
airbyte-server | Detail: cannot perform operation sql.PrepareStmt while a different portal is open
airbyte-server | Hint: You have attempted to use a feature that is not yet implemented.
airbyte-server | See: <https://go.crdb.dev/issue-v/40195/v22.1>
Wilfredo Molina
08/17/2022, 10:32 PM

lin.liu
08/18/2022, 6:35 AM
I run docker-compose up to start an Airbyte server on an AWS instance.
Then I use the Airbyte web UI to create and start a connection, and a Docker container starts to execute some tasks.
Now I want to know: can I configure Airbyte to run its workers (or connection tasks) on a specific AWS server? Some of the APIs we call are only reachable from a specific public IP address.

lin.liu
08/18/2022, 6:37 AM

Johnny Degreef
08/18/2022, 11:39 AM

M
08/18/2022, 1:02 PM

Tomi
08/18/2022, 1:10 PM
Could not connect with provided configuration. Private key provided is invalid or not supported: rsa_key.p8: Cannot invoke "net.snowflake.client.jdbc.internal.org.bouncycastle.util.io.pem.PemObject.getContent()" because the return value of "net.snowflake.client.jdbc.internal.org.bouncycastle.util.io.pem.PemReader.readPemObject()" is null
When connecting to Snowflake using snowsql, key-pair auth works with the same keys. What are we missing?
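The "PemReader.readPemObject() is null" part of that error usually means the driver could not parse the file contents as PEM at all. As a quick local check (a diagnostic sketch only, not part of Airbyte; the file name rsa_key.p8 is taken from the error above), you can load the key with Python's cryptography package and re-serialize it as an unencrypted PKCS#8 PEM, which is the format Snowflake's key-pair auth documentation describes:

# Hypothetical diagnostic sketch: confirm rsa_key.p8 is a readable PEM private key
# and re-serialize it as unencrypted PKCS#8 PEM.
from cryptography.hazmat.primitives import serialization

with open("rsa_key.p8", "rb") as f:
    data = f.read()

# Pass password=b"..." instead of None if the key is protected by a passphrase.
key = serialization.load_pem_private_key(data, password=None)

pem = key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.NoEncryption(),
)
print(pem.decode())

If load_pem_private_key raises, the file is likely not valid PEM (wrong file, stray characters, or a DER-encoded key), which would explain the null PemObject.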
Arun
08/18/2022, 4:42 PM

Jerri Comeau (Airbyte)
08/18/2022, 5:14 PM

Regitze Sdun
08/18/2022, 1:10 PM

Sebastian Brickel
08/19/2022, 8:30 AM
I have been getting Cannot reach the server. The server may still be starting up. for over an hour now.
I looked at the recommendations in https://docs.airbyte.com/troubleshooting/on-deploying/ and I do get error messages in both docker logs airbyte-server and docker logs airbyte-worker. Since I am fairly new to Airbyte, I am not sure what those messages mean. Can anyone help me with this?
Thanks

kylashpriya NA
08/19/2022, 9:43 AM
alpha. From this point I absolutely cannot recommend using an alpha version in production environments. Was it carefully evaluated, tested, and deliberately decided to use it for a production environment? Here you can find what an alpha version is and what the risks are: Software release life cycle
Could someone help us with the above? Is that still in the alpha phase, or shall we try a “stable” release?
We went through the setup documentation page at: https://docs.airbyte.com/quickstart/deploy-airbyte/?_ga=2.89522395.1160840054.1659428690-1169156739.1659428688

Marissa Pagador
08/19/2022, 7:02 PM

Rajesh Koilpillai
08/20/2022, 5:04 AMdef next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
return None
def request_params(
self, stream_state: Mapping[str, Any], stream_slice: Mapping[str, any] = None, next_page_token: Mapping[str, Any] = None
) -> MutableMapping[str, Any]:
return {}
def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]:
yield {}
What type of source connector would be the right choice here? Should we use the Python HTTP connector, the Python Source connector, or the Python Generic Source connector?
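The three overrides above appear to come from the Python HTTP (CDK) connector template, i.e. a subclass of airbyte_cdk.sources.streams.http.HttpStream. A minimal sketch of such a stream is shown below; the class name, URL, and path are hypothetical placeholders, not anything from the original question:

from typing import Any, Iterable, Mapping, MutableMapping, Optional

import requests
from airbyte_cdk.sources.streams.http import HttpStream


class ExampleStream(HttpStream):
    # Hypothetical API; replace with the real endpoint.
    url_base = "https://api.example.com/v1/"
    primary_key = "id"

    def path(self, **kwargs) -> str:
        return "items"

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        # Single-page endpoint: no pagination.
        return None

    def request_params(
        self, stream_state: Mapping[str, Any], stream_slice: Mapping[str, Any] = None, next_page_token: Mapping[str, Any] = None
    ) -> MutableMapping[str, Any]:
        return {}

    def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]:
        # Assumes the endpoint returns a JSON array of records.
        yield from response.json()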
Arsh Anwar
08/20/2022, 6:09 PM

Taha Shalaby
08/21/2022, 5:27 AM