Rami M Theeb
12/06/2022, 9:14 PM
2022-12-06T16:53:37.423923573Z stdout F {"level":"warn","ts":"2022-12-06T16:53:37.423Z","msg":"history size exceeds warn limit.","shard-id":4,"address":"10.1.1.248:7234","component":"history-cache","wf-namespace-id":"769c5c50-2fb9-4b30-a8d3-ea88dd398f4c","wf-id":"connection_manager_84a09e16-67ec-432b-9cc3-1224516d6ad2","wf-run-id":"a3dd8fcd-19b6-4fc9-8faf-b7c3075bf0c0","wf-history-size":32381433,"wf-event-count":4380,"logging-call-at":"context.go:920"}
2022-12-06T16:52:07.390203639Z stdout F java.lang.IllegalStateException: Transitioning Job 1881 from JobStatus SUCCEEDED to INCOMPLETE is not allowed. The only valid statuses that can be transitioned to from SUCCEEDED are []
2022-12-06T16:52:07.390170567Z stdout F 2022-12-06 16:52:07 WARN i.t.i.a.ActivityTaskExecutors$BaseActivityTaskExecutor(execute):114 - Activity failure. ActivityId=d551e5c8-e3ea-336d-82e8-c03469ea67b7, activityType=AttemptFailureWithAttemptNumber, attempt=1
2022-12-06T16:51:49.228566415Z stdout F 2022-12-06 16:51:49 INFO i.a.w.t.s.ConnectionManagerWorkflowImpl(runMandatoryActivityWithOutput):628 - Waiting PT10M before restarting the workflow for connection 1acb8818-fdba-4780-840e-d552bd2d4684, to prevent spamming temporal with restarts.
These errors keep showing up in the logs, and I can't delete the connection for some reason either. Any help?
Haritha Gunawardana
12/06/2022, 10:51 PM
Tmac Han
12/07/2022, 1:22 AM
Praveenraaj K S
12/07/2022, 4:49 AM
Yusuf Fahry
12/07/2022, 5:28 AM
Tmac Han
12/07/2022, 7:23 AM
Nick Scheifler
12/07/2022, 8:03 AM
Sebastian Brickel
12/07/2022, 8:22 AM
Aleksandar
12/07/2022, 9:10 AM
I added a new connector in Settings -> New Connector and I do not see any error.
It does not appear in the sources and I cannot see it anywhere.
I cannot understand if there is some problem with my image or if I am missing some step in how I make my source visible in the UI.
Any help will be greatly appreciated!
NOTE: Locally I had the same problem. I connected to the airbyte database (e.g. airbyte-db) and updated the field public=true in the actor_definition table.
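For anyone who hits the same thing, roughly what I ran (the container name, db user/name and the WHERE clause are from my local docker setup and purely illustrative, adjust to your deployment):
# open psql inside the Airbyte config db container and flip the flag for the custom connector
docker exec -it airbyte-db psql -U docker -d airbyte \
  -c "UPDATE actor_definition SET public = true WHERE name = 'MyCustomSource';"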
After that I am able to see it.
Steven
12/07/2022, 11:18 AM
Failed to finalize copy to temp table due to: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: Failure to initialize configuration: Invalid configuration value detected for fs.azure.account.key
We're using the Azure Blob Storage data source instead of the default S3.
We can see that Airbyte is writing files into the Azure Blob Storage account correctly, but we think Databricks might have trouble accessing that same location. We have tried multiple things:
• We added the Azure Blob Storage account as an "external location" in the Databricks Unity Catalog with the associated storage credential.
• We tried writing directly to the managed Azure Blob Storage account.
• The Databricks Access Connector has been added as a storage contributor on the storage account.
• All network rules are open.
• We've tried both regular Azure Blob Storage and ADLS Gen2.
• We've correctly set up the connection from Airbyte to the Azure Blob Storage account using SAS token auth.
• The "test connection" button in Airbyte works correctly.
Any pointers?
---
Actually solved this just now. Apparently you should direct Airbyte towards an actual Databricks compute cluster. Directing it towards a SQL Warehouse does NOT work.
Now getting another error, which does seem to be a problem in the Databricks Lakehouse integration itself:
CREATE TABLE public._airbyte_tmp_tvd_pokemon (_airbyte_ab_id string, _airbyte_emitted_at string, `abilities` , `base_experience` , `forms` , `game_indices` , `height` , `held_items` , `id` , `is_default ` , `location_area_encounters` , `moves` , `name` , `order` , `species` , `sprites` , `stats` , `types` , `weight` ) USING csv LOCATION 'abfss:REDACTED_LOCAL_PART@REDACTED.dfs.core.windows.net/d2b9f209-e36e-451c-9a66-6d29a712c699/public/_airbyte_tmp_tvd_pokemon/' options ("header" = "true", "multiLine" = "true")
Looks like the types of the columns are missing. I guess something's going wrong here? https://github1s.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/[…]ination/databricks/DatabricksAzureBlobStorageStreamCopier.java
Steven
12/07/2022, 11:26 AM
KalaSai
12/07/2022, 11:58 AM
ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: Failed to inject value for parameter [secretPersistence] of method [secretsHydrator] of class: io.airbyte.config.persistence.split_secrets.SecretsHydrator
Message: No bean of type [io.airbyte.config.persistence.split_secrets.SecretPersistence] exists for the given qualifier: @Named('secretPersistence').
Aleksei Abisev
12/07/2022, 12:10 PM
Paypal transactions source: everything looks fine, but it always returns 0 transactions. No errors, nothing at all. The Balances stream is returning records.
2022-12-07 11:51:50 source > Starting syncing SourcePaypalTransaction
2022-12-07 11:51:50 source > Syncing stream: transactions
2022-12-07 11:51:52 source > Maximum allowed start_date is 2022-12-07 09:29:59+00:00 based on info from API response
2022-12-07 11:53:13 source > Read 0 records from transactions stream
2022-12-07 11:53:13 source > Finished syncing transactions
2022-12-07 11:53:13 source > SourcePaypalTransaction runtimes:
Syncing stream transactions 0:01:23.353576
When sending direct requests to the API endpoint https://api-m.paypal.com/v1/reporting/transactions, transactions are returned properly. Any suggestions on what I should try?
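For reference, the direct request that returns transactions for me looks roughly like this (access token redacted; the date window is just an example):
# PayPal Transaction Search endpoint mentioned above
curl -s "https://api-m.paypal.com/v1/reporting/transactions?start_date=2022-12-01T00:00:00-0000&end_date=2022-12-07T00:00:00-0000" \
  -H "Authorization: Bearer <ACCESS_TOKEN>"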
Jonny Wray
12/07/2022, 12:54 PM
Daryl Thomas
12/07/2022, 2:17 PM
Brian Olsen
12/07/2022, 2:35 PM
{"dtype" : "object"}
to see if I was formatting something wrong. It still shows all strings in the connection though.
KalaSai
12/07/2022, 2:58 PM
José Ferraz Neto
12/07/2022, 3:41 PM
Jonny Wray
12/07/2022, 5:48 PM
Emma Forman Ling
12/07/2022, 6:12 PM
supportsDBT: false and it supports the append sync mode, so I think that blog post is just outdated.
Marcel Coetzee
12/07/2022, 6:20 PM
.parquet postfix, instead of something more descriptive like .snappy.parquet
KalaSai
12/07/2022, 8:07 PM
Sam Stoelinga
12/07/2022, 8:14 PM
Jaye Howell
12/07/2022, 8:21 PM
Failure Origin: airbyte_platform, Message: Something went wrong within the airbyte platform
1:46PM 12/07
2022-12-07 20:15:36 - Additional Failure Information: scheduledEventId=72, startedEventId=73, activityType='RunWithJobOutput', activityId='d9c45409-2372-30d6-ae8f-29bdb4949aa2', identity='', retryState=RETRY_STATE_MAXIMUM_ATTEMPTS_REACHED
I have downloaded the full logs.
Here is the last part of the log. Note: this was trying to write to S3 instead of Snowflake as a test.
2022-12-07 20:05:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - integration args: {check=null, config=source_config.json}
2022-12-07 20:05:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Running integration: io.airbyte.integrations.destination.s3.S3Destination
2022-12-07 20:05:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Command: CHECK
2022-12-07 20:05:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-12-07 20:05:33 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-12-07 20:05:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - S3 format config: {"format_type":"Avro","compression_codec":{"codec":"no compression"}}
2022-12-07 20:05:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Creating S3 client...
2022-12-07 20:05:37 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Started testing if IAM user can call listObjects on the destination bucket
2022-12-07 20:05:39 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Finished checking for listObjects permission
2022-12-07 20:05:39 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Started testing if all required credentials assigned to user for single file uploading
2022-12-07 20:05:40 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Finished checking for normal upload mode
2022-12-07 20:05:40 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Started testing if all required credentials assigned to user for multipart upload
2022-12-07 20:05:40 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Initiated multipart upload to mudflap.data.snowflake.loadstage//airbyte/test_1670443540085 with full ID MMGkcQrUGmDTk7znWge4J9yrJ9nXqRkFZu8QZ4G.ZNB3LipHsG64nk.nms6wXPMaFjf5RUQon4Wg2E5PhRmQtqphGE2j3Z62l4zegSJj0Xazg1jzokYue88h1NrWeHubM.vEC8TW8veSIip6r72r_3QX2oz7FKRostkQ1TCEOns-
2022-12-07 20:05:40 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-12-07 20:05:40 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-12-07 20:05:40 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - [MultipartOutputStream for parts 1 - 10000] is already closed
2022-12-07 20:05:40 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - [Manager uploading to mudflap.data.snowflake.loadstage//airbyte/test_1670443540085 with id MMGkcQrUG...Q1TCEOns-]: Uploading leftover stream [Part number 1 containing 3.34 MB]
2022-12-07 20:05:41 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - [Manager uploading to mudflap.data.snowflake.loadstage//airbyte/test_1670443540085 with id MMGkcQrUG...Q1TCEOns-]: Finished uploading [Part number 1 containing 3.34 MB]
2022-12-07 20:05:41 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - [Manager uploading to mudflap.data.snowflake.loadstage//airbyte/test_1670443540085 with id MMGkcQrUG...Q1TCEOns-]: Completed
2022-12-07 20:05:41 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Finished verification for multipart upload mode
2022-12-07 20:05:41 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Completed integration: io.airbyte.integrations.destination.s3.S3Destination
2022-12-07 20:05:41 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Completed destination: io.airbyte.integrations.destination.s3.S3Destination
2022-12-07 20:05:44 INFO i.a.w.p.KubePodProcess(close):737 - (pod: data-eng / destination-s3-check-11-0-nixlg) - Closed all resources for pod
2022-12-07 20:05:44 INFO i.a.w.t.TemporalAttemptExecution(get):162 - Stopping cancellation check scheduling...
2022-12-07 20:05:44 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-12-07 20:05:44 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
2022-12-07 20:05:44 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-12-07 20:05:44 WARN i.t.i.w.ActivityWorker$TaskHandlerImpl(logExceptionDuringResultReporting):365 - Failure during reporting of activity result to the server. ActivityId = d9c45409-2372-30d6-ae8f-29bdb4949aa2, ActivityType = RunWithJobOutput, WorkflowId=connection_manager_145da5f5-a373-4b9d-8957-0383c8946517, WorkflowType=ConnectionManagerWorkflow, RunId=68b13988-1ec6-48cf-8475-3aa10c94288e
io.grpc.StatusRuntimeException: NOT_FOUND: invalid activityID or activity already timed out or invoking workflow is completed
at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:271) ~[grpc-stub-1.50.2.jar:1.50.2]
at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:252) ~[grpc-stub-1.50.2.jar:1.50.2]
at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:165) ~[grpc-stub-1.50.2.jar:1.50.2]
at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.respondActivityTaskCompleted(WorkflowServiceGrpc.java:3840) ~[temporal-serviceclient-1.17.0.jar:?]
Tomer Mesika
12/07/2022, 9:08 PM
Walker Philips
12/07/2022, 9:44 PM
Sam Stoelinga
12/07/2022, 9:52 PM
manifest.json:1 GET http://localhost:8000/manifest.json 401 (Unauthorized)
manifest.json:1 Manifest: Line: 1, column: 1, Syntax error.
apiOverride.ts:57 POST http://localhost:8000/api/v1/web_backend/connections/list 500 (Internal Server Error)
(anonymous) @ apiOverride.ts:57
d @ regeneratorRuntime.js:86
(anonymous) @ regeneratorRuntime.js:66
(anonymous) @ regeneratorRuntime.js:117
r @ asyncToGenerator.js:3
s @ asyncToGenerator.js:25
(anonymous) @ asyncToGenerator.js:32
(anonymous) @ asyncToGenerator.js:21
(anonymous) @ apiOverride.ts:24
ae @ AirbyteClient.ts:3276
value @ WebBackendConnectionService.ts:17
(anonymous) @ useConnectionHook.tsx:243
fetchFn @ query.js:298
l @ retryer.js:95
c @ retryer.js:156
t.fetch @ query.js:330
n.fetchOptimistic @ queryObserver.js:180
(anonymous) @ useBaseQuery.js:84
A @ useQuery.js:7
o @ useSuspenseQuery.ts:19
O @ useConnectionHook.tsx:243
le @ AllConnectionsPage.tsx:23
sa @ react-dom.production.min.js:157
Gs @ react-dom.production.min.js:267
Au @ react-dom.production.min.js:250
Ou @ react-dom.production.min.js:250
Cu @ react-dom.production.min.js:250
_u @ react-dom.production.min.js:243
(anonymous) @ react-dom.production.min.js:123
t.unstable_runWithPriority @ scheduler.production.min.js:18
Vi @ react-dom.production.min.js:122
Ki @ react-dom.production.min.js:123
M @ scheduler.production.min.js:16
b.port1.onmessage @ scheduler.production.min.js:12
react_devtools_backend.js:4012 Error: Internal Server Error: Duplicate key 99a5b8cc-01ce-4a02-946c-a33d91b3d1b2 (attempted merging values io.airbyte.config.ActorCatalogFetchEvent@3825d1c9[id=<null>,actorId=99a5b8cc-01ce-4a02-946c-a33d91b3d1b2,actorCatalogId=5be6c2c9-692c-467b-a052-56ccc9915c7c,configHash=<null>,connectorVersion=<null>,createdAt=1670444899] and io.airbyte.config.ActorCatalogFetchEvent@4e9e2fd1[id=<null>,actorId=99a5b8cc-01ce-4a02-946c-a33d91b3d1b2,actorCatalogId=e1f51180-5fb3-4f7d-b509-17b6dbcf0675,configHash=<null>,connectorVersion=<null>,createdAt=1670444899])
at apiOverride.ts:107:9
at d (regeneratorRuntime.js:86:17)
at Generator._invoke (regeneratorRuntime.js:66:24)
at Generator.next (regeneratorRuntime.js:117:21)
at r (asyncToGenerator.js:3:20)
at s (asyncToGenerator.js:25:9)
overrideMethod @ react_devtools_backend.js:4012
onError @ query.js:356
h @ retryer.js:67
(anonymous) @ retryer.js:132
Promise.catch (async)
l @ retryer.js:116
c @ retryer.js:156
t.fetch @ query.js:330
n.fetchOptimistic @ queryObserver.js:180
(anonymous) @ useBaseQuery.js:84
A @ useQuery.js:7
o @ useSuspenseQuery.ts:19
O @ useConnectionHook.tsx:243
le @ AllConnectionsPage.tsx:23
sa @ react-dom.production.min.js:157
Gs @ react-dom.production.min.js:267
Au @ react-dom.production.min.js:250
Ou @ react-dom.production.min.js:250
Cu @ react-dom.production.min.js:250
_u @ react-dom.production.min.js:243
(anonymous) @ react-dom.production.min.js:123
t.unstable_runWithPriority @ scheduler.production.min.js:18
Vi @ react-dom.production.min.js:122
Ki @ react-dom.production.min.js:123
M @ scheduler.production.min.js:16
b.port1.onmessage @ scheduler.production.min.js:12
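If it helps with triage: I assume the two conflicting rows are visible in the config database. The table and column names below are inferred from the ActorCatalogFetchEvent payload in the error and not verified against the schema; container and db names are the docker-compose defaults:
# look for two fetch events with the same actor_id created in the same second
docker exec -it airbyte-db psql -U docker -d airbyte \
  -c "SELECT id, actor_id, actor_catalog_id, created_at FROM actor_catalog_fetch_event WHERE actor_id = '99a5b8cc-01ce-4a02-946c-a33d91b3d1b2';"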
Emma Forman Ling
12/07/2022, 10:35 PM
Seowan Lee
12/08/2022, 3:21 AM
helm install --values /Users/bagelcode/Workspace/deploy-core-eks/values/airbyte.dev.yaml airbyte-dev-adhoc airbyte/airbyte --debug
install.go:173: [debug] Original chart version: ""
install.go:190: [debug] CHART PATH: /Users/bagelcode/Library/Caches/helm/repository/airbyte-0.40.40.tgz
client.go:282: [debug] Starting delete for "airbyte-dev-admin" ServiceAccount
client.go:122: [debug] creating 1 resource(s)
client.go:282: [debug] Starting delete for "airbyte-dev-adhoc-airbyte-env" ConfigMap
client.go:311: [debug] configmaps "airbyte-dev-adhoc-airbyte-env" not found
client.go:122: [debug] creating 1 resource(s)
client.go:282: [debug] Starting delete for "airbyte-dev-adhoc-airbyte-secrets" Secret
client.go:311: [debug] secrets "airbyte-dev-adhoc-airbyte-secrets" not found
client.go:122: [debug] creating 1 resource(s)
client.go:282: [debug] Starting delete for "airbyte-dev-adhoc-airbyte-bootloader" Pod
client.go:311: [debug] pods "airbyte-dev-adhoc-airbyte-bootloader" not found
client.go:122: [debug] creating 1 resource(s)
client.go:491: [debug] Watching for changes to Pod airbyte-dev-adhoc-airbyte-bootloader with timeout of 5m0s
client.go:519: [debug] Add/Modify event for airbyte-dev-adhoc-airbyte-bootloader: ADDED
client.go:578: [debug] Pod airbyte-dev-adhoc-airbyte-bootloader pending
client.go:519: [debug] Add/Modify event for airbyte-dev-adhoc-airbyte-bootloader: MODIFIED
client.go:580: [debug] Pod airbyte-dev-adhoc-airbyte-bootloader running
Error: failed pre-install: timed out waiting for the condition
helm.go:81: [debug] failed pre-install: timed out waiting for the condition
I cannot figure out what the problem is. I found a Slack archive thread with the same problem as mine, but it has no solution. Could you help me troubleshoot this? https://airbytehq.slack.com/archives/C021JANJ6TY/p1665660532221119?thread_ts=1665656538.448909&cid=C021JANJ6TY
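Since the pod reaches Running and then the pre-install hook just times out, I assume the next step is to look at the bootloader pod itself, something like this (pod name taken from the helm output above; add -n <namespace> if it isn't in the default one):
kubectl describe pod airbyte-dev-adhoc-airbyte-bootloader
kubectl logs airbyte-dev-adhoc-airbyte-bootloader --follow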
Piyush
12/08/2022, 4:02 AM