Bram
11/28/2022, 12:25 PM

Ramon de la Cruz Ariza
11/28/2022, 1:27 PM
---
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
helmCharts:
- name: airbyte
  version: 0.42.0
  repo: https://airbytehq.github.io/helm-charts
  namespace: airbyte
  releaseName: airbyte
But we want to add/modify variables in the existing configuration (e.g. enable the local basic HTTP auth), and we have some questions:
• Can I override this file somehow? I saw it's possible if we have the configMapGenerator enabled and also the .env file locally.
• What happens if I have this local file and I want to use an external DB? (Using the Helm chart values it's possible, but will this overwrite another configuration?)
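On both questions, kustomize's helmCharts generator accepts valuesInline (or valuesFile), and Helm merges those values over the chart's defaults, so only the overridden keys need to appear. A hedged sketch — the specific value keys (postgresql.enabled, externalDatabase, and any auth keys) vary by chart version, so verify them against the chart's values.yaml before use:

```yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
helmCharts:
- name: airbyte
  version: 0.42.0
  repo: https://airbytehq.github.io/helm-charts
  namespace: airbyte
  releaseName: airbyte
  # valuesInline is passed to Helm like a values file, so it is merged
  # over the chart defaults; valuesFile: values.yaml works the same way.
  valuesInline:
    # Illustrative keys only -- confirm names for your chart version.
    postgresql:
      enabled: false              # disable the bundled database...
    externalDatabase:             # ...and point at your own instead
      host: my-postgres.example.com
      port: 5432
      database: airbyte
      user: airbyte
```

Because it is a merge, this does not clobber unrelated settings already configured in the chart's defaults.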
Many thanks!!!

Rahul Borse
11/28/2022, 2:08 PM

laila ribke
11/28/2022, 2:08 PM

Juan Chaves
11/28/2022, 5:03 PM

Carolina Buckler
11/28/2022, 5:48 PM
The connection tests failed.
'An error occurred: {"error":"invalid_grant","error_description":"expired access/refresh token"}'
Any guidance on how to reset the refresh token?

Semyon Komissarov
11/28/2022, 6:43 PM

Royzac
11/29/2022, 12:59 AM

Adam
11/29/2022, 1:29 AM

Krishna Elangovan
11/29/2022, 2:10 AM
Could not connect with provided configuration. Error: The variable "log_bin" should be set to "ON", but it is "OFF"
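The connector reads this variable from the server it actually connects to, so it is worth re-running the same check from a client pointed at the exact host/port in the source config (host and credentials below are placeholders). On managed MySQL (RDS/Aurora and similar), log_bin is controlled by the parameter group and backup settings, and a read replica or proxy endpoint can report OFF even when the primary shows ON.

```sql
-- Run against the same host/port the Airbyte source is configured with:
SHOW VARIABLES LIKE 'log_bin';        -- must be ON for CDC
SHOW VARIABLES LIKE 'binlog_format';  -- CDC additionally expects ROW
```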
even though it is set to "ON" on my MySQL host. This happens only when trying to connect with CDC enabled; with Standard it works fine. What could be wrong here?

ANISH R
11/29/2022, 8:43 AM
2022-11-29 08:39:36 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/b88f61d3-0dac-450d-be8e-890756668ee2/0/logs.log
2022-11-29 08:39:36 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.40.15
2022-11-29 08:39:36 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-redshift:0.3.15 exists...
2022-11-29 08:39:36 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-redshift:0.3.15 was found locally.
2022-11-29 08:39:36 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = source-redshift-discover-b88f61d3-0dac-450d-be8e-890756668ee2-0-meimj with resources io.airbyte.config.ResourceRequirements@66d3a24c[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2022-11-29 08:39:36 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/b88f61d3-0dac-450d-be8e-890756668ee2/0 --log-driver none --name source-redshift-discover-b88f61d3-0dac-450d-be8e-890756668ee2-0-meimj --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/source-redshift:0.3.15 -e AIRBYTE_VERSION=0.40.15 -e WORKER_JOB_ID=b88f61d3-0dac-450d-be8e-890756668ee2 airbyte/source-redshift:0.3.15 discover --config source_config.json
2022-11-29 08:39:37 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - starting source: class io.airbyte.integrations.source.redshift.RedshiftSource
2022-11-29 08:39:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - integration args: {discover=null, config=source_config.json}
2022-11-29 08:39:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - Running integration: io.airbyte.integrations.source.redshift.RedshiftSource
2022-11-29 08:39:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - Command: DISCOVER
2022-11-29 08:39:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - Integration config: IntegrationConfig{command=DISCOVER, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-11-29 08:39:38 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):106 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-29 08:39:38 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):106 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-29 08:39:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - HikariPool-1 - Starting...
2022-11-29 08:39:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - HikariPool-1 - Start completed.
2022-11-29 08:39:39 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - HikariPool-1 - Driver does not support get/set network timeout for connections. ([Amazon][JDBC](10220) Driver does not support this optional feature.)
2022-11-29 08:39:39 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - Internal schemas to exclude: [catalog_history, information_schema, pg_catalog, pg_internal]
2022-11-29 08:39:40 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - HikariPool-1 - Shutdown initiated...
2022-11-29 08:39:40 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - HikariPool-1 - Shutdown completed.
2022-11-29 08:39:40 ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):105 - Something went wrong in the connector. See the logs for more details.
Stack Trace: java.lang.NullPointerException: null value in entry: isNullable=null
at com.google.common.collect.CollectPreconditions.checkEntryNotNull(CollectPreconditions.java:33)
at com.google.common.collect.ImmutableMapEntry.<init>(ImmutableMapEntry.java:54)
at com.google.common.collect.ImmutableMap.entryOf(ImmutableMap.java:339)
at com.google.common.collect.ImmutableMap$Builder.put(ImmutableMap.java:449)
at io.airbyte.integrations.source.jdbc.AbstractJdbcSource.getColumnMetadata(AbstractJdbcSource.java:267)
at io.airbyte.db.jdbc.JdbcDatabase$1.tryAdvance(JdbcDatabase.java:81)
at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332)
at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
at io.airbyte.db.jdbc.DefaultJdbcDatabase.bufferedResultSetQuery(DefaultJdbcDatabase.java:56)
at io.airbyte.integrations.source.jdbc.AbstractJdbcSource.discoverInternal(AbstractJdbcSource.java:190)
at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:90)
at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:30)
at io.airbyte.integrations.source.relationaldb.AbstractDbSource.discoverWithoutSystemTables(AbstractDbSource.java:241)
at io.airbyte.integrations.source.relationaldb.AbstractDbSource.getTables(AbstractDbSource.java:498)
at io.airbyte.integrations.source.relationaldb.AbstractDbSource.discover(AbstractDbSource.java:110)
at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:127)
at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:97)
at io.airbyte.integrations.source.redshift.RedshiftSource.main(RedshiftSource.java:136)
2022-11-29 08:39:40 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
404 Roy
11/29/2022, 8:47 AM

Nikolai Nergård
11/29/2022, 8:47 AM

thomas trividic
11/29/2022, 9:02 AM

thomas trividic
11/29/2022, 9:02 AM

thomas trividic
11/29/2022, 9:02 AM
geography: auto
schemaChange: no_change
notifySchemaChanges: true
nonBreakingChangesPreference: ignore
thomas trividic
11/29/2022, 9:02 AM

Benedikt Buchert
11/29/2022, 9:58 AM

Bhavya Verma
11/29/2022, 10:47 AM

Andreas
11/29/2022, 11:20 AM

noobolte bawa
11/29/2022, 11:20 AM

Santosh
11/29/2022, 12:09 PM

데이브
11/29/2022, 3:02 PM

Stuart Horgan
11/29/2022, 3:29 PM
VERSION=dev docker-compose up
I get this error:
Error response from daemon: manifest for airbyte/worker:dev not found: manifest unknown: manifest unknown
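A likely cause, offered as a guess: VERSION=dev makes docker-compose look for locally built image tags (airbyte/worker:dev and friends). If they were never built, Docker falls back to pulling :dev from Docker Hub, where no such tag exists, hence "manifest unknown". A sketch of the two usual ways out (the exact gradle invocation has varied across Airbyte versions, so check the build docs for your checkout):

```shell
# Option 1: build the :dev platform images locally first, e.g. the
# historical form:
#   SUB_BUILD=PLATFORM ./gradlew build
#
# Option 2: run published images instead -- drop the override and let
# docker-compose use the VERSION pinned in .env:
#   docker-compose up
```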
Anyone know what is happening? Didn't think I would hit an error so instantly when trying to start a clean re-install.

Brad Nemetski
11/29/2022, 3:39 PM

Coleman Kelleghan
11/29/2022, 4:38 PM

Rahul Borse
11/29/2022, 5:21 PM

Alexandre Chouraki
11/29/2022, 6:00 PM
def check_connection(self, logger, config) -> Tuple[bool, Any]:
    try:
        args = self.convert_config2stream_args(config)
        stream = ExportPatients(**args)
        print(stream.auth.token)
        records = stream.read_records(sync_mode=SyncMode.full_refresh)
        next(records)
        return True, None
    except Exception as e:
        return False, e
But when testing it, I get these logs:
{"type": "LOG", "log": {"level": "ERROR", "message": "[{\"errorCode\":\"TOKEN_NOT_FOUND\",\"componentType\":\"AUTH\",\"message\":\"JWT Token is not present in request headers\",\"details\":{},\"date\":\"2022-11-29T17:34:52.907Z\"}]"}}
{"type": "LOG", "log": {"level": "ERROR", "message": "Check failed"}}
{"type": "CONNECTION_STATUS", "connectionStatus": {"status": "FAILED", "message": "HTTPError('401 Client Error: for url: https://api.live.welkincloud.io/{redacted}/{redacted}/export/PATIENT')"}}
Even though I do have a token set up in stream.auth, it prints fine, and doing
tok = "the_token_that_was_printed"
h = {
    "Authorization": "Bearer {}".format(tok)
}
r = requests.get(f"https://api.live.welkincloud.io/{redacted}/{redacted}/export/PATIENT", headers=h)
in Python yields a 200 instead of a 401...
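A possible explanation, stated as a guess: in airbyte-cdk, an authenticator stored only under a custom attribute (like stream.auth) is never consulted when read_records builds requests; it has to be passed to the HttpStream constructor (or its headers returned from request_headers) for the Authorization header to be attached. A minimal, self-contained sketch of that failure mode — FakeAuth and FakeStream are stand-ins for illustration, not the real CDK classes:

```python
class FakeAuth:
    """Stand-in for the token authenticator in the snippet above."""
    def __init__(self, token: str):
        self.token = token

    def get_auth_header(self) -> dict:
        return {"Authorization": f"Bearer {self.token}"}


class FakeStream:
    """Mimics how an HttpStream-style class assembles request headers."""
    def __init__(self, authenticator=None):
        # Only an authenticator passed HERE reaches the request machinery.
        self._authenticator = authenticator
        # Storing it under a different name (like `self.auth`) keeps the
        # token reachable for printing, but invisible to requests.
        self.auth = authenticator or FakeAuth("the_token_that_was_printed")

    def request_headers(self) -> dict:
        if self._authenticator is None:
            return {}
        return dict(self._authenticator.get_auth_header())


broken = FakeStream()                                 # token only on .auth
fixed = FakeStream(authenticator=FakeAuth("tok123"))  # wired in properly
print(broken.request_headers())   # {} -> server replies 401 TOKEN_NOT_FOUND
print(fixed.request_headers())    # {'Authorization': 'Bearer tok123'}
```

If the real stream follows the broken pattern, passing the authenticator through super().__init__(authenticator=...) or overriding request_headers to include auth.get_auth_header() should turn the 401 into the 200 the manual requests call gets.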
I can't fathom why that won't work, if the token can be found... Is there a way to check the headers of a request from the stream object? I could definitely use some help!

Grember Yohan
11/29/2022, 6:21 PM

KalaSai
11/29/2022, 6:33 PM