laila ribke
11/08/2022, 3:32 PM
laila ribke
11/08/2022, 3:37 PM
Austin Poulton
11/08/2022, 4:08 PM
Daniel Vengoechea
11/08/2022, 4:28 PM
Daniel Vengoechea
11/08/2022, 4:29 PM
Pavan
11/08/2022, 5:33 PM
Alexandre Voyer
11/08/2022, 7:06 PM
Coleman Kelleghan
11/08/2022, 8:24 PM
2022-11-08 19:42:36 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):381 - Total records read: 16928693 (31 GB)
2022-11-08 19:42:36 INFO i.a.w.g.DefaultReplicationWorker(cancel):501 - Cancelling destination...
2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteDestination(cancel):121 - Attempting to cancel destination process...
2022-11-08 19:42:36 INFO i.a.w.g.DefaultReplicationWorker(run):190 - One of source or destination thread complete. Waiting on the other.
2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteDestination(cancel):126 - Destination process exists, cancelling...
2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(destroy):664 - (pod: homelander-airbyte-external / destination-postgres-write-30-0-cekpy) - Destroying Kube process.
2022-11-08 19:42:36 INFO i.a.w.g.DefaultReplicationWorker(run):192 - Source and destination threads complete.
2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(close):737 - (pod: homelander-airbyte-external / destination-postgres-write-30-0-cekpy) - Closed all resources for pod
2022-11-08 19:42:36 WARN i.a.c.i.LineGobbler(voidCall):119 - airbyte-destination gobbler IOException: Socket closed. Typically happens when cancelling a job.
2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(destroy):670 - (pod: homelander-airbyte-external / destination-postgres-write-30-0-cekpy) - Destroyed Kube process.
2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteDestination(cancel):128 - Cancelled destination process!
2022-11-08 19:42:36 INFO i.a.w.g.DefaultReplicationWorker(cancel):508 - Cancelling source...
2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteSource(cancel):141 - Attempting to cancel source process...
2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteSource(cancel):146 - Source process exists, cancelling...
2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(destroy):664 - (pod: homelander-airbyte-external / source-snowflake-read-30-0-dbrwk) - Destroying Kube process.
2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(close):737 - (pod: homelander-airbyte-external / source-snowflake-read-30-0-dbrwk) - Closed all resources for pod
2022-11-08 19:42:36 WARN i.a.c.i.LineGobbler(voidCall):119 - airbyte-source gobbler IOException: Socket closed. Typically happens when cancelling a job.
2022-11-08 19:42:36 INFO i.a.w.p.KubePodProcess(destroy):670 - (pod: homelander-airbyte-external / source-snowflake-read-30-0-dbrwk) - Destroyed Kube process.
2022-11-08 19:42:36 INFO i.a.w.i.DefaultAirbyteSource(cancel):148 - Cancelled source process!
2022-11-08 19:42:36 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$5):230 - Interrupting worker thread...
2022-11-08 19:42:36 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$5):233 - Cancelling completable future...
2022-11-08 19:42:36 WARN i.a.c.t.CancellationHandler$TemporalCancellationHandler(checkAndHandleCancellation):53 - Job either timed out or was cancelled.
2022-11-08 19:42:36 WARN i.a.c.t.CancellationHandler$TemporalCancellationHandler(checkAndHandleCancellation):53 - Job either timed out or was cancelled.
2022-11-08 19:42:36 INFO i.a.w.t.TemporalAttemptExecution(get):162 - Stopping cancellation check scheduling...
2022-11-08 19:42:36 ERROR i.a.w.WorkerUtils(gentleClose):53 - Exception while while waiting for process to finish
2022-11-08 19:42:36 WARN i.a.c.t.CancellationHandler$TemporalCancellationHandler(checkAndHandleCancellation):53 - Job either timed out or was cancelled.
2022-11-08 19:42:36 INFO i.a.w.t.TemporalAttemptExecution(get):162 - Stopping cancellation check scheduling...
2022-11-08 19:42:36 ERROR i.a.w.WorkerUtils(gentleClose):53 - Exception while while waiting for process to finish
java.lang.InterruptedException: sleep interrupted
at java.lang.Thread.sleep0(Native Method) ~[?:?]
at java.lang.Thread.sleep(Thread.java:465) ~[?:?]
at java.lang.Process.waitFor(Process.java:468) ~[?:?]
at io.airbyte.workers.process.KubePodProcess.waitFor(KubePodProcess.java:653) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.WorkerUtils.gentleClose(WorkerUtils.java:50) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:128) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:194) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:71) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$4(TemporalAttemptExecution.java:190) ~[io.airbyte-airbyte-workers-0.40.17.jar:?]
at java.lang.Thread.run(Thread.java:1589) ~[?:?]
2022-11-08 19:42:36 INFO i.a.c.t.TemporalUtils(withBackgroundHeartbeat):283 - Stopping temporal heartbeating...
2022-11-08 19:42:36 ERROR i.a.w.g.DefaultReplicationWorker(run):196 - Sync worker failed.
io.airbyte.workers.exception.WorkerException: Source process exit with code 143. This warning is normal if the job was cancelled.
at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:135) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:194) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:71) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$4(TemporalAttemptExecution.java:190) ~[io.airbyte-airbyte-workers-0.40.17.jar:?]
at java.lang.Thread.run(Thread.java:1589) ~[?:?]
Suppressed: io.airbyte.workers.exception.WorkerException: Destination process exit with code 143. This warning is normal if the job was cancelled.
at io.airbyte.workers.internal.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:115) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:151) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:71) ~[io.airbyte-airbyte-commons-worker-0.40.17.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$4(TemporalAttemptExecution.java:190) ~[io.airbyte-airbyte-workers-0.40.17.jar:?]
at java.lang.Thread.run(Thread.java:1589) ~[?:?]
Leo G
11/08/2022, 10:00 PM
Duncan Reyneke
11/08/2022, 10:25 PM
Brad Nemetski
11/09/2022, 12:25 AM
Zack Peacock
11/09/2022, 2:13 AM
gcloud --project=$PROJECT_ID beta compute SSH $INSTANCE_NAME -- -L 8000:localhost:8000 -N -f
I’m getting the following error in terminal as of this week:
ERROR: (gcloud.beta.compute) Invalid choice: 'SSH'.
Maybe you meant:
gcloud compute ssh
gcloud compute config-ssh
gcloud beta compute addresses update
gcloud beta compute backend-services add-iam-policy-binding
gcloud beta compute backend-services get-iam-policy
gcloud beta compute backend-services remove-iam-policy-binding
gcloud beta compute backend-services set-iam-policy
gcloud beta compute commitments update-reservations
gcloud beta compute instance-groups managed all-instances-config delete
gcloud beta compute instance-groups managed all-instances-config update
Haven't been able to figure out what I need to do, but I feel like it's got to be simple. Any help would be greatly appreciated.
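Going only by the error text, gcloud command names are case-sensitive, so the uppercase SSH is what the CLI rejects rather than the tunnel flags. Assuming the same $PROJECT_ID and $INSTANCE_NAME, the lowercase command should open the same tunnel:
gcloud --project=$PROJECT_ID compute ssh $INSTANCE_NAME -- -L 8000:localhost:8000 -N -f
(gcloud beta compute ssh takes the same arguments if the beta track is available in the installed SDK.)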
Alex Sher
11/09/2022, 5:00 AM
Andreas Nigg
11/09/2022, 7:20 AM
Pankaj Gupta
11/09/2022, 7:58 AM
Dheeraj Pranav
11/09/2022, 8:05 AM
Chetan Dalal
11/09/2022, 9:19 AM
Ashish Narang
11/09/2022, 11:30 AM
state.json is not getting updated. From the checkpoint messages in the logs I can see that new data is fetched, but state.json still points to the first date I provided in the config.json file.
Sharing the code for the Stream class below:
class ExchangeRates(HttpStream, IncrementalMixin):
    # url_base = "https://api.apilayer.com/exchangerates_data/"
    url_base = "https://api.exchangerate.host/"
    cursor_field = "date"
    primary_key = "date"

    def __init__(self, config: Mapping[str, Any], start_date: datetime, **kwargs):
        super().__init__()
        self.base = config['base']
        self.access_key = config['access_key']
        self.start_date = start_date
        self._cursor_value = None

    @property
    def state(self) -> Mapping[str, Any]:
        if self._cursor_value:
            return {self.cursor_field: self._cursor_value.strftime('%Y-%m-%d')}
        else:
            return {self.cursor_field: self.start_date.strftime('%Y-%m-%d')}

    @state.setter
    def state(self, value: Mapping[str, Any]):
        self._cursor_value = datetime.strptime(value[self.cursor_field], '%Y-%m-%d')

    def read_records(self, *args, **kwargs) -> Iterable[Mapping[str, Any]]:
        for record in super().read_records(*args, **kwargs):
            if self._cursor_value:
                latest_record_date = datetime.strptime(record[self.cursor_field], '%Y-%m-%d')
                self._cursor_value = max(self._cursor_value, latest_record_date)
            yield record

    def _chunk_date_range(self, start_date: datetime) -> List[Mapping[str, Any]]:
        """
        Returns a list of each day between the start date and now.
        The return value is a list of dicts {'date': date_string}.
        """
        dates = []
        while start_date < datetime.now():
            dates.append({self.cursor_field: start_date.strftime('%Y-%m-%d')})
            start_date += timedelta(days=1)
        return dates

    def stream_slices(self, sync_mode, cursor_field: List[str] = None, stream_state: Mapping[str, Any] = None) -> Iterable[Optional[Mapping[str, Any]]]:
        start_date = datetime.strptime(stream_state[self.cursor_field], '%Y-%m-%d') if stream_state and self.cursor_field in stream_state else self.start_date
        return self._chunk_date_range(start_date)

    def path(self, stream_state: Mapping[str, Any] = None, stream_slice: Mapping[str, Any] = None, next_page_token: Mapping[str, Any] = None) -> str:
        return stream_slice['date']

    def request_params(
        self,
        stream_state: Mapping[str, Any],
        stream_slice: Mapping[str, Any] = None,
        next_page_token: Mapping[str, Any] = None,
    ) -> MutableMapping[str, Any]:
        # The api requires that we include access_key as a query param so we do that in this method
        return {'apikey': self.access_key}

    def parse_response(
        self,
        response: requests.Response,
        stream_state: Mapping[str, Any],
        stream_slice: Mapping[str, Any] = None,
        next_page_token: Mapping[str, Any] = None,
    ) -> Iterable[Mapping]:
        # The response is a simple JSON whose schema matches our stream's schema exactly,
        # so we just return a list containing the response
        return [response.json()]

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        # The API does not offer pagination,
        # so we return None to indicate there are no more pages in the response
        return None
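Two hedged observations on the snippet above, assuming it follows the stock Python CDK incremental tutorial: running python main.py read --state ... only emits STATE messages to stdout and does not rewrite the local state.json (that file is read as input; persisting the emitted state is the platform's job, or a manual step when testing locally), and the if self._cursor_value: guard means the cursor only advances once some state already exists, so a first run without a state file stays at start_date. A minimal sketch of a cursor update that also covers the first record:

def read_records(self, *args, **kwargs) -> Iterable[Mapping[str, Any]]:
    for record in super().read_records(*args, **kwargs):
        # Assumption: advance the cursor even when no prior state was set.
        record_date = datetime.strptime(record[self.cursor_field], '%Y-%m-%d')
        self._cursor_value = max(self._cursor_value, record_date) if self._cursor_value else record_date
        yield record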
Below is the command and the messages I'm getting in the terminal:
(airbyte) ashish@ashish-ThinkPad-E14-Gen-2:~/repos/airbyte/airbyte-integrations/connectors/source-python-http-rates-api$ python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json --state sample_files/state.json
{"type": "LOG", "log": {"level": "INFO", "message": "Starting syncing SourcePythonHttpRatesApi"}}
{"type": "LOG", "log": {"level": "INFO", "message": "Syncing stream: exchange_rates "}}
Inside state setter
-- 2022-11-07 00:00:00
{"type": "LOG", "log": {"level": "INFO", "message": "Setting state of exchange_rates stream to {'date': '2022-11-07'}"}}
=================================================================================
{"type": "STATE", "state": {"type": "STREAM", "stream": {"stream_descriptor": {"name": "exchange_rates"}, "stream_state": {"date": "2022-11-08"}}, "data": {"exchange_rates": {"date": "2022-11-08"}}}}
===================================================
{"type": "STATE", "state": {"type": "STREAM", "stream": {"stream_descriptor": {"name": "exchange_rates"}, "stream_state": {"date": "2022-11-09"}}, "data": {"exchange_rates": {"date": "2022-11-09"}}}}
{"type": "LOG", "log": {"level": "INFO", "message": "Read 3 records from exchange_rates stream"}}
{"type": "LOG", "log": {"level": "INFO", "message": "Finished syncing exchange_rates"}}
{"type": "LOG", "log": {"level": "INFO", "message": "SourcePythonHttpRatesApi runtimes:\nSyncing stream exchange_rates 0:00:00.384562"}}
{"type": "LOG", "log": {"level": "INFO", "message": "Finished syncing SourcePythonHttpRatesApi"}}
In the above logs it's clear that we are fetching data for the dates 2022-11-07, 2022-11-08, and 2022-11-09, starting from 2022-11-07. But my state.json always keeps this fixed value:
{
  "exchange_rates": {
    "date": "2022-11-07"
  }
}
Can someone help me figure out what I'm doing wrong?
laila ribke
11/09/2022, 11:31 AM
Satish Chinthanippu
11/09/2022, 12:42 PM
> Task :airbyte-api:compileJava
/airbyte/airbyte-api/build/generated/api/client/src/main/java/io/airbyte/api/client/generated/InternalApi.java:66: error: cannot find symbol
memberVarAsyncResponseInterceptor = apiClient.getAsyncResponseInterceptor();
^
symbol: method getAsyncResponseInterceptor()
location: variable apiClient of type ApiClient
/airbyte/airbyte-api/build/generated/api/client/src/main/java/io/airbyte/api/client/generated/AttemptApi.java:62: error: cannot find symbol
memberVarAsyncResponseInterceptor = apiClient.getAsyncResponseInterceptor();
^
symbol: method getAsyncResponseInterceptor()
location: variable apiClient of type ApiClient
Note: /airbyte/airbyte-api/build/generated/api/client/src/main/java/io/airbyte/api/client/invoker/generated/JSON.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
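Offered only as a guess: the generated client sources expect a getAsyncResponseInterceptor() method that the ApiClient on the classpath doesn't have, which often points at stale or mismatched generated code, so regenerating from a clean state may be worth trying first, e.g.:
./gradlew clean :airbyte-api:build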
M. Zein Ihza Fahrozi
11/09/2022, 12:52 PM
kube/resources/cron.yaml has a volume defined, but the helm charts don't:
volumes:
  - name: airbyte-volume-configs
    persistentVolumeClaim:
      claimName: airbyte-volume-configs
related issue #16698
Dheeraj Pranav
11/09/2022, 1:30 PM
Dusty Shapiro
11/09/2022, 2:09 PM
flow
11/09/2022, 2:34 PM
jonty
11/09/2022, 4:26 PM
Adham Suliman
11/09/2022, 5:26 PM
backwards compatibility test. I've been told that the issue is due to the test not finding one of the schemas (conversations.json), but it's clearly there. Any advice would be highly appreciated.
Slack Conversation
Guy Feldman
11/09/2022, 5:47 PM
Domenic
11/09/2022, 6:31 PM
docker-compose up command. It's been several hours and it is still not complete. I can see in PowerShell that it is executing (it isn't stalled), but I'm not sure if this is normal. Thoughts?
Zack Peacock
11/09/2022, 6:34 PM
Venkat Dasari
11/09/2022, 6:47 PM