Tiger Sun
02/24/2023, 12:49 AM
{
"level": "fatal",
"ts": "2023-02-24T00:23:43.647Z",
"msg": "error starting scanner",
"service": "worker",
"error": "context deadline exceeded",
"logging-call-at": "service.go:233",
"stacktrace": "<http://go.temporal.io/server/common/log.(*zapLogger).Fatal\n\t/temporal/common/log/zap_logger.go:150\ngo.temporal.io/server/service/worker.(*Service).startScanner\n\t/temporal/service/worker/service.go:233\ngo.temporal.io/server/service/worker.(*Service).Start\n\t/temporal/service/worker/service.go:153\ngo.temporal.io/server/service/worker.ServiceLifetimeHooks.func1.1\n\t/temporal/service/worker/fx.go:80|go.temporal.io/server/common/log.(*zapLogger).Fatal\n\t/temporal/common/log/zap_logger.go:150\ngo.temporal.io/server/service/worker.(*Service).startScanner\n\t/temporal/service/worker/service.go:233\ngo.temporal.io/server/service/worker.(*Service).Start\n\t/temporal/service/worker/service.go:153\ngo.temporal.io/server/service/worker.ServiceLifetimeHooks.func1.1\n\t/temporal/service/worker/fx.go:80>"
}
xi-chen.qi
02/24/2023, 1:33 AMChen Lin
02/24/2023, 2:22 AMHai Huynh
02/24/2023, 4:02 AMHamid Shariati
02/24/2023, 9:19 AMErik Alfthan
02/24/2023, 9:54 AM2023-02-24 09:32:49 INFO i.a.w.t.TemporalAttemptExecution(get):138 - Cloud storage job log path: /workspace/41a0d0ed-961c-42d7-9130-6b533d235309/0/logs.log
2023-02-24 09:32:49 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.32
2023-02-24 09:32:49 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to save workflow id for cancellation
2023-02-24 09:32:49 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-02-24 09:32:49 INFO i.a.w.p.KubeProcessFactory(create):98 - Attempting to start pod = source-mssql-check-41a0d0ed-961c-42d7-9130-6b533d235309-0-aaori for airbyte/source-mssql:0.4.28 with resources io.airbyte.config.ResourceRequirements@5ca586f[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=] and allowedHosts null
2023-02-24 09:32:49 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
2023-02-24 09:32:49 INFO i.a.c.i.LineGobbler(voidCall):114 -
EDIT: Adding some details on my setup:
I'm using the Helm chart, version 0.43.29, with an external Azure flexible Postgres, and enabled TLS on the Temporal server (setting the extraEnvs SQL_TLS and SQL_TLS_ENABLED).
Shraddha Borkar
02/24/2023, 10:55 AMJulien F
02/24/2023, 1:31 PM2023-02-24 09:50:15 INFO i.a.c.t.StreamResetRecordsHelper(deleteStreamResetRecordsForJob):50 - deleteStreamResetRecordsForJob was called for job 2706 with config type sync. Returning, as config type is not resetConnection.
I looked at docker logs but couldn’t find anything. Does someone have an idea of what is happening? Everything works well the rest of the time.
An R
02/24/2023, 4:35 PMERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: null
Has anybody experienced this?
Robert Put
02/24/2023, 5:41 PMKrzysztof
02/24/2023, 6:08 PMIgor Safonov
02/24/2023, 6:34 PMstream_slices method.
Is there a way to compute the next slice depending on the state after the previous slice?
My problem: the source HTTP GET API supports the parameters start_datetime, end_datetime and max_rows (the last one has a maximum value of 1_000_000).
I need to download 2023-02-20...2023-02-22.
The slices are [2023-02-20, 2023-02-21, 2023-02-22].
But if I have more than max_rows entries to fetch for one date (e.g. 1_200_000), then 200_000 entries get lost, because the CDK moves on to the next slice that was computed beforehand, and there is no way to know the number of rows in advance.
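One possible way around the cap, assuming the API also accepts an offset-style parameter (an assumption; it is not mentioned above): keep one slice per day, but page within each slice via next_page_token, so a day with more than max_rows entries is fetched in several requests. A rough sketch against the Python CDK's HttpStream; the base URL, endpoint path, record format and the offset parameter are all hypothetical:

from datetime import date, timedelta
from typing import Any, Iterable, Mapping, MutableMapping, Optional

import requests
from airbyte_cdk.sources.streams.http import HttpStream

MAX_ROWS = 1_000_000


class RowsStream(HttpStream):
    url_base = "https://api.example.com/"  # hypothetical
    primary_key = "id"                     # hypothetical

    def path(self, **kwargs) -> str:
        return "rows"                      # hypothetical endpoint

    def stream_slices(self, *, sync_mode, cursor_field=None, stream_state=None):
        # One slice per day, exactly as before (dates hard-coded to keep the sketch short).
        day, last = date(2023, 2, 20), date(2023, 2, 22)
        while day <= last:
            yield {"start_datetime": day.isoformat(), "end_datetime": day.isoformat()}
            day += timedelta(days=1)

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        # If a response comes back full, the day is not exhausted yet: request the
        # next page of the same slice instead of moving on and losing records.
        records = response.json()  # assumes the API returns a plain JSON list
        if len(records) < MAX_ROWS:
            self._offset = 0
            return None
        self._offset = getattr(self, "_offset", 0) + len(records)
        return {"offset": self._offset}

    def request_params(self, stream_state, stream_slice=None, next_page_token=None) -> MutableMapping[str, Any]:
        params = {"max_rows": MAX_ROWS, **(stream_slice or {})}
        params.update(next_page_token or {})  # adds the hypothetical offset parameter
        return params

    def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]:
        yield from response.json()

If the API has no offset-style parameter, the same pattern can instead advance start_datetime to the timestamp of the last record returned within the day.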
Gunnar Lykins
02/24/2023, 6:45 PM0.44.1, and are running into an error regarding STRICT_COMPARISON_NORMALIZATION_TAG. It appears that this PR resolves this issue; when is this planned to be merged into master? Thanks! 🙂
cc: @Shashank Singh
Konstantin Lackner
02/24/2023, 7:00 PMproducts stream. Can someone help? Very much appreciated!
2023-02-24 18:57:24 normalization > Traceback (most recent call last):
2023-02-24 18:57:24 normalization > File "/usr/local/bin/transform-catalog", line 8, in <module>
2023-02-24 18:57:24 normalization > sys.exit(main())
2023-02-24 18:57:24 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 104, in main
2023-02-24 18:57:24 normalization > TransformCatalog().run(args)
2023-02-24 18:57:24 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 36, in run
2023-02-24 18:57:24 normalization > self.process_catalog()
2023-02-24 18:57:24 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 64, in process_catalog
2023-02-24 18:57:24 normalization > processor.process(catalog_file=catalog_file, json_column_name=json_col, default_schema=schema)
2023-02-24 18:57:24 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/catalog_processor.py", line 64, in process
2023-02-24 18:57:24 normalization > stream_processor.collect_table_names()
2023-02-24 18:57:24 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/stream_processor.py", line 227, in collect_table_names
2023-02-24 18:57:24 normalization > for child in self.find_children_streams(self.from_table, column_names):
2023-02-24 18:57:24 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/stream_processor.py", line 361, in find_children_streams
2023-02-24 18:57:24 normalization > elif is_combining_node(properties[field]):
2023-02-24 18:57:24 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/utils.py", line 120, in is_combining_node
2023-02-24 18:57:24 normalization > if data_type.ONE_OF_VAR_NAME in properties and any(
2023-02-24 18:57:24 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/utils.py", line 121, in <genexpr>
2023-02-24 18:57:24 normalization > data_type.WELL_KNOWN_TYPE_VAR_NAME in option[data_type.REF_TYPE_VAR_NAME] for option in properties[data_type.ONE_OF_VAR_NAME]
2023-02-24 18:57:24 normalization > KeyError: '$ref'
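For what it's worth, the traceback points at is_combining_node() indexing option["$ref"] for every option under a oneOf in the stream's JSON schema, so any oneOf option that lacks a "$ref" key raises exactly this KeyError. A minimal sketch of the failing pattern; the field name and schema below are hypothetical, not the real products schema:

# Hypothetical schema fragment; the real products stream schema may differ.
properties = {
    "price": {
        "oneOf": [
            {"$ref": "WellKnownTypes.json#/definitions/Number"},
            {"type": "string"},  # no "$ref" key here
        ]
    }
}

# Mirrors utils.py line 121 (constant values assumed): indexing option["$ref"]
# on the second option raises KeyError: '$ref'.
try:
    any("WellKnownTypes.json" in option["$ref"] for option in properties["price"]["oneOf"])
except KeyError as err:
    print(f"KeyError: {err}")

If the products schema has a oneOf whose options mix $ref entries and plain types, that would explain the crash.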
Konstantin Lackner
02/24/2023, 7:08 PMactive_users - so the daily, weekly and four weekly. However, all other tables are empty! Can someone point me in a direction on how to debug this? Would be greatly appreciated.
1. Airbyte version: 0.40.32
2. GA4 Connector version: 0.1.1
3. Find the logs attached in the thread
Shashank Singh
02/24/2023, 7:09 PMMayank V
02/24/2023, 7:32 PMMichael Taylor
02/24/2023, 8:13 PMFailure Origin: source, Message: Something went wrong in the connector. See the logs for more details.
when I run, and if I dig through hundreds of lines of log files I see this message:
pyarrow.lib.ArrowInvalid: In CSV column #6: CSV conversion error to int64: invalid value '1,830'
Is this related to the normalizer? Is there an option I can set to either leave things as strings or handle the commas in the numbers? I can't redo thousands of CSV file formats prior to loading. I hope I am missing something simple here.
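If this is one of the pyarrow-backed CSV sources (e.g. the S3/file source), one thing that may help is forcing the offending column to string via pyarrow's convert options, then stripping the commas and casting downstream; whether and where the connector exposes those reader options depends on the connector version. A standalone pyarrow sketch with a hypothetical column name:

import io

import pyarrow as pa
from pyarrow import csv

raw = b'id,amount\n1,"1,830"\n'  # hypothetical file; "amount" stands in for CSV column #6

# Reproduces the failure: the column is expected to be int64, but the value
# carries a thousands separator, so the conversion fails.
try:
    csv.read_csv(
        io.BytesIO(raw),
        convert_options=csv.ConvertOptions(column_types={"amount": pa.int64()}),
    )
except pa.ArrowInvalid as err:
    print(err)  # CSV conversion error to int64: invalid value '1,830'

# Workaround: read the column as a string so the sync can load the rows, then
# strip the commas and cast afterwards (in SQL, dbt, or a later transform).
table = csv.read_csv(
    io.BytesIO(raw),
    convert_options=csv.ConvertOptions(column_types={"amount": pa.string()}),
)
print(table.column("amount"))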
Mike B
02/24/2023, 8:44 PMBruno Azevedo
02/24/2023, 11:20 PMstream_slices method as full-refresh, but does not ingest any data, and after that it runs the correct stream_slices method with the correct date:
I put a `logger.info(slices)` inside the connector and here are the results:
Balasubramanian T K
02/25/2023, 8:32 AMBalasubramanian T K
02/25/2023, 8:34 AMBalasubramanian T K
02/25/2023, 9:02 AMMinhaj Pasha
02/25/2023, 2:16 PMMinhaj Pasha
02/25/2023, 2:16 PMSrikanth Sudhindra
02/25/2023, 3:25 PMAvi Sagal
02/26/2023, 3:44 PMi.am
02/26/2023, 7:45 PMCheryl Z
02/27/2023, 8:31 AMAlexander Schmidt
02/27/2023, 9:31 AM