Armand
04/16/2022, 12:36 AM

Satish Chinthanippu
05/13/2022, 4:58 AM

Satish Chinthanippu
05/13/2022, 5:00 AM

Craig Condie
06/15/2022, 6:04 AM

Craig Condie
06/16/2022, 11:55 PM

Shaik Shahid Afridi
07/11/2022, 11:03 AM

Neil Leonard
07/15/2022, 6:21 PM

Garrett McClintock
08/31/2022, 5:36 PM

Murat Cetink
10/05/2022, 10:33 PM

aidan
10/07/2022, 8:41 AM

Rahul Borse
10/11/2022, 10:47 AM

Gergely Imreh
10/13/2022, 2:39 AM

Nwani Victory
10/25/2022, 3:43 PM

Jeff Skoldberg
10/26/2022, 3:07 AM
generate.sh
. )
Now I'm back in the Windows environment and I'm at the pip install -r requirements.txt
step, but I get this:
ERROR: Could not find a version that satisfies the requirement pywin32==227; sys_platform == "win32" (from docker) (from versions: 302, 303, 304)
ERROR: No matching distribution found for pywin32==227; sys_platform == "win32"
• should I not try to continue the steps in Windows?
• Is there going to be better support for developing on Windows in the future?
• Any advice on the error?
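For what it's worth, the `; sys_platform == "win32"` marker means this pin only applies on Windows, and the error itself shows that only pywin32 302–304 are available for your Python build (227 predates recent Python versions). Since the pin comes from the `docker` package (per the "(from docker)" note), upgrading `docker` may resolve it; alternatively, a possible local workaround (an assumption, not a verified fix) is to relax the pin in requirements.txt:

```
# hypothetical local edit to requirements.txt: allow a pywin32 release
# that ships wheels for newer Python versions (only applies on Windows)
pywin32>=302 ; sys_platform == "win32"
```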
Thanks!

Arvind Patel
10/28/2022, 4:28 AM

Scott Chua
10/28/2022, 5:08 AM
python main.py check --config secrets/config.json
?
EmailOctopus's API docs say,
“If you’re making a JSON request, include a Content-Type: application/json
header.”
so I want to check what the actual requests being sent look like… 😄

Hridya Agrawal
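Regarding inspecting the requests above: one stdlib option is to build a request object and look at it before anything is sent. This is an illustrative sketch only; the URL and body below are placeholders, not EmailOctopus's real endpoint.

```python
import urllib.request

# Build (but don't send) a request carrying the header the EmailOctopus
# docs ask for. URL and payload are made-up placeholders.
req = urllib.request.Request(
    "https://example.com/api/campaigns",
    data=b'{"api_key": "..."}',
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Inspect what would go over the wire. Note that urllib normalizes header
# capitalization to "Content-type".
print(req.get_method())                # POST
print(req.get_header("Content-type"))  # application/json
```

For live traffic, one common trick is setting `http.client.HTTPConnection.debuglevel = 1` before running the check; the underlying HTTP layer then prints the outgoing request line and headers.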
10/30/2022, 12:06 PM

Hridya Agrawal
10/30/2022, 12:07 PM

Satish Chinthanippu
10/31/2022, 5:28 AM
Caused by: io.airbyte.workers.exception.WorkerException: Normalization Failed.
	at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:93) ~[io.airbyte-airbyte-workers-0.40.9.jar:?]
	at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:27) ~[io.airbyte-airbyte-workers-0.40.9.jar:?]
	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:159) ~[io.airbyte-airbyte-workers-0.40.9.jar:?]
	... 1 more
2022-10-31 05:02:04 normalization > usage: transform-config --config CONFIG --integration-type
2022-10-31 05:02:04 normalization > {DestinationType.BIGQUERY,DestinationType.CLICKHOUSE,DestinationType.MSSQL,DestinationType.MYSQL,DestinationType.ORACLE,DestinationType.POSTGRES,DestinationType.REDSHIFT,DestinationType.SNOWFLAKE,DestinationType.TIDB}
2022-10-31 05:02:04 normalization > --out OUT
2022-10-31 05:02:04 normalization > transform-config: error: argument --integration-type: invalid DestinationType value: 'teradata'
2022-10-31 05:02:04 normalization > Traceback (most recent call last):
2022-10-31 05:02:04 normalization > File "/usr/local/bin/transform-catalog", line 8, in <module>
2022-10-31 05:02:04 normalization > sys.exit(main())
2022-10-31 05:02:04 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 104, in main
2022-10-31 05:02:04 normalization > TransformCatalog().run(args)
2022-10-31 05:02:04 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 35, in run
2022-10-31 05:02:04 normalization > self.parse(args)
2022-10-31 05:02:04 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 46, in parse
2022-10-31 05:02:04 normalization > profiles_yml = read_profiles_yml(parsed_args.profile_config_dir)
2022-10-31 05:02:04 normalization > File "/usr/local/lib/python3.9/site-packages/normalization/transform_catalog/transform.py", line 75, in read_profiles_yml
2022-10-31 05:02:04 normalization > with open(os.path.join(profile_dir, "profiles.yml"), "r") as file:
2022-10-31 05:02:04 normalization > FileNotFoundError: [Errno 2] No such file or directory: '/data/21/2/normalize/profiles.yml'
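For what it's worth, the usage text above shows the root cause: 'teradata' is not among the DestinationType values this normalization image knows, so transform-config exits before profiles.yml is ever written, and transform-catalog then fails with the FileNotFoundError. The rejection itself is ordinary argparse behavior, sketched here with a made-up subset of types (not Airbyte's actual code):

```python
import argparse

# Mimic transform-config's --integration-type argument, which only accepts
# a fixed set of destination types (subset shown; illustrative only).
parser = argparse.ArgumentParser(prog="transform-config")
parser.add_argument(
    "--integration-type",
    choices=["bigquery", "postgres", "snowflake"],  # no "teradata"
)

try:
    parser.parse_args(["--integration-type", "teradata"])
except SystemExit:
    # argparse rejects values outside `choices` and exits, just as the
    # normalization container does before profiles.yml is created.
    print("rejected: teradata is not a supported destination type")
```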
Can anyone help to resolve this issue?

Satish Chinthanippu
10/31/2022, 6:02 AM

Henri Blancke
11/02/2022, 6:59 PM

Marlon Gómez
11/04/2022, 2:23 PM
Requested normalization for marlon/mydestconn:v1.0.0, but it is not included in the normalization mappings.
Can anyone point me in the right direction to support normalization in my destination connector?

Rachel RIZK
11/04/2022, 5:09 PM
I'm using field_pointer with DpathExtractor, but my data is nested inside a list like this:
testdata = {
    'data': [
        {'actual_data': { 'lotsofdata' }},
        {'metadata': 2},
        {'null': 3},
    ],
}
If I set field_pointer: ["data"] it works fine, but I actually only want the value of "actual_data", which is inside a list.
I've tried field_pointer: ["data[0]", "actual_data"] but it doesn't work, and it won't allow any string interpolation.
Is there a way to retrieve data within a list with DpathExtractor? 😅
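Possibly helpful: dpath-style paths address list elements with a separate component ("0") or a wildcard ("*") rather than bracket syntax, so field_pointer: ["data", "*", "actual_data"] may be worth a try (whether the low-code CDK accepts wildcards here is an assumption on my part). The lookup semantics, sketched in plain Python rather than the real DpathExtractor, with a concrete value substituted for the sample's placeholder:

```python
# A plain-Python sketch of dpath-style lookup semantics; illustrative,
# not Airbyte's actual DpathExtractor.
def extract(obj, pointer):
    """Follow pointer components through nested dicts/lists.

    "*" fans out over every element of a list; a digit string indexes it.
    """
    if not pointer:
        return [obj]
    head, *rest = pointer
    if isinstance(obj, list):
        if head == "*":
            return [found for item in obj for found in extract(item, rest)]
        return extract(obj[int(head)], rest)
    if isinstance(obj, dict) and head in obj:
        return extract(obj[head], rest)
    return []

testdata = {"data": [{"actual_data": {"kpi": 1}}, {"metadata": 2}, {"null": 3}]}

print(extract(testdata, ["data", "0", "actual_data"]))  # [{'kpi': 1}]
print(extract(testdata, ["data", "*", "actual_data"]))  # [{'kpi': 1}]
```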
Thanks for your help 🙏

JJ Nilbodee
11/07/2022, 2:38 PM

Rachel RIZK
11/07/2022, 4:38 PM
I'd like to sync data as incremental - append instead of full refresh - overwrite/append with the low-code CDK.
Here's how the data is structured:
testdata = {
'rows': [
{
'data_row1': [{ 'kpi': 2, 'date': "2022-11-07" }],
'metadata_row1': {'id': 1}
},
{
'data_row2': [{ 'kpi': 8, 'date': "2022-11-07" }],
'metadata_row2': {'id': 2}
},
]
}
• To get all rows, I need to set record_selector on the rows field
• But to get the date field (enabling increment with slicing), I need to set the cursor_field to something like ["data_row1", "0", "date"] or data_row1[0]['date'].
◦ With the 1st solution, I get an error because it won't accept a list
◦ With the 2nd solution, the script runs, but the cursor is not working and displays older data than what's in sample_state.json
I'm wondering, is there actually a way to activate increment on this kind of data?
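One workaround pattern is to reshape each row so the cursor value is top-level before the cursor is applied, assuming a transformation step or custom record selection is available at that point. Below, flatten_rows is a made-up helper (not a CDK feature), and the field names come from the sample above:

```python
# Reshape the nested rows above so each record carries a top-level "date"
# that an incremental cursor can read directly.
def flatten_rows(payload):
    records = []
    for row in payload["rows"]:
        # Each row holds one "data_rowN" list plus one "metadata_rowN" dict;
        # keep only the data records themselves.
        for key, value in row.items():
            if key.startswith("data_"):
                for record in value:  # e.g. {"kpi": 2, "date": "2022-11-07"}
                    records.append(dict(record))
    return records

testdata = {
    "rows": [
        {"data_row1": [{"kpi": 2, "date": "2022-11-07"}], "metadata_row1": {"id": 1}},
        {"data_row2": [{"kpi": 8, "date": "2022-11-07"}], "metadata_row2": {"id": 2}},
    ]
}

print(flatten_rows(testdata))
# [{'kpi': 2, 'date': '2022-11-07'}, {'kpi': 8, 'date': '2022-11-07'}]
```

With records flattened like this, cursor_field can simply be "date".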
It's the last step before I can contribute a new connector 🙏

Aazam Thakur
11/10/2022, 6:47 AM

Satish Chinthanippu
11/10/2022, 4:26 PM
2022-11-10 14:49:33 INFO o.t.i.RemoteDockerImage(resolve):75 - Pulling docker image: postgres:13-alpine. Please be patient; this may take some time but only needs to be done once.
2022-11-10 14:49:34 ERROR c.g.d.a.a.ResultCallbackTemplate(onError):52 - Error during callback
com.github.dockerjava.api.exception.InternalServerErrorException: Status 500: {"message":"toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limit"}
at org.testcontainers.shaded.com.github.dockerjava.core.DefaultInvocationBuilder.execute(DefaultInvocationBuilder.java:247) ~[testcontainers-1.17.3.jar:?]
at org.testcontainers.shaded.com.github.dockerjava.core.DefaultInvocationBuilder.lambda$executeAndStream$1(DefaultInvocationBuilder.java:269) ~[testcontainers-1.17.3.jar:?]
at java.lang.Thread.run(Thread.java:833) ~[?:?]
2022-11-10 14:49:34 WARN o.t.i.RemoteDockerImage(resolve):105 - Retrying pull for image: postgres:13-alpine (119s remaining)
2022-11-10 14:49:35 ERROR c.g.d.a.a.ResultCallbackTemplate(onError):52 - Error during callback
com.github.dockerjava.api.exception.InternalServerErrorException: Status 500: {"message":"toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limit"}
at org.testcontainers.shaded.com.github.dockerjava.core.DefaultInvocationBuilder.execute(DefaultInvocationBuilder.java:247) ~[testcontainers-1.17.3.jar:?]
at org.testcontainers.shaded.com.github.dockerjava.core.DefaultInvocationBuilder.lambda$executeAndStream$1(DefaultInvocationBuilder.java:269) ~[testcontainers-1.17.3.jar:?]
at java.lang.Thread.run(Thread.java:833) ~[?:?]
2022-11-10 14:49:35 WARN o.t.i.RemoteDockerImage(resolve):105 - Retrying pull for image: postgres:13-alpine (118s remaining)
> Task :airbyte-api:compileJava
Note: /root/airbyte/airbyte-api/build/generated/api/client/src/main/java/io/airbyte/api/client/invoker/generated/JSON.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
> Task :airbyte-db:jooq:generateConfigsDatabaseJooq
2022-11-10 14:49:35 ERROR c.g.d.a.a.ResultCallbackTemplate(onError):52 - Error during callback
com.github.dockerjava.api.exception.InternalServerErrorException: Status 500: {"message":"toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limit"}
Rachel RIZK
11/11/2022, 9:37 AM
I noticed lookback_window was not working as expected for incremental syncs 😬
• it's only applied to the original start_date, but not to the cursor_field used to do the increment
• so it never does any lookback after the first sync
• after some research, I stumbled upon this issue, which was opened 3 months ago
Since it's a marketing-related connector (with an attribution window) it's a bit painful because we can't use Incremental syncs in that case.
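For context, the behavior described in the bullets versus the behavior an attribution window needs can be sketched as follows; effective_start and its parameters are illustrative names, not the low-code CDK's internals:

```python
from datetime import date, timedelta

# Desired behavior: on each incremental sync after the first, start from
# the saved cursor minus the lookback window (never earlier than the
# configured start_date). The reported bug is that the lookback is only
# applied to start_date, so later syncs ignore it.
def effective_start(start_date, saved_cursor, lookback_days):
    if saved_cursor is None:  # first sync: nothing to look back from
        return start_date
    return max(start_date, saved_cursor - timedelta(days=lookback_days))

print(effective_start(date(2022, 1, 1), date(2022, 11, 10), 7))  # 2022-11-03
print(effective_start(date(2022, 1, 1), None, 7))                # 2022-01-01
```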
By any chance, has anyone encountered a similar issue and found a potential workaround?...

Marlon Gómez
11/14/2022, 6:54 PM

Krzysztof
11/16/2022, 12:12 PM