Sheshan
01/08/2023, 2:57 PM
(.venv) sheshan@sheshan:~/Documents/Airbyte/airbyte/airbyte-integrations/connectors/destination-weav-destination$ cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/sample_files:/sample_files peeyushweav/weav_destination-custom-python:c4prod write --config /secrets/config.json --catalog /sample_files/configured_catalog.json
{"type": "LOG", "log": {"level": "INFO", "message": "Begin writing to the destination..."}}
{"type": "LOG", "log": {"level": "FATAL", "message": "write() missing 1 required positional argument: 'logger'\nTraceback (most recent call last):\n File \"/airbyte/integration_code/main.py\", line 11, in <module>\n DestinationWeavDestination().run(sys.argv[1:])\n File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/destinations/destination.py\", line 108, in run\n for message in output_messages:\n File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/destinations/destination.py\", line 103, in run_cmd\n yield from self._run_write(config=config, configured_catalog_path=parsed_args.catalog, input_stream=wrapped_stdin)\n File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/destinations/destination.py\", line 47, in _run_write\n yield from self.write(config=config, configured_catalog=catalog, input_messages=input_messages)\nTypeError: write() missing 1 required positional argument: 'logger'"}}
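A minimal sketch of what the traceback implies (the class names here are illustrative, not the connector's actual code): the CDK calls write() with only config, configured_catalog, and input_messages as keyword arguments, so a destination whose write() declares an extra required positional parameter such as logger will fail exactly like this.

```python
# Per the traceback, the CDK's Destination._run_write invokes:
#   self.write(config=config, configured_catalog=catalog, input_messages=input_messages)

class DestinationBroken:
    # Declaring an extra required positional `logger` reproduces:
    # TypeError: write() missing 1 required positional argument: 'logger'
    def write(self, logger, config, configured_catalog, input_messages):
        yield from ()

class DestinationFixed:
    # Matching the CDK's keyword call exactly avoids the TypeError;
    # if a logger is needed, create one inside the method instead.
    def write(self, config, configured_catalog, input_messages):
        import logging
        logger = logging.getLogger("airbyte")
        logger.debug("Begin writing to the destination...")
        yield from ()  # yield AirbyteMessages (e.g. state) here
```

Removing logger from the signature (and obtaining one inside the method) should resolve the FATAL above.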
Chen Huang
01/08/2023, 4:19 PM
For the issues stream, the API /rest/api/3/search has a limit of 100 rows of changelogs. To get the full history of changelogs, we have to call another API, /rest/api/3/issue/{issueIdOrKey}/changelog, once we know which issue IDs have more than 100 changelogs in their history.
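A rough sketch of that second call's paging loop (assuming the usual Jira Cloud pagination shape of startAt/maxResults/isLast; the fetch_page callable wrapping the actual HTTP GET is an assumption, not part of any connector):

```python
def fetch_all_changelogs(fetch_page, page_size=100):
    """Collect every changelog entry for one issue by paging startAt.

    fetch_page(start_at, max_results) is assumed to GET
    /rest/api/3/issue/{issueIdOrKey}/changelog and return the parsed
    JSON page: {"values": [...], "isLast": bool, ...}.
    """
    entries, start_at = [], 0
    while True:
        page = fetch_page(start_at, page_size)
        entries.extend(page.get("values", []))
        # Stop when the server marks the last page or returns nothing.
        if page.get("isLast", True) or not page.get("values"):
            return entries
        start_at += len(page["values"])
```

A real fetch_page might wrap requests.get(url, params={"startAt": start_at, "maxResults": max_results}, auth=...).json(); only issues whose search response reports more than 100 changelog entries need this extra call.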
I'm wondering if anyone has experience with this, and how it was solved? Thanks

Sandeep Yadav
01/08/2023, 10:31 PM
CONTAINER_ALREADY_STARTED_PLACEHOLDER is not getting created within the container. Looking at this bash script, I feel the container will throw this error again and again unless this command succeeds: touch $CONTAINER_ALREADY_STARTED. Please help with your assistance 🙏
Done init
PostgreSQL started.
touch: CONTAINER_ALREADY_STARTED_PLACEHOLDER: Permission denied
2023-01-08T22:26:34.633Z ERROR Unable to create SQL database. {"error": "pq: database \"temporal\" already exists", "logging-call-at": "handler.go:97"}
2023-01-08T22:26:34.653Z INFO Starting schema setup {"config": {"SchemaFilePath":"","InitialVersion":"0.0","Overwrite":false,"DisableVersioning":false}, "logging-call-at": "setuptask.go:57"}
Kyle Cheung
01/09/2023, 3:39 AM
''?

Stuart Horgan
01/09/2023, 9:39 AM

Vu Le Hoang
01/09/2023, 9:48 AM
custom fields in table issues. According to my data, nearly 90% of the ingested data has lost its custom fields. Is there a bug?
Airbyte version: 0.40.26
JIRA Source Version: 0.3.1
Stuart Horgan
01/09/2023, 11:03 AM
"A redirect URI, or reply URL, is the location where the authorization server sends the user once the app has been successfully authorized and granted an authorization code or access token."
How does that relate to the purpose of this Airbyte connector fetching data into a warehouse? What would I be redirecting a user to?

Luis Martinez
01/09/2023, 11:28 AM

Mudasir Mirza
01/09/2023, 1:18 PM
2023-01-07 12:10:04 source > Syncing stream: issue_custom_field_contexts
2023-01-07 12:10:08 source > {"errorMessages":["The custom field was not found."],"errors":{}}
2023-01-07 12:10:08 source > Encountered an exception while reading stream issue_custom_field_contexts
This should only cause the Jira source to fail, and even that shouldn't happen, as I have not selected this stream for sync.
Any idea how I can start debugging?

Mudasir Mirza
12/29/2022, 4:18 PM
Trying to add faros as a destination in airbyte, but literally nothing happens.
1. Added a new faros connector.
2. I am configuring Faros as the destination as shown in the screenshot, and nothing happens; every time I go to destinations I am asked to add a destination.
I am not sure how to proceed to the next step, as there is no button for either save or next.
Charan raj
01/02/2023, 7:51 AM

Anya Fedotova
01/02/2023, 11:59 AM

Zaza Javakhishvili
01/03/2023, 7:21 PM
-- 2023-01-02 19:10:20.191 <- service call/processing start datetime
-- 2023-01-02 19:45:03.546000 UTC <- airbyte min emitted datetime
-- 2023-01-02 21:00:10.844000 UTC <- airbyte max emitted datetime
-- 2023-01-02 21:05:01.717 <- service call/processing end datetime
Edgar Rafii Manzo
01/04/2023, 8:44 AM

Esteban Palomeque
01/09/2023, 4:22 PM
2023-01-09 15:50:07 ERROR i.a.s.a.ApiHelper(execute):28 - Unexpected Exception
java.lang.IllegalStateException: Duplicate key 206e4781-61fd-4f0c-ba8f-1a2cd209f920 (attempted merging values io.airbyte.config.ActorCatalogFetchEvent@126b6c70[id=<null>,actorId=206e4781-61fd-4f0c-ba8f-1a2cd209f920,actorCatalogId=6345f8d9-8541-4f6f-9ab1-8706aea73482,configHash=<null>,connectorVersion=<null>,createdAt=1673273811] and io.airbyte.config.ActorCatalogFetchEvent@7ba6bea8[id=<null>,actorId=206e4781-61fd-4f0c-ba8f-1a2cd209f920,actorCatalogId=64c3d44c-0d0d-45c4-8b8e-f0e863bd047a,configHash=<null>,connectorVersion=<null>,createdAt=1673273811])
at java.util.stream.Collectors.duplicateKeyException(Collectors.java:135) ~[?:?]
Please help me with this, I really don't know what's happening here.

tesfaye alemayehu
01/04/2023, 1:41 PM

Keshav Gupta
01/05/2023, 7:26 AM

Maxime Morelli
01/05/2023, 10:20 AM

Renato Todorov
01/09/2023, 5:18 PM
check_connection calls are returning an "upstream request timeout" error, and I'm currently not able to understand where it is coming from, because there are absolutely no errors in the logs, even with DEBUG enabled. I've logged an issue here: https://github.com/airbytehq/airbyte/issues/20963; any help is appreciated. Can anyone tell me how I can further debug this issue? I've exec'd into the server container and it can communicate and connect to all the other hosts (temporal and webapp, for example). I'm stuck and have no idea how to debug it further.

Noor Thabit
01/09/2023, 6:39 PM

Matheus Pinheiro
01/09/2023, 7:19 PM

Siva Kowsika
01/09/2023, 8:12 PM

Steven Wilber
01/09/2023, 9:07 PM
Oops! Something went wrong… Unknown error occurred
2023-01-09 21:00:48 ERROR i.a.s.e.UncaughtExceptionMapper(toResponse):22 - Uncaught exception
airbyte-server | java.lang.IllegalStateException: Duplicate key 1ab0e414-c5d6-4920-a4dc-e5830965035a (attempted merging values io.airbyte.config.ActorCatalogFetchEvent@1bf15df7[id=<null>,actorId=1ab0e414-c5d6-4920-a4dc-e5830965035a,actorCatalogId=c1d6532c-068f-427c-a6f5-7e8dd6bdd037,configHash=<null>,connectorVersion=<null>,createdAt=1673286105] and io.airbyte.config.ActorCatalogFetchEvent@476a0ab2[id=<null>,actorId=1ab0e414-c5d6-4920-a4dc-e5830965035a,actorCatalogId=3deaf88f-f287-405c-aa3c-dee60d5ce8a3,configHash=<null>,connectorVersion=<null>,createdAt=1673286105])
I've no clue what to do to fix the issue. Can anyone give me a clue, please?
Christopher Wu
01/09/2023, 9:39 PM

Jason Maddern
01/10/2023, 5:29 AM
run --selector partner_staging
That then goes and:
• Pulls down my github repo and docker image
• Builds the image and dynamically creates the profiles.yml file
The issue I'm facing is that when Airbyte auto-builds the profiles.yml from the airbyte destination config, it seems to auto-create an output of prod, and I have no way of overriding it. It looks like this:
config:
  partial_parse: true
  printer_width: 120
  send_anonymous_usage_stats: false
  use_colors: true
normalize:
  outputs:
    prod:
      account: {obfuscated}
      client_session_keep_alive: false
      connect_retries: 3
      connect_timeout: 15
      database: DEV_PP_AIRBYTE_DB
      password: {obfuscated}
      query_tag: normalization
      retry_all: true
      retry_on_database_errors: true
      role: DEV_LOADER
      schema: PP
      threads: 5
      type: snowflake
      user: {obfuscated}
      warehouse: DEV_LOADING
  target: prod
This is problematic when defaulting to prod, because I have a heap of jinja in dbt to handle prod/nonprod environments, like this:
+database: |
  {%- if "dev" in target.name -%} DEV_STUDYLINK_ANALYTICS
  {%- elif "prod" in target.name -%} STUDYLINK_ANALYTICS
  {%- else -%} invalid_staging_admit_database
  {%- endif -%}
This isn't an issue if I run dbt myself, because I can have my own profiles outputs called dev or prod and inject the --target on run. This allows the same code to run across prod/nonprod. But it's a problem for Airbyte, as it seems to auto-create the fixed values of normalize and prod as the target.
How can we configure airbyte so that when it does dbt transformations it doesn't auto-create the profiles with target prod?
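One possible workaround (an assumption, not a documented Airbyte feature): since the generated profile always uses target prod, key the prod/nonprod branching off an environment variable via dbt's env_var() instead of target.name. DBT_TARGET_ENV below is a hypothetical variable name, and whether the Airbyte-managed dbt run can see such a variable depends on the deployment, so treat this as a sketch:

```
+database: |
  {%- if env_var("DBT_TARGET_ENV", "dev") == "prod" -%} STUDYLINK_ANALYTICS
  {%- else -%} DEV_STUDYLINK_ANALYTICS
  {%- endif -%}
```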
FYI @Varun Khanna

Christoph Pirkl
01/10/2023, 7:07 AM

Talha Asif
01/10/2023, 8:09 AM

Alok Yermalkar
01/10/2023, 9:02 AM
repoURL: <https://airbytehq.github.io/helm-charts>
targetRevision: v0.43.18
chart: airbyte
Ron Handler
01/10/2023, 1:13 PM
Built the webapp with ./gradlew :airbyte-webapp:assemble and pushed it to my registry. Set the webapp.image.repository and webapp.image.tag values accordingly.
Once deployed with helm, the webapp pod enters a crash loop backoff with this in the logs:
/docker-entrypoint.sh: /docker-entrypoint.d/ is not empty, will attempt to perform configuration
/docker-entrypoint.sh: Looking for shell scripts in /docker-entrypoint.d/
/docker-entrypoint.sh: Launching /docker-entrypoint.d/10-listen-on-ipv6-by-default.sh
10-listen-on-ipv6-by-default.sh: info: Getting the checksum of /etc/nginx/conf.d/default.conf
10-listen-on-ipv6-by-default.sh: info: Enabled listen on IPv6 in /etc/nginx/conf.d/default.conf
/docker-entrypoint.sh: Launching /docker-entrypoint.d/20-envsubst-on-templates.sh
20-envsubst-on-templates.sh: Running envsubst on /etc/nginx/templates/default.conf.template to /etc/nginx/conf.d/default.conf
/docker-entrypoint.sh: Launching /docker-entrypoint.d/30-tune-worker-processes.sh
/docker-entrypoint.sh: Configuration complete; ready for start up
2023/01/10 12:57:19 [emerg] 1#1: host not found in upstream "$CONNECTOR_BUILDER_API_HOST" in /etc/nginx/conf.d/default.conf:6
nginx: [emerg] host not found in upstream "$CONNECTOR_BUILDER_API_HOST" in /etc/nginx/conf.d/default.conf:6
This seems very similar to https://github.com/airbytehq/airbyte/issues/19642, which I believe should already have been fixed in 0.40.22.
Would appreciate some help!