# ask-ai
The Airbyte Airflow provider is failing with the following error:

[2024-09-25, 08:07:37 UTC] {standard_task_runner.py:110} ERROR - Failed to execute job 84 for task import.import-CUSTOMER.trigger-sync (Invalid URL 'airbyte-airbyte-server-svc.airbyte.svc.cluster.local/jobs': No scheme supplied. Perhaps you meant https://airbyte-airbyte-server-svc.airbyte.svc.cluster.local/jobs?; 455)
[2024-09-25, 08:07:38 UTC] {local_task_job_runner.py:243} INFO - Task exited with return code 1
Based on the error message you provided, it seems like the URL for the Airbyte server in your Airflow task is missing the scheme (http or https). The error suggests that you might want to use `https://airbyte-airbyte-server-svc.airbyte.svc.cluster.local/jobs` instead of `airbyte-airbyte-server-svc.airbyte.svc.cluster.local/jobs`; in practice this usually means the host configured for the Airbyte connection in Airflow needs an explicit `http://` or `https://` prefix.
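
Here is a minimal sketch of what that can look like, assuming you use the provider's `AirbyteTriggerSyncOperator` and define the connection as a JSON-style environment variable (supported since Airflow 2.3). The connection id `airbyte_default`, the port, the DAG name, and the connection UUID are placeholders to replace with your own values:

```python
# Minimal sketch: give the Airflow connection an explicit http:// scheme and
# trigger the sync with the Airbyte provider's operator. The connection id,
# port, and connection UUID are placeholders, not values from the original log.
#
# Example connection, defined as a JSON env var (Airflow 2.3+):
#   export AIRFLOW_CONN_AIRBYTE_DEFAULT='{"conn_type": "airbyte",
#       "host": "http://airbyte-airbyte-server-svc.airbyte.svc.cluster.local",
#       "port": 8001}'
from datetime import datetime

from airflow import DAG
from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

with DAG(
    dag_id="import_customer_sync",  # hypothetical DAG name
    start_date=datetime(2024, 9, 1),
    schedule=None,
    catchup=False,
) as dag:
    trigger_sync = AirbyteTriggerSyncOperator(
        task_id="trigger-sync",
        airbyte_conn_id="airbyte_default",  # the connection defined above
        connection_id="00000000-0000-0000-0000-000000000000",  # your Airbyte connection UUID
        asynchronous=False,  # block until the sync job finishes
        timeout=3600,
        wait_seconds=30,  # poll interval while waiting
    )
```

The same connection can also be created in the Airflow UI (Admin → Connections); the key point is that the host value carries the scheme.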
The Airbyte operator in Apache Airflow lets you trigger Airbyte OSS synchronization jobs from Airflow. However, it is designed to work with the internal Config API rather than the newer Airbyte API, and it is not intended for orchestrating Airbyte Cloud. If you're using Airbyte Cloud, consider using Airflow's HTTP operators with the new Airbyte API instead. You can find more information in the [Airbyte Operator Guide](https://docs.airbyte.com/operator-guides/using-the-airflow-airbyte-operator). If you're still encountering issues, I recommend reaching out to the Airbyte community on [Slack](https://slack.airbyte.com/) for further assistance.
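
For the Airbyte Cloud route, a rough sketch with Airflow's HTTP operator might look like the following. The `airbyte_api` connection, the Airflow Variable holding the API token, and the connection UUID are assumptions for illustration, so double-check the endpoint, payload, and authentication against the current Airbyte API reference:

```python
# Rough sketch: trigger a sync through the public Airbyte API with Airflow's
# HTTP operator. Assumes an HTTP connection "airbyte_api" whose host is
# https://api.airbyte.com, an Airflow Variable "airbyte_api_token" holding a
# bearer token, and a placeholder connection UUID.
import json
from datetime import datetime

from airflow import DAG
from airflow.providers.http.operators.http import SimpleHttpOperator  # named HttpOperator in newer provider versions

with DAG(
    dag_id="airbyte_cloud_sync",  # hypothetical DAG name
    start_date=datetime(2024, 9, 1),
    schedule=None,
    catchup=False,
) as dag:
    trigger_cloud_sync = SimpleHttpOperator(
        task_id="trigger-cloud-sync",
        http_conn_id="airbyte_api",  # points at https://api.airbyte.com
        endpoint="v1/jobs",
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
            "Authorization": "Bearer {{ var.value.airbyte_api_token }}",
        },
        data=json.dumps({
            "connectionId": "00000000-0000-0000-0000-000000000000",  # your Airbyte connection UUID
            "jobType": "sync",
        }),
        log_response=True,  # surface the API response (including the job id) in the task log
    )
```

Treat the endpoint and request fields as things to verify rather than copy verbatim, since the Airbyte API may evolve.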