Ritu Wadhwani
09/26/2024, 1:37 PM

user
09/26/2024, 1:38 PM

Ritu Wadhwani
09/26/2024, 1:51 PM

user
09/26/2024, 1:51 PM
user
09/26/2024, 1:52 PM
1. Build your custom connector image with the `docker build` command, for example: `docker build . -t my-custom-connectors/source-custom:0.1.0`.
2. Create your image tag with the `docker tag` command. The structure of the remote tag depends on your cloud provider's container registry service.
3. Use `docker push <image-name>:<tag>` to push the image to your private Docker registry.

You can then use this custom connector in your Airbyte instance by specifying its Docker image name when you set up a new source or destination. For more detailed instructions on creating a custom connector, you can refer to the [Airbyte documentation](https://docs.airbyte.com/connector-development/connector-builder-ui/overview). Please note that you'll need to maintain your custom connector separately from the Airbyte repository, which means you'll need to manually merge any updates from the original OneDrive connector into your custom connector.
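For reference, a rough sketch of those three steps, assuming a hypothetical private registry host (substitute your own registry host and repository path), might look like:

```bash
# 1. Build the connector image locally (run from the connector's directory)
docker build . -t my-custom-connectors/source-custom:0.1.0

# 2. Tag the image for the private registry; the registry host below is a placeholder
docker tag my-custom-connectors/source-custom:0.1.0 \
  registry.example.com/airbyte-connectors/source-custom:0.1.0

# 3. Push the tagged image to the private registry
docker push registry.example.com/airbyte-connectors/source-custom:0.1.0
```

The pushed image name and tag (`registry.example.com/airbyte-connectors/source-custom:0.1.0` in this sketch) is what you would then enter when adding the custom connector in Airbyte.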
Ritu Wadhwani
09/26/2024, 3:52 PM

user
09/26/2024, 3:52 PM

user
09/26/2024, 3:53 PM
From the root of the Airbyte repository, build your connector (located at `airbyte-integrations/connectors/destination-<name>`):

```bash
./gradlew :airbyte-integrations:connectors:destination-<name>:build
```

Replace `<name>` with the name of your connector. This command builds a Docker image with your connector's code and tags it with the `dev` tag. Please note that each time you make a change to your implementation, you need to re-build the connector image using the above command. This is because the Docker image contains a snapshot of your connector's code at the time the image was built; any changes you make after building the image will not be included unless you rebuild it. For more information, you can refer to the [Airbyte documentation](https://docs.airbyte.com/connector-development/tutorials/building-a-java-destination#directly-running-the-destination-using-docker).
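As a minimal sketch, assuming a hypothetical connector named `destination-mysql-custom` and the `airbyte/<connector>:dev` image naming used in the linked tutorial, the build-and-verify loop might look like:

```bash
# Rebuild the connector image after each code change
# ("destination-mysql-custom" is a hypothetical connector name)
./gradlew :airbyte-integrations:connectors:destination-mysql-custom:build

# Sanity-check the freshly built dev image by printing the connector spec
docker run --rm airbyte/destination-mysql-custom:dev spec
```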
Ritu Wadhwani
09/26/2024, 4:06 PM

user
09/26/2024, 4:06 PM

user
09/26/2024, 4:06 PM

Ritu Wadhwani
09/26/2024, 5:29 PM

user
09/26/2024, 5:29 PM
user
09/26/2024, 5:30 PM
The `main.py` of the `source-microsoft-onedrive` connector is the entry point for the connector, and it gets called when you run the connector. When you set up a new source in Airbyte and choose Microsoft OneDrive, Airbyte internally runs the `main.py` script of the `source-microsoft-onedrive` connector. This script handles the command-line arguments and invokes the appropriate function based on the command you're running (like `check`, `discover`, `read`, etc.).

For example, if you're running the `check` command to verify the connection to Microsoft OneDrive, Airbyte will internally execute something like `python main.py check --config config.json`. Here, `main.py` is the script being run, `check` is the command, and `--config config.json` is an argument specifying the configuration file.

You can also run the `main.py` script manually for debugging or development purposes. For example, you can run `python main.py spec` to output the JSON schema for the connector's configuration. For more information, you can refer to the [Airbyte documentation](https://docs.airbyte.com/connector-development/tutorials/the-hard-way/build-a-connector-the-hard-way#implement-the-spec-operation) and the [connector documentation](https://docs.airbyte.com/integrations/sources/microsoft-onedrive).
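For local debugging, a minimal sketch of the standard connector commands might look like the following; the config and catalog file paths are illustrative and assume files you create yourself:

```bash
# Run from the connector's directory,
# e.g. airbyte-integrations/connectors/source-microsoft-onedrive

# Print the connector's configuration spec (no config file needed)
python main.py spec

# Check connectivity with your own config file (path is illustrative)
python main.py check --config secrets/config.json

# Discover the schemas of the available streams
python main.py discover --config secrets/config.json

# Read records using a configured catalog (path is illustrative)
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```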
Ritu Wadhwani
09/26/2024, 5:31 PM

user
09/26/2024, 5:31 PM
user
09/26/2024, 5:31 PM
Syncs are triggered through the `/sync` endpoint. This is handled by the `syncConnection` function in the `ConnectionApiController` class in the Airbyte Server. Here is the relevant code snippet from the [knowledge source](https://github.com/airbytehq/airbyte/discussions/33811):

```java
@Override
@Post(uri = "/sync")
@Secured({EDITOR})
@SecuredWorkspace
@ExecuteOn(AirbyteTaskExecutors.SCHEDULER)
public JobInfoRead syncConnection(@Body final ConnectionIdRequestBody connectionIdRequestBody) {
  return ApiHelper.execute(() -> schedulerHandler.syncConnection(connectionIdRequestBody));
}
```

This function takes a `ConnectionIdRequestBody` object as input, which contains the ID of the connection to be synced. It then calls the `syncConnection` method of the `SchedulerHandler` class, which starts the sync process. Please note that the actual data extraction is performed by the specific source connector for the connection being synced. The source connector is a separate component responsible for connecting to the data source and extracting data. The specific file or function within the source connector that is called during the sync process depends on the implementation of that connector.
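As a rough sketch of triggering that endpoint from outside the UI (the host, port, and connection ID below are placeholders, and the `/api/v1` prefix may vary by deployment and version), a sync could be started with:

```bash
# Ask the Airbyte server to start a sync for a given connection.
# Host, port, and connectionId are placeholders for your own deployment.
curl -X POST "http://localhost:8001/api/v1/connections/sync" \
  -H "Content-Type: application/json" \
  -d '{"connectionId": "11111111-2222-3333-4444-555555555555"}'
```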
Ritu Wadhwani
09/26/2024, 5:55 PM

user
09/26/2024, 5:55 PM

user
09/26/2024, 5:55 PM

Ritu Wadhwani
09/26/2024, 5:57 PM

user
09/26/2024, 5:57 PM

user
09/26/2024, 5:57 PM

Ritu Wadhwani
09/26/2024, 6:00 PM

user
09/26/2024, 6:00 PM

user
09/26/2024, 6:00 PM

Ritu Wadhwani
09/26/2024, 6:09 PM

user
09/26/2024, 6:09 PM

user
09/26/2024, 6:09 PM

Ritu Wadhwani
09/26/2024, 6:44 PM

user
09/26/2024, 6:44 PM
user
09/26/2024, 6:44 PM
These logs show Airbyte running the `discover` command for the `source-microsoft-onedrive` connector. This command is used to discover the schema for the source. Here's a breakdown of what's happening:

1. Airbyte checks if the Docker image for the `source-microsoft-onedrive` connector exists locally.
2. The Docker image is found locally.
3. Airbyte creates a Docker container for the `discover` command.
4. Airbyte prepares the command to run the Docker container. This command includes various environment variables and arguments.
5. The Docker container is run with the `discover` command and the `source_config.json` configuration file.
6. Airbyte reads messages from the connector using protocol version 0.2.0.
7. The `discover` command finishes running.

If you're having issues with the `discover` command, you might want to check the `source_config.json` file to make sure it's correctly configured. If you're still having issues, you can ask for help in the [Airbyte community](https://discuss.airbyte.io/).
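To reproduce this step outside the platform, a minimal sketch (the image tag and file paths are illustrative) would be to run the `discover` command against the connector image directly:

```bash
# Run discover against the connector image, mounting a local directory
# that contains your source_config.json. Tag and paths are illustrative.
docker run --rm \
  -v "$(pwd)/secrets:/secrets" \
  airbyte/source-microsoft-onedrive:dev \
  discover --config /secrets/source_config.json
```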