# ask-ai
@kapa.ai I want to add a query parameter “to” set to today. How do I do this in the Builder?
To add a query parameter "to" set to today in the Airbyte Connector Builder UI, follow these steps:

1. In the "Incremental sync" section of the UI builder, set the "Cursor field" to the field that represents the date in your data source.
2. Set the "Datetime format" to match the format of the date in your data source.
3. Set the "End datetime" to "now". This will fetch all data up to the current date.
4. Enable the "Inject end time into outgoing HTTP request" option.
5. Set the "Inject Into" field to `request_parameter`.
6. In the "Field" input, enter `to`. This is the name of the query parameter that will hold the end time.

Here is an example from the [Airbyte documentation](https://docs.airbyte.com/connector-development/connector-builder-ui/incremental-sync#example):

- "Cursor field" is set to `webPublicationDate`
- "Datetime format" is set to `%Y-%m-%dT%H:%M:%SZ`
- "End datetime" is set to "now" to fetch all articles up to the current date
- "Inject end time into outgoing HTTP request" is set to `request_parameter` with "Field" set to `to-date`

In your case, you would replace `to-date` with `to`. Please note that the Connector Builder currently only supports key-value query params and request body parameters ([source](https://docs.airbyte.com/connector-development/connector-builder-ui/connector-builder-compatibility#are-the-required-parameters-of-the-integration-key-value-pairs)). If the integration requires query params or body parameters that are not key-value pairs, you may need to use the Python CDK.
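The steps above can be sketched in plain Python. This is not the Builder's actual implementation, just a minimal illustration of what "inject end time into outgoing HTTP request" produces: a key-value query parameter named `to` holding "now". The function name and example endpoint are made up for illustration.

```python
from datetime import datetime, timezone

def build_request_params(datetime_format: str = "%Y-%m-%dT%H:%M:%SZ") -> dict:
    """Mimic the Builder's 'inject end time' behavior: compute 'now' in UTC
    and place it in the 'to' query parameter as a key-value pair."""
    end_datetime = datetime.now(timezone.utc).strftime(datetime_format)
    return {"to": end_datetime}

params = build_request_params()
# The connector would then issue something like:
# requests.get("https://api.example.com/records", params=params)
```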
@kapa.ai I have two query parameters, “from” and “to”. How do I add “from” as the incremental parameter?
To add a query parameter "from" in the Airbyte Connector Builder UI, follow these steps:

1. In the "Incremental sync" section of the UI builder, set the "Cursor field" to the field that represents the date in your data source.
2. Set the "Datetime format" to match the format of the date in your data source.
3. Set the "Start datetime" to the initial start date of the time range to fetch records for. When doing incremental syncs, the second sync will overwrite this date with the last record that got synced so far.
4. Enable the "Inject start time into outgoing HTTP request" option.
5. Set the "Inject Into" field to `request_parameter`.
6. In the "Field" input, enter `from`. This is the name of the query parameter that will hold the start time.

Here is an example from the [Airbyte documentation](https://docs.airbyte.com/connector-development/connector-builder-ui/incremental-sync#example):

- "Cursor field" is set to `webPublicationDate`
- "Datetime format" is set to `%Y-%m-%dT%H:%M:%SZ`
- "Start datetime" is set to "user input" to allow the user of the connector configuring a Source to specify the time to start syncing
- "Inject start time into outgoing HTTP request" is set to `request_parameter` with "Field" set to `from-date`

In your case, you would replace `from-date` with `from`. Please note that the Connector Builder currently only supports key-value query params and request body parameters ([source](https://docs.airbyte.com/connector-development/connector-builder-ui/connector-builder-compatibility#are-the-required-parameters-of-the-integration-key-value-pairs)). If the integration requires query params or body parameters that are not key-value pairs, you may need to use the Python CDK.
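Putting the two parameters together, here is a rough Python sketch (again, not the Builder's actual code) of how the `from`/`to` window behaves across syncs: the first sync starts at the configured start datetime, a later sync resumes from the last synced cursor value, and the end is always "now". The function name and the example dates are hypothetical.

```python
from datetime import datetime, timezone
from typing import Optional

DATETIME_FORMAT = "%Y-%m-%dT%H:%M:%SZ"

def build_window_params(start_datetime: str,
                        last_synced_cursor: Optional[str] = None) -> dict:
    """Mimic the Builder's incremental sync window: 'from' begins at the
    configured start date and is replaced by the last synced cursor on
    subsequent syncs; 'to' is always 'now'."""
    start = last_synced_cursor or start_datetime
    end = datetime.now(timezone.utc).strftime(DATETIME_FORMAT)
    return {"from": start, "to": end}

# First sync uses the configured start date:
first = build_window_params("2023-01-01T00:00:00Z")
# A later sync resumes from the last record's cursor value:
later = build_window_params("2023-01-01T00:00:00Z",
                            last_synced_cursor="2023-06-15T12:00:00Z")
```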
`now_utc()` is giving a full timestamp, but I only want the date @kapa.ai
To get only the date from the `now_utc()` macro, you can use the `strftime` method to format the output. Here's how you can do it:

```
'{{ now_utc().strftime('%Y-%m-%d') }}'
```

This will return the current date in the 'YYYY-MM-DD' format, for example '2021-09-01'. This is based on the example provided in the [Airbyte documentation](https://docs.airbyte.com/connector-development/config-based/understanding-the-yaml-file/reference#macros).
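Since the Builder's `now_utc()` macro wraps a Python UTC datetime, the same `strftime` call in plain Python shows what each format string produces:

```python
from datetime import datetime, timezone

# Equivalent of the Builder's now_utc() macro: a timezone-aware UTC datetime.
now = datetime.now(timezone.utc)

full = now.strftime("%Y-%m-%dT%H:%M:%SZ")  # full timestamp, e.g. 2021-09-01T12:34:56Z
date_only = now.strftime("%Y-%m-%d")       # date only, e.g. 2021-09-01
```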
@kapa.ai how to do transformation of columns from a JSON response in the Builder
In Airbyte, the transformation of columns from a JSON response is typically handled in the destination, after the data has been loaded. This is part of the ELT (Extract, Load, Transform) approach that Airbyte follows: the raw data is first loaded into the destination as a JSON blob and then goes through a transformation stage. Here's a brief overview of how you can approach this:

1. Extract and Load: In the Airbyte UI builder, you set up your source connector to extract data from your API and load it into your destination. The data from your source is loaded into the `_airbyte_data` column in your destination as a JSON blob.
2. Transform: After the data has been loaded into your destination, you can run transformations on it. Airbyte supports transformations using dbt, SQL, or Airbyte's own basic normalization. You can learn more about these options in the [Airbyte documentation on transformations](https://docs.airbyte.com/operator-guides/transformation-and-normalization).

For example, if you're using a destination that supports basic normalization (like Postgres, BigQuery, Redshift, or Snowflake), Airbyte can automatically convert the JSON blobs into a tabular format; you can enable this feature in the connection setup in the Airbyte UI. If you need more complex transformations, or if your destination doesn't support basic normalization, you can write your own dbt models or SQL queries to parse the JSON and transform it into the format you need.

Please note that the transformation stage is not handled in the Builder itself. The Builder is used to set up the extraction and loading stages, and the transformation is handled separately in the destination.

Here are some relevant links from the knowledge sources:

- [Airbyte's approach to ELT](https://airbyte.com/tutorials/full-data-synchronization#step-1)
- [Airbyte Protocol (JSON blobs)](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol#airbyterecordmessage)
- [Basic Normalization](https://docs.airbyte.com/understanding-airbyte/basic-normalization)
- [Transformations with dbt](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-dbt)
- [Transformations with SQL](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-sql)
- [Transformations with Airbyte](https://docs.airbyte.com/operator-guides/transformation-and-normalization/transformations-with-airbyte)