# ask-ai
**User:** I am extracting data with special characters like "á" or "ñ", but Airbyte saves them incorrectly, e.g. as "Ao63".
**AI:** Airbyte has specific rules for handling special characters in data. When writing data, Airbyte converts any invalid characters into `_` characters. For example, in BigQuery, datasets that begin with `_` are hidden in the BigQuery Explorer panel, so Airbyte prepends converted namespaces with `n` [source](https://docs.airbyte.com/integrations/destinations/bigquery#bigquery-naming-conventions). For JSON-to-Avro conversion in blob storage destinations, only alphanumeric characters and underscores (`/a-zA-Z0-9_/`) are allowed in a stream or field name; any special character is converted to a plain letter or an underscore. For example, `spécial:character_names` becomes `special_character_names` [source](https://docs.airbyte.com/understanding-airbyte/json-avro-conversion#filed-name). However, encoding problems have been reported, such as a Salesforce-to-BigQuery sync where special characters were not recorded correctly [source](https://discuss.airbyte.io/t/1790), and there is an open issue about decoding non-ASCII characters [source](https://discuss.airbyte.io/t/1757). If special characters are being saved incorrectly, it may be related to these known issues. Check your destination's documentation for any additional naming conventions or restrictions, and if the problem persists, consider reporting it on the Airbyte GitHub or forum for further assistance.
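To make the documented rule concrete, here is a minimal Python sketch of that kind of name sanitization. The function name `sanitize_name` and the diacritic-folding step are assumptions for illustration; this is not Airbyte's actual implementation.

```python
import re
import unicodedata

def sanitize_name(name: str) -> str:
    """Illustrative sketch of the documented rule (hypothetical helper,
    not Airbyte's code): only characters matching /a-zA-Z0-9_/ survive;
    accented letters fold to their ASCII base letter, everything else
    becomes an underscore."""
    # Decompose accented characters ("é" -> "e" + combining accent),
    # then drop the non-ASCII combining marks.
    folded = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode("ascii")
    # Replace any remaining disallowed character with "_".
    return re.sub(r"[^a-zA-Z0-9_]", "_", folded)

print(sanitize_name("spécial:character_names"))  # special_character_names
```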
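The encoding issues linked above typically surface as mojibake: valid UTF-8 bytes read back with the wrong charset. A short hypothetical reproduction of that failure mode (again, not Airbyte's code):

```python
# Mojibake demo: correct UTF-8 bytes decoded with the wrong charset.
original = "año"                        # "ñ" is U+00F1
utf8_bytes = original.encode("utf-8")   # b'a\xc3\xb1o'
garbled = utf8_bytes.decode("latin-1")  # "aÃ±o" - "ñ" becomes two junk chars
print(garbled)
```

If the stored values show this pattern, the fix is usually to make every hop in the pipeline (source export, Airbyte, destination database) agree on UTF-8.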
**User:** How can I configure it so that the values are saved in their original form in Postgres?