I apologize, but the provided knowledge sources don't contain enough information to answer specifically how to pass Spark configs to the Databricks Lakehouse sink in Airbyte. They cover only the general setup steps for this destination, such as configuring the server hostname, HTTP path, authentication, and other basic options, and they don't mention passing custom Spark configurations at all. For Spark config details on the Databricks Lakehouse destination, I'd recommend checking the latest Airbyte documentation or reaching out to Airbyte support. For reference, an illustrative sketch of the basic connection settings the sources do describe follows below.
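
A minimal sketch of those basic settings, assuming illustrative field names and placeholder values — the sources confirm only that a hostname, HTTP path, and authentication are configured, not these exact keys, so treat this as a sketch rather than the connector's actual spec:

```python
# Illustrative sketch only: the field names below are assumptions, not
# confirmed by the knowledge sources. Check the Databricks Lakehouse
# destination form in the Airbyte UI for the authoritative names.
databricks_destination_config = {
    "databricks_server_hostname": "abc-12345678-90ab.cloud.databricks.com",  # workspace hostname
    "databricks_http_path": "/sql/1.0/warehouses/0000000000000000",          # HTTP path to the SQL endpoint
    "databricks_personal_access_token": "dapi...",                           # authentication token (keep secret)
    "database": "default",                                                   # target database; other basic options vary
}
# Note: none of these fields carries Spark settings, consistent with the
# sources not documenting any way to pass custom Spark configurations.
```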