This page guides you through the process of setting up the Databend destination connector.
| Feature | Supported? |
| :--- | :--- |
| Full Refresh Sync | Yes |
| Incremental - Append Sync | Yes |
Each stream will be output into its own table in Databend. Each table will contain three columns:

- `_airbyte_ab_id`: a UUID assigned by Airbyte to each event that is processed.
- `_airbyte_emitted_at`: a timestamp representing when the event was pulled from the data source.
- `_airbyte_data`: a JSON blob containing the event data.
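As a sketch, a synced stream named `users` would land in a table shaped roughly like the following. The Databend column types shown here are assumptions for illustration, not confirmed by this page:

```sql
-- Hypothetical layout of a synced table; column types are assumed, not confirmed.
CREATE TABLE users (
    _airbyte_ab_id VARCHAR,        -- UUID assigned by Airbyte to each record
    _airbyte_emitted_at TIMESTAMP, -- when the event was pulled from the source
    _airbyte_data VARCHAR          -- raw event payload serialized as JSON
);
```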
Getting Started (Airbyte Cloud)
Getting Started (Airbyte Open-Source)
You can follow the Connecting to a Warehouse docs to get the username, password, host, and other connection details.
Or you can create such a user by running:

```sql
-- 'your_password' is a placeholder; choose your own credentials.
CREATE USER airbyte_user IDENTIFIED BY 'your_password';
GRANT CREATE ON * TO airbyte_user;
```
Make sure the Databend user has the following permissions:
- can create tables and write rows.
- can create databases.
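As a quick check, the user should be able to run statements like the following (the table name is a placeholder used only for this smoke test):

```sql
-- Placeholder names; verifies the user can create tables and write rows.
CREATE TABLE default.airbyte_smoke_test (id INT);
INSERT INTO default.airbyte_smoke_test VALUES (1);
DROP TABLE default.airbyte_smoke_test;
```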
You can also use a pre-existing user but we highly recommend creating a dedicated user for Airbyte.
You will need to choose an existing database or create a new database that will be used to store synced data from Airbyte.
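If you opt to create a dedicated database, a minimal sketch (the database name `airbyte` is a placeholder):

```sql
-- 'airbyte' is a placeholder name for the dedicated sync database.
CREATE DATABASE IF NOT EXISTS airbyte;
```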
Set Up the Databend Destination in Airbyte
You should now have all the requirements needed to configure Databend as a destination in the UI. You'll need the following information to configure the Databend destination:
If your Databend version is v0.9.0 or later, you need to use databend-sqlalchemy v0.1.0 or later. Note that Databend Cloud only supports Databend versions newer than 0.9.0.
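The version rule above can be sketched as a small check. This is a hypothetical helper written for illustration, not part of the connector:

```python
# Hypothetical helper illustrating the compatibility rule described above.
def _parse(version: str) -> tuple:
    """Turn 'v0.9.0' or '0.9.0' into a comparable tuple like (0, 9, 0)."""
    return tuple(int(part) for part in version.lstrip("v").split("."))

def driver_is_compatible(databend_version: str, driver_version: str) -> bool:
    """Databend >= 0.9.0 requires databend-sqlalchemy >= 0.1.0."""
    if _parse(databend_version) >= _parse("0.9.0"):
        return _parse(driver_version) >= _parse("0.1.0")
    return True

print(driver_is_compatible("v0.9.0", "0.1.0"))  # True
print(driver_is_compatible("v0.9.0", "0.0.5"))  # False
```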
| Version | Date | Pull Request | Subject |
| :--- | :--- | :--- | :--- |
| 0.1.2 | 2023-02-11 | 22855 | Fix compatibility issue with databend-query 0.9 |
| 0.1.1 | 2022-01-09 | 21182 | Remove protocol option and enforce HTTPS |