On Setting up a New Connection
Airbyte is stuck while loading required configuration parameters for my connector
Example of the issue:
To load configuration parameters, Airbyte must first `docker pull` the connector's image, which may be several hundred megabytes. Under poor connectivity conditions, the request to pull the image may take a very long time or time out. More context on this issue can be found here. If your Internet speed is less than 30 Mbps down, or you are running bandwidth-consuming workloads concurrently with Airbyte, you may encounter this issue. Run a speed test to verify your Internet speed.
One workaround is to manually pull the latest version of every connector you'll use and then reset Airbyte. Note that this will remove any connections, sources, or destinations you have currently configured in Airbyte. To do this:
- Decide which connectors you'd like to use. For this example let's say you want the Postgres source and the Snowflake destination.
- Find the Docker image names of those connectors. Look here for sources and here for destinations. For each of the connectors you'd like to use, copy the values of the `dockerRepository` and `dockerImageTag` fields. For example, for the Postgres source these would be `airbyte/source-postgres` and `0.1.6`.
- For each of the connectors you'd like to use, from your shell run `docker pull <repository>:<tag>`, replacing `<repository>` and `<tag>` with the values copied in the step above, e.g. `docker pull airbyte/source-postgres:0.1.6`.
- Once you've finished downloading all the images, from the Airbyte repository root run `docker-compose down -v` followed by `docker-compose up`.
- The issue should be resolved.
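The steps above can be sketched as a short shell script. The connector names and the Snowflake tag below are examples (only the Postgres tag `0.1.6` comes from this page); the script prints each command with `echo` so you can review it before running anything for real:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Connectors you plan to use; tags are examples -- copy the real
# dockerImageTag values for your own connectors.
images=(
  "airbyte/source-postgres:0.1.6"
  "airbyte/destination-snowflake:0.3.14"
)

# 'echo' prints each command for review; remove it (and run from the
# Airbyte repository root) to actually pull the images and reset Airbyte.
for image in "${images[@]}"; do
  echo docker pull "$image"
done
echo docker-compose down -v
echo docker-compose up
```

Dropping the `echo`s executes the real commands; remember that `docker-compose down -v` deletes the volumes holding your existing configuration.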
Connection refused errors when connecting to a local db
Depending on your Docker network configuration, you may not be able to connect to a database running on `localhost` from inside a container.
If you are running into connection refused errors when running Airbyte via Docker Compose on Mac, try using `host.docker.internal` as the host. On Linux, you may have to modify `docker-compose.yml` and add a host that maps to your local machine using `extra_hosts`.
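On Linux, a minimal sketch of that `docker-compose.yml` change might look like this (the service name is illustrative, and the `host-gateway` value requires Docker 20.10 or later):

```yaml
# docker-compose.yml (fragment) -- maps host.docker.internal to the host's
# gateway so containers can reach services running on the host machine.
services:
  worker:                                      # illustrative service name
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

With this in place, containers can use `host.docker.internal` as the database host, just as on Mac.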
I don’t see a form when selecting a connector
We’ve seen this issue once (no spinner and a 500 HTTP error), and we don’t know the cause. Resolution: try stopping Airbyte (`docker-compose down`) and restarting it (`docker-compose up`).
Connection hangs when trying to run the discovery step
You receive the error below when you try to sync a database with a lot of tables (6,000 or more):
airbyte-scheduler | io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: grpc: received message larger than max (<NUMBER> vs. 4194304)
The error occurs because the discovered catalog exceeds gRPC's default 4 MiB message limit (4194304 bytes). The workaround is to move the tables you don't need to sync into another namespace, so only the tables you actually want remain visible to Airbyte. If you need all the tables, split them across separate namespaces and use two connections.
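Assuming the source is Postgres, splitting tables into a separate schema might look like the sketch below. The schema and table names are hypothetical; the SQL is printed for review rather than executed directly:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical example: move tables you don't need to sync into a separate
# schema so the catalog Airbyte discovers stays under the 4 MiB gRPC limit.
# Review the printed SQL, then pipe it to psql to apply, e.g.:
#   ./split_schema.sh | psql -h localhost -U airbyte -d mydb
cat <<'SQL'
CREATE SCHEMA IF NOT EXISTS archive;
ALTER TABLE public.old_events SET SCHEMA archive;
ALTER TABLE public.old_audit_log SET SCHEMA archive;
SQL
```

Point the Airbyte source at the schema holding the tables you want to sync; a second connection can cover the other schema if you need everything.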