# ask-community-for-troubleshooting
Hi, is anyone able to sync large tables (like a 10M-row table) to Snowflake/Azure/GCS from a local deployment? If yes, what would the right configuration be? I have been trying for the past 2 days and can't move forward. I am using Postgres -> Snowflake (via an AWS S3 bucket stage); it runs for about 3 hours and then fails at some point. Nothing gets loaded into Snowflake, but I can see the files in the AWS S3 bucket. Below is my .env file:
# Relevant to scaling.
SYNC_JOB_MAX_ATTEMPTS=3
SYNC_JOB_MAX_TIMEOUT_DAYS=3
JOB_MAIN_CONTAINER_CPU_REQUEST=8
JOB_MAIN_CONTAINER_CPU_LIMIT=8
JOB_MAIN_CONTAINER_MEMORY_REQUEST=100Gi
JOB_MAIN_CONTAINER_MEMORY_LIMIT=100Gi


### LOGGING/MONITORING/TRACKING ###
TRACKING_STRATEGY=segment
JOB_ERROR_REPORTING_STRATEGY=logging
# Although not present as an env var, expected by Log4J configuration.
LOG_LEVEL=INFO


### APPLICATIONS ###
# Worker #
# Relevant to scaling.
MAX_SYNC_WORKERS=10
MAX_SPEC_WORKERS=10
MAX_CHECK_WORKERS=10
MAX_DISCOVER_WORKERS=10
# Temporal Activity configuration
ACTIVITY_MAX_ATTEMPT=
ACTIVITY_INITIAL_DELAY_BETWEEN_ATTEMPTS_SECONDS=
ACTIVITY_MAX_DELAY_BETWEEN_ATTEMPTS_SECONDS=
WORKFLOW_FAILURE_RESTART_DELAY_SECONDS=
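For reference, here is a minimal sketch of the container-resource block with the memory values written as Kubernetes-style quantities (e.g. 8Gi, no space after the "="), which is the format these variables expect on Kubernetes deployments. The numbers below are illustrative assumptions only, not tested recommendations, and need to fit the memory actually available on the local machine.

# Illustrative sizing only; adjust to the host's available resources.
JOB_MAIN_CONTAINER_CPU_REQUEST=2
JOB_MAIN_CONTAINER_CPU_LIMIT=4
JOB_MAIN_CONTAINER_MEMORY_REQUEST=8Gi
JOB_MAIN_CONTAINER_MEMORY_LIMIT=8Gi

One thing to keep in mind: on a Kubernetes deployment, a memory request larger than any node can satisfy (for example 100Gi on a laptop-sized node) will leave the job pod unschedulable, so a smaller request is usually safer for local testing.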