dbt copy_partitions with job parallelization in Bigquery #1236
Unanswered
carolinabtt asked this question in Q&A
I am creating dbt models with copy_partitions=true, but when I need to ingest many partitions with a large volume of data, the whole process can take a long time, since dbt iterates over the partitions and calls the BigQuery API for each one.
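For reference, my model config looks roughly like this (model, source, and column names are placeholders):

```sql
-- models/my_partitioned_model.sql (hypothetical example)
{{
  config(
    materialized = 'incremental',
    incremental_strategy = 'insert_overwrite',
    partition_by = {
      "field": "event_date",
      "data_type": "date",
      "granularity": "day",
      "copy_partitions": true
    }
  )
}}

select
  event_date,
  user_id,
  amount
from {{ source('raw', 'events') }}  -- placeholder source
{% if is_incremental() %}
-- only reprocess recent partitions on incremental runs
where event_date >= date_sub(current_date(), interval 7 day)
{% endif %}
```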
Is there a way to parallelize those copy_partitions jobs in dbt?
I tried the threads configuration, but it does not help, since threads parallelize model executions, not the API calls within a single model.
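This is roughly the threads setting I tried (profile, project, and dataset names are placeholders):

```yaml
# profiles.yml (hypothetical example)
my_profile:
  target: prod
  outputs:
    prod:
      type: bigquery
      method: oauth
      project: my-gcp-project
      dataset: analytics
      threads: 8  # parallelizes model execution, not the per-partition copy jobs within one model
```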