If I have a SQL changelog file with 600 changesets and then add a 601st, I notice a lot of individual UPDATE statements running against the internal Liquibase DATABASECHANGELOG table in Redshift (it looks like one per changeset, updating the stored checksum). If each one takes ~300 ms, the time adds up. For Redshift, can these updates be batched into a single UPDATE?
@tooptoop4 Right now there is no way to batch them. However, this should only happen once, when upgrading from a version earlier than 4.22 to 4.22 or later. Is that your case?
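For illustration, here is a hedged sketch of what batching those per-changeset checksum updates into a single statement could look like. This is not something Liquibase emits today; the table and column names are simplified stand-ins for DATABASECHANGELOG, and SQLite stands in for Redshift.

```python
import sqlite3

# Hypothetical sketch: collapse N per-row checksum UPDATEs into one
# statement. SQLite is used here purely as a stand-in for Redshift.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Minimal stand-in for Liquibase's DATABASECHANGELOG table.
cur.execute("CREATE TABLE databasechangelog (id TEXT PRIMARY KEY, md5sum TEXT)")
cur.executemany(
    "INSERT INTO databasechangelog VALUES (?, ?)",
    [(f"changeset-{i}", "8:oldsum") for i in range(600)],
)

# Stage the 600 recomputed checksums once, then apply them with ONE
# UPDATE that joins against the staged rows, instead of 600 round-trips
# of "UPDATE ... WHERE id = ?" at ~300 ms each.
new_sums = [(f"changeset-{i}", "9:newsum") for i in range(600)]
cur.execute("CREATE TEMP TABLE staged (id TEXT PRIMARY KEY, md5sum TEXT)")
cur.executemany("INSERT INTO staged VALUES (?, ?)", new_sums)
cur.execute(
    """
    UPDATE databasechangelog
    SET md5sum = (SELECT md5sum FROM staged
                  WHERE staged.id = databasechangelog.id)
    WHERE id IN (SELECT id FROM staged)
    """
)
conn.commit()

updated = cur.execute(
    "SELECT COUNT(*) FROM databasechangelog WHERE md5sum = '9:newsum'"
).fetchone()[0]
print(updated)
```

On Redshift the same shape would typically use its `UPDATE ... FROM` join syntax (`UPDATE databasechangelog SET md5sum = s.md5sum FROM staged s WHERE databasechangelog.id = s.id`), paying the per-statement latency once rather than per changeset.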