
Update table-migration workflows to also capture updated migration progress into the history log #3239

Open · wants to merge 52 commits into base: main

Conversation

@asnare (Contributor) commented on Nov 11, 2024

Changes

The table-migration workflows already contained tasks at the end that log information about tables that still need to be migrated. The primary purpose of this PR is to update these workflows so they also capture updated progress information into the history log.
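Conceptually, the task appended to each workflow does two things: report which tables still need migrating, and snapshot the current per-table migration status into the history log. A minimal sketch of that idea (the `HistoryLog` and `MigrationProgressRecord` names below are illustrative stand-ins, not the actual ucx classes):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class MigrationProgressRecord:
    """One snapshot row for the history log (illustrative schema)."""
    table: str
    migrated: bool
    recorded_at: str


@dataclass
class HistoryLog:
    """In-memory stand-in for the persistent history-log table."""
    records: list = field(default_factory=list)

    def append_snapshot(self, statuses: dict[str, bool]) -> int:
        """Capture the current per-table migration status; returns rows written."""
        now = datetime.now(timezone.utc).isoformat()
        for table, migrated in sorted(statuses.items()):
            self.records.append(MigrationProgressRecord(table, migrated, now))
        return len(statuses)


# A workflow's final task would refresh the status and then snapshot it:
log = HistoryLog()
written = log.append_snapshot(
    {"hive_metastore.db.t1": True, "hive_metastore.db.t2": False}
)
```

Because every run appends rather than overwrites, the history log accumulates a timeline of migration progress across workflow runs.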

Other changes are grouped below:

Linked issues

Conflicted with #3200 and required a rebase. (Resolved.)

Functionality

  • updated documentation

  • modified existing workflows:

    • migrate-tables
    • migrate-external-hiveserde-tables-in-place-experimental
    • migrate-external-tables-ctas
    • scan-tables-in-mounts-experimental
    • migrate-tables-in-mounts-experimental

Tests

  • manually tested
  • updated and new unit tests
  • updated and new integration tests
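
The unit-test shape for a change like this is simple: run the workflow's final task against a fake backend and assert that a progress snapshot landed in the history log. A hedged sketch (the fake classes and the `record_remaining_tables` helper are hypothetical stand-ins, not ucx test fixtures):

```python
class FakeHistoryLog:
    """Captures appended snapshots instead of writing to a real table."""

    def __init__(self):
        self.snapshots = []

    def append_snapshot(self, statuses):
        self.snapshots.append(dict(statuses))


def record_remaining_tables(history_log, migration_statuses):
    """Stand-in for the end-of-workflow task: report remaining tables
    and capture the current status into the history log."""
    remaining = [t for t, migrated in migration_statuses.items() if not migrated]
    history_log.append_snapshot(migration_statuses)
    return remaining


# Unit-test body: one migrated table, one still pending.
log = FakeHistoryLog()
remaining = record_remaining_tables(log, {"db.t1": True, "db.t2": False})
```

The integration tests then exercise the same behavior end-to-end by running the real workflows and asserting against the actual history-log table.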

@asnare added labels on Nov 11, 2024: enhancement (New feature or request), migrate/external (go/uc/upgrade SYNC EXTERNAL TABLES step), migrate/managed (go/uc/upgrade Upgrade Managed Tables and Jobs), migrate/jobs (Step 5 - Upgrading Jobs for External Tables), feat/migration-progress (Issues related to the migration progress workflow).
@asnare asnare self-assigned this Nov 11, 2024
@asnare asnare requested a review from a team as a code owner November 11, 2024 12:03

github-actions bot commented Nov 11, 2024

❌ 51/55 passed, 4 flaky, 4 failed, 4 skipped, 3h48m9s total

❌ test_hiveserde_table_in_place_migration_job[hiveserde]: TimeoutError: timed out after 0:20:00: (22m53.305s)
... (skipped 11484489 bytes)
(2 more bytes)",
<   "resource_id": "2281209757959743"
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.48Ss/wheels/wheel-test-runner-0.52.1+5320241212164210&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
17:02 INFO [databricks.labs.ucx.framework.utils:assess_workflows] Invoking command: ['pip', '--disable-pip-version-check', 'install', '/Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.48Ss/wheels/databricks_labs_ucx-0.52.1+5320241212164210-py3-none-any.whl', '-t', '/tmp/ucx-w_smervm', '--upgrade']
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+5320241212164212
< 200 OK
< {
<   "created_at": 1734021735813,
<   "language": "PYTHON",
<   "modified_at": 1734021735813,
<   "object_id": 2281209757959743,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+53202412121642... (2 more bytes)",
<   "resource_id": "2281209757959743"
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+5320241212164212&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
17:02 INFO [databricks.labs.ucx.framework.utils:assess_workflows] Invoking command: ['pip', '--disable-pip-version-check', 'install', '/Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/databricks_labs_ucx-0.52.1+5320241212164212-py3-none-any.whl', '-t', '/tmp/ucx-w_smervm', '--upgrade']
17:02 DEBUG [databricks.labs.ucx.source_code.python_libraries:assess_workflows] pip output:
Processing /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.QSrA/wheels/databricks_labs_ucx-0.52.1+5320241212164301-py3-none-any.whl
Collecting astroid>=3.3.1 (from databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached astroid-3.3.6-py3-none-any.whl.metadata (4.5 kB)
Collecting databricks-labs-blueprint<0.10,>=0.9.1 (from databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached databricks_labs_blueprint-0.9.3-py3-none-any.whl.metadata (55 kB)
Collecting databricks-labs-lsql<0.15,>=0.14.0 (from databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached databricks_labs_lsql-0.14.1-py3-none-any.whl.metadata (8.7 kB)
Collecting databricks-sdk<0.39,>=0.38 (from databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached databricks_sdk-0.38.0-py3-none-any.whl.metadata (38 kB)
Collecting pyyaml<7.0.0,>=6.0.0 (from databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)
Collecting sqlglot<26.1,>=25.5.0 (from databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached sqlglot-26.0.0-py3-none-any.whl.metadata (19 kB)
Collecting requests<3,>=2.28.1 (from databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached requests-2.32.3-py3-none-any.whl.metadata (4.6 kB)
Collecting google-auth~=2.0 (from databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached google_auth-2.37.0-py2.py3-none-any.whl.metadata (4.8 kB)
Collecting cachetools<6.0,>=2.0.0 (from google-auth~=2.0->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached cachetools-5.5.0-py3-none-any.whl.metadata (5.3 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth~=2.0->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached pyasn1_modules-0.4.1-py3-none-any.whl.metadata (3.5 kB)
Collecting rsa<5,>=3.1.4 (from google-auth~=2.0->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached rsa-4.9-py3-none-any.whl.metadata (4.2 kB)
Collecting charset-normalizer<4,>=2 (from requests<3,>=2.28.1->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (34 kB)
Collecting idna<4,>=2.5 (from requests<3,>=2.28.1->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached idna-3.10-py3-none-any.whl.metadata (10 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3,>=2.28.1->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached urllib3-2.2.3-py3-none-any.whl.metadata (6.5 kB)
Collecting certifi>=2017.4.17 (from requests<3,>=2.28.1->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached certifi-2024.8.30-py3-none-any.whl.metadata (2.2 kB)
Collecting pyasn1<0.7.0,>=0.4.6 (from pyasn1-modules>=0.2.1->google-auth~=2.0->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164301)
  Using cached pyasn1-0.6.1-py3-none-any.whl.metadata (8.4 kB)
Using cached astroid-3.3.6-py3-none-any.whl (274 kB)
Using cached databricks_labs_blueprint-0.9.3-py3-none-any.whl (61 kB)
Using cached databricks_labs_lsql-0.14.1-py3-none-any.whl (47 kB)
Using cached databricks_sdk-0.38.0-py3-none-any.whl (575 kB)
Using cached PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (767 kB)
Using cached sqlglot-26.0.0-py3-none-any.whl (435 kB)
Using cached google_auth-2.37.0-py2.py3-none-any.whl (209 kB)
Using cached requests-2.32.3-py3-none-any.whl (64 kB)
Using cached cachetools-5.5.0-py3-none-any.whl (9.5 kB)
Using cached certifi-2024.8.30-py3-none-any.whl (167 kB)
Using cached charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (143 kB)
Using cached idna-3.10-py3-none-any.whl (70 kB)
Using cached pyasn1_modules-0.4.1-py3-none-any.whl (181 kB)
Using cached rsa-4.9-py3-none-any.whl (34 kB)
Using cached urllib3-2.2.3-py3-none-any.whl (126 kB)
Using cached pyasn1-0.6.1-py3-none-any.whl (83 kB)
Installing collected packages: urllib3, sqlglot, pyyaml, pyasn1, idna, charset-normalizer, certifi, cachetools, astroid, rsa, requests, pyasn1-modules, google-auth, databricks-sdk, databricks-labs-blueprint, databricks-labs-lsql, databricks-labs-ucx
Successfully installed astroid-3.3.6 cachetools-5.5.0 certifi-2024.8.30 charset-normalizer-3.4.0 databricks-labs-blueprint-0.9.3 databricks-labs-lsql-0.14.1 databricks-labs-ucx-0.52.1+5320241212164301 databricks-sdk-0.38.0 google-auth-2.37.0 idna-3.10 pyasn1-0.6.1 pyasn1-modules-0.4.1 pyyaml-6.0.2 requests-2.32.3 rsa-4.9 sqlglot-26.0.0 urllib3-2.2.3


17:02 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Unknown language for /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.QSrA/wheels/wheel-test-runner-0.52.1+5320241212164301
17:02 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Unknown language for /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.QSrA/wheels/wheel-test-runner-0.52.1+5320241212164301
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/clusters/get?cluster_id=TEST_EXT_HMS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 16.0,
<   "cluster_id": "TEST_EXT_HMS_CLUSTER_ID",
<   "cluster_memory_mb": 65536,
<   "cluster_name": "External Metastore",
<   "cluster_source": "UI",
<   "creator_user_name": "[email protected]",
<   "data_security_mode": "USER_ISOLATION",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "TEST_EXT_HMS_CLUSTER_ID",
<     "ClusterName": "External Metastore",
<     "Creator": "[email protected]",
<     "DatabricksInstanceGroupId": "-8854613105865987054",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "[email protected]",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.8.12",
<     "instance_id": "681e96f4623d47dcbae251ce41e1ae31",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "37fc637be71e4e94860b88fba9acda41",
<     "private_ip": "10.179.10.13",
<     "public_dns": "",
<     "start_timestamp": 1734021432296
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D8as_v4",
<   "effective_spark_version": "16.0.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "executors": [
<     {
<       "host_private_ip": "10.179.8.13",
<       "instance_id": "90c2fdc58daa4f548a1f87807811ec87",
<       "node_attributes": {
<         "is_spot": false
<       },
<       "node_id": "a0bbf1832a6148278c078e03443f0910",
<       "private_ip": "10.179.10.12",
<       "public_dns": "",
<       "start_timestamp": 1734021432249
<     }
<   ],
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1734021703339,
<   "last_restarted_time": 1734021634288,
<   "last_state_loss_time": 1734021634224,
<   "node_type_id": "Standard_D8as_v4",
<   "num_workers": 1,
<   "pinned_by_user_name": "4183391249163402",
<   "policy_id": "001F96C2B8ED3FDE",
<   "spark_conf": {
<     "datanucleus.autoCreateSchema": "true",
<     "datanucleus.fixedDatastore": "true",
<     "spark.hadoop.javax.jdo.option.ConnectionDriverName": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
<     "spark.hadoop.javax.jdo.option.ConnectionPassword": "{{secrets/external_metastore/external_metastore_password}}",
<     "spark.hadoop.javax.jdo.option.ConnectionURL": "{{secrets/external_metastore/external_metastore_url}}",
<     "spark.hadoop.javax.jdo.option.ConnectionUserName": "{{secrets/external_metastore/external_metastore_user}}",
<     "spark.hadoop.metastore.catalog.TEST_SCHEMA": "hive",
<     "spark.sql.hive.metastore.jars": "maven",
<     "spark.sql.hive.metastore.version": "3.1.0"
<   },
<   "spark_context_id": 3059637370043175108,
<   "spark_version": "16.0.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "cluster_name": "External Metastore",
<     "data_security_mode": "USER_ISOLATION",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "effective_spark_version": "16.0.x-scala2.12",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 1,
<     "policy_id": "001F96C2B8ED3FDE",
<     "spark_conf": {
<       "datanucleus.autoCreateSchema": "true",
<       "datanucleus.fixedDatastore": "true",
<       "spark.hadoop.javax.jdo.option.ConnectionDriverName": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
<       "spark.hadoop.javax.jdo.option.ConnectionPassword": "{{secrets/external_metastore/external_metastore_password}}",
<       "spark.hadoop.javax.jdo.option.ConnectionURL": "{{secrets/external_metastore/external_metastore_url}}",
<       "spark.hadoop.javax.jdo.option.ConnectionUserName": "{{secrets/external_metastore/external_metastore_user}}",
<       "spark.hadoop.metastore.catalog.TEST_SCHEMA": "hive",
<       "spark.sql.hive.metastore.jars": "maven",
<       "spark.sql.hive.metastore.version": "3.1.0"
<     },
<     "spark_version": "16.0.x-scala2.12"
<   },
<   "start_time": 1731598211326,
<   "state": "RUNNING",
<   "state_message": ""
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/libraries/cluster-status?cluster_id=TEST_EXT_HMS_CLUSTER_ID
< 200 OK
< {
<   "cluster_id": "TEST_EXT_HMS_CLUSTER_ID"
< }
17:02 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering refresh_table_migration_status entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.QSrA/wheels/wheel-test-runner-0.52.1+5320241212164301
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.QSrA/wheels/wheel-test-runner-0.52.1+5320241212164301
< 200 OK
< {
<   "created_at": 1734021785273,
<   "language": "PYTHON",
<   "modified_at": 1734021785273,
<   "object_id": 2281209757959889,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.QSrA/wheels/wheel-test-runner-0.52.1+53202412121643... (2 more bytes)",
<   "resource_id": "2281209757959889"
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.QSrA/wheels/wheel-test-runner-0.52.1+5320241212164301
< 200 OK
< {
<   "created_at": 1734021785273,
<   "language": "PYTHON",
<   "modified_at": 1734021785273,
<   "object_id": 2281209757959889,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.QSrA/wheels/wheel-test-runner-0.52.1+53202412121643... (2 more bytes)",
<   "resource_id": "2281209757959889"
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.QSrA/wheels/wheel-test-runner-0.52.1+5320241212164301&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
17:02 INFO [databricks.labs.ucx.framework.utils:assess_workflows] Invoking command: ['pip', '--disable-pip-version-check', 'install', '/Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.QSrA/wheels/databricks_labs_ucx-0.52.1+5320241212164301-py3-none-any.whl', '-t', '/tmp/ucx-w_smervm', '--upgrade']
17:02 INFO [databricks.labs.ucx.installer.workflows] ------ END REMOTE LOGS (SO FAR) -----
17:02 INFO [databricks.labs.ucx.install] Deleting UCX v0.52.1+5320241212164210 from https://DATABRICKS_HOST
17:02 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_saujz
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=214042868486379, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=847032423129115, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=399218929136012, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=639526607017352, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=54706124652717, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=561410213770114, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=2020187535433, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=739292121785975, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1006717424930889, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=97133580469284, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=253653431206102, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=205831595904945, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=915373389397255, as it is no longer needed
17:03 INFO [databricks.labs.ucx.install] Deleting cluster policy
17:03 INFO [databricks.labs.ucx.install] Deleting secret scope
17:03 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw1] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
❌ test_table_migration_job_refreshes_migration_status[hiveserde-migrate-external-tables-ctas]: AssertionError: Workflow failed: assessment (22m16.883s)
AssertionError: Workflow failed: assessment
assert False
[gw9] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
16:41 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/config.yml) doesn't exist.
16:41 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
16:41 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
16:41 INFO [databricks.labs.ucx.install] Fetching installations...
16:41 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA is corrupted. Skipping...
16:41 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
16:41 DEBUG [tests.integration.conftest] Waiting for clusters to start...
16:42 DEBUG [tests.integration.conftest] Waiting for clusters to start...
16:42 INFO [databricks.labs.ucx.install] Installing UCX v0.52.1+5320241212164212
16:42 INFO [databricks.labs.ucx.install] Creating ucx schemas...
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
16:42 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
16:42 INFO [databricks.labs.ucx.install] Creating dashboards...
16:42 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
16:42 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
16:42 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
16:42 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
16:42 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
16:42 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
16:42 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
16:42 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
16:42 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
16:42 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
16:42 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
16:42 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
16:42 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
16:42 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
16:42 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
16:42 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
16:42 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
16:42 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
16:42 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/README for the next steps.
16:42 INFO [databricks.labs.ucx.progress.install] Installation completed successfully!
16:42 DEBUG [databricks.labs.ucx.installer.workflows] starting assessment job: https://DATABRICKS_HOST#job/765729260060556
16:42 INFO [databricks.labs.ucx.installer.workflows] Started assessment job: https://DATABRICKS_HOST#job/765729260060556/runs/947463741459107
16:42 DEBUG [databricks.labs.ucx.installer.workflows] Validating assessment workflow: https://DATABRICKS_HOST#job/765729260060556
16:42 INFO [databricks.labs.ucx.installer.workflows] Identified a run in progress waiting for run completion
17:02 INFO [databricks.labs.ucx.install] Deleting UCX v0.52.1+5320241212164212 from https://DATABRICKS_HOST
17:02 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_svnsy
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=402764541915150, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1021787998413129, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=873254476550468, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=323108122356704, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=624685246592106, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=943823645196112, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=292055433670718, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=394612456282206, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=658839196738054, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=957027749464131, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=765729260060556, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=955933146925700, as it is no longer needed
17:02 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=141857887332708, as it is no longer needed
17:03 INFO [databricks.labs.ucx.install] Deleting cluster policy
17:03 INFO [databricks.labs.ucx.install] Deleting secret scope
17:03 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw9] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
❌ test_hiveserde_table_ctas_migration_job[hiveserde]: TimeoutError: timed out after 0:20:00: (22m54.925s)
... (skipped 10229065 bytes)
essage": ""
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.xp8S/wheels/wheel-test-runner-0.52.1+5320241212164208
< 200 OK
< {
<   "created_at": 1734021732268,
<   "language": "PYTHON",
<   "modified_at": 1734021732268,
<   "object_id": 2281209757959729,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.xp8S/wheels/wheel-test-runner-0.52.1+53202412121642... (2 more bytes)",
<   "resource_id": "2281209757959729"
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/libraries/cluster-status?cluster_id=TEST_LEGACY_TABLE_ACL_CLUSTER_ID
< 200 OK
< {
<   "cluster_id": "TEST_LEGACY_TABLE_ACL_CLUSTER_ID"
< }
17:02 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering validate_groups_permissions entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.u51P/wheels/wheel-test-runner-0.52.1+5320241212164212
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.xp8S/wheels/wheel-test-runner-0.52.1+5320241212164208&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
17:02 INFO [databricks.labs.ucx.framework.utils:assess_workflows] Invoking command: ['pip', '--disable-pip-version-check', 'install', '/Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.xp8S/wheels/databricks_labs_ucx-0.52.1+5320241212164208-py3-none-any.whl', '-t', '/tmp/ucx-j0krilm2', '--upgrade']
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.u51P/wheels/wheel-test-runner-0.52.1+5320241212164212
< 200 OK
< {
<   "created_at": 1734021736171,
<   "language": "PYTHON",
<   "modified_at": 1734021736171,
<   "object_id": 2281209757959747,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.u51P/wheels/wheel-test-runner-0.52.1+53202412121642... (2 more bytes)",
<   "resource_id": "2281209757959747"
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.u51P/wheels/wheel-test-runner-0.52.1+5320241212164212
< 200 OK
< {
<   "created_at": 1734021736171,
<   "language": "PYTHON",
<   "modified_at": 1734021736171,
<   "object_id": 2281209757959747,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.u51P/wheels/wheel-test-runner-0.52.1+53202412121642... (2 more bytes)",
<   "resource_id": "2281209757959747"
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.u51P/wheels/wheel-test-runner-0.52.1+5320241212164212&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
17:02 INFO [databricks.labs.ucx.framework.utils:assess_workflows] Invoking command: ['pip', '--disable-pip-version-check', 'install', '/Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.u51P/wheels/databricks_labs_ucx-0.52.1+5320241212164212-py3-none-any.whl', '-t', '/tmp/ucx-j0krilm2', '--upgrade']
17:02 DEBUG [databricks.labs.ucx.source_code.python_libraries:assess_workflows] pip output:
Processing /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/databricks_labs_ucx-0.52.1+5320241212164212-py3-none-any.whl
Collecting astroid>=3.3.1 (from databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached astroid-3.3.6-py3-none-any.whl.metadata (4.5 kB)
Collecting databricks-labs-blueprint<0.10,>=0.9.1 (from databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached databricks_labs_blueprint-0.9.3-py3-none-any.whl.metadata (55 kB)
Collecting databricks-labs-lsql<0.15,>=0.14.0 (from databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached databricks_labs_lsql-0.14.1-py3-none-any.whl.metadata (8.7 kB)
Collecting databricks-sdk<0.39,>=0.38 (from databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached databricks_sdk-0.38.0-py3-none-any.whl.metadata (38 kB)
Collecting pyyaml<7.0.0,>=6.0.0 (from databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)
Collecting sqlglot<26.1,>=25.5.0 (from databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached sqlglot-26.0.0-py3-none-any.whl.metadata (19 kB)
Collecting requests<3,>=2.28.1 (from databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached requests-2.32.3-py3-none-any.whl.metadata (4.6 kB)
Collecting google-auth~=2.0 (from databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached google_auth-2.37.0-py2.py3-none-any.whl.metadata (4.8 kB)
Collecting cachetools<6.0,>=2.0.0 (from google-auth~=2.0->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached cachetools-5.5.0-py3-none-any.whl.metadata (5.3 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth~=2.0->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached pyasn1_modules-0.4.1-py3-none-any.whl.metadata (3.5 kB)
Collecting rsa<5,>=3.1.4 (from google-auth~=2.0->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached rsa-4.9-py3-none-any.whl.metadata (4.2 kB)
Collecting charset-normalizer<4,>=2 (from requests<3,>=2.28.1->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (34 kB)
Collecting idna<4,>=2.5 (from requests<3,>=2.28.1->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached idna-3.10-py3-none-any.whl.metadata (10 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3,>=2.28.1->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached urllib3-2.2.3-py3-none-any.whl.metadata (6.5 kB)
Collecting certifi>=2017.4.17 (from requests<3,>=2.28.1->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached certifi-2024.8.30-py3-none-any.whl.metadata (2.2 kB)
Collecting pyasn1<0.7.0,>=0.4.6 (from pyasn1-modules>=0.2.1->google-auth~=2.0->databricks-sdk<0.39,>=0.38->databricks-labs-ucx==0.52.1+5320241212164212)
  Using cached pyasn1-0.6.1-py3-none-any.whl.metadata (8.4 kB)
Using cached astroid-3.3.6-py3-none-any.whl (274 kB)
Using cached databricks_labs_blueprint-0.9.3-py3-none-any.whl (61 kB)
Using cached databricks_labs_lsql-0.14.1-py3-none-any.whl (47 kB)
Using cached databricks_sdk-0.38.0-py3-none-any.whl (575 kB)
Using cached PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (767 kB)
Using cached sqlglot-26.0.0-py3-none-any.whl (435 kB)
Using cached google_auth-2.37.0-py2.py3-none-any.whl (209 kB)
Using cached requests-2.32.3-py3-none-any.whl (64 kB)
Using cached cachetools-5.5.0-py3-none-any.whl (9.5 kB)
Using cached certifi-2024.8.30-py3-none-any.whl (167 kB)
Using cached charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (143 kB)
Using cached idna-3.10-py3-none-any.whl (70 kB)
Using cached pyasn1_modules-0.4.1-py3-none-any.whl (181 kB)
Using cached rsa-4.9-py3-none-any.whl (34 kB)
Using cached urllib3-2.2.3-py3-none-any.whl (126 kB)
Using cached pyasn1-0.6.1-py3-none-any.whl (83 kB)
Installing collected packages: urllib3, sqlglot, pyyaml, pyasn1, idna, charset-normalizer, certifi, cachetools, astroid, rsa, requests, pyasn1-modules, google-auth, databricks-sdk, databricks-labs-blueprint, databricks-labs-lsql, databricks-labs-ucx
Successfully installed astroid-3.3.6 cachetools-5.5.0 certifi-2024.8.30 charset-normalizer-3.4.0 databricks-labs-blueprint-0.9.3 databricks-labs-lsql-0.14.1 databricks-labs-ucx-0.52.1+5320241212164212 databricks-sdk-0.38.0 google-auth-2.37.0 idna-3.10 pyasn1-0.6.1 pyasn1-modules-0.4.1 pyyaml-6.0.2 requests-2.32.3 rsa-4.9 sqlglot-26.0.0 urllib3-2.2.3


17:02 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Linting assess_clusters dependency: Dependency</Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+5320241212164212>
17:02 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Unknown language for /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+5320241212164212
17:02 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Unknown language for /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+5320241212164212
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/clusters/get?cluster_id=TEST_DEFAULT_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "ON_DEMAND_AZURE",
<     "first_on_demand": 1,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 16.0,
<   "cluster_id": "TEST_DEFAULT_CLUSTER_ID",
<   "cluster_memory_mb": 65536,
<   "cluster_name": "DEFAULT Test Cluster (Single Node, No Isolation)",
<   "cluster_source": "UI",
<   "creator_user_name": "[email protected]",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "TEST_DEFAULT_CLUSTER_ID",
<     "ClusterName": "DEFAULT Test Cluster (Single Node, No Isolation)",
<     "Creator": "[email protected]",
<     "Owner": "[email protected]",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.8.10",
<     "instance_id": "ecd297b24e314db982774930d4908085",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "f4d5ad933eb4474a99a1bddbe6a3d37b",
<     "private_ip": "10.179.10.10",
<     "public_dns": "",
<     "start_timestamp": 1734020588690
<   },
<   "driver_healthy": true,
<   "driver_instance_source": {
<     "node_type_id": "Standard_D16as_v5"
<   },
<   "driver_node_type_id": "Standard_D16as_v5",
<   "effective_spark_version": "16.0.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_source": {
<     "node_type_id": "Standard_D16as_v5"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1734020737255,
<   "last_restarted_time": 1734020652938,
<   "last_state_loss_time": 1734020652905,
<   "node_type_id": "Standard_D16as_v5",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "runtime_engine": "STANDARD",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 8606136480799558934,
<   "spark_version": "16.0.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {
<       "availability": "ON_DEMAND_AZURE",
<       "first_on_demand": 1,
<       "spot_bid_max_price": -1.0
<     },
<     "cluster_name": "DEFAULT Test Cluster (Single Node, No Isolation)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "driver_node_type_id": "Standard_D16as_v5",
<     "effective_spark_version": "16.0.x-scala2.12",
<     "enable_elastic_disk": true,
<     "enable_local_disk_encryption": false,
<     "node_type_id": "Standard_D16as_v5",
<     "num_workers": 0,
<     "runtime_engine": "STANDARD",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "16.0.x-scala2.12"
<   },
<   "start_time": 1731597943821,
<   "state": "RUNNING",
<   "state_message": ""
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/libraries/cluster-status?cluster_id=TEST_DEFAULT_CLUSTER_ID
< 200 OK
< {
<   "cluster_id": "TEST_DEFAULT_CLUSTER_ID"
< }
17:02 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering assess_dashboards entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+5320241212164212
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+5320241212164212
< 200 OK
< {
<   "created_at": 1734021735813,
<   "language": "PYTHON",
<   "modified_at": 1734021735813,
<   "object_id": 2281209757959743,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+53202412121642... (2 more bytes)",
<   "resource_id": "2281209757959743"
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+5320241212164212
< 200 OK
< {
<   "created_at": 1734021735813,
<   "language": "PYTHON",
<   "modified_at": 1734021735813,
<   "object_id": 2281209757959743,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+53202412121642... (2 more bytes)",
<   "resource_id": "2281209757959743"
< }
17:02 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/wheel-test-runner-0.52.1+5320241212164212&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
17:02 INFO [databricks.labs.ucx.framework.utils:assess_workflows] Invoking command: ['pip', '--disable-pip-version-check', 'install', '/Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.G1dA/wheels/databricks_labs_ucx-0.52.1+5320241212164212-py3-none-any.whl', '-t', '/tmp/ucx-j0krilm2', '--upgrade']
17:02 INFO [databricks.labs.ucx.installer.workflows] ------ END REMOTE LOGS (SO FAR) -----
17:02 INFO [databricks.labs.ucx.install] Deleting UCX v0.52.1+5320241212164214 from https://DATABRICKS_HOST
17:02 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_skwla
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=93516118578678, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=454267994355890, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=41464443407973, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=888952888634695, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1071276815099148, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=101391602916708, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=534018427391106, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=991564615225824, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=631205445908589, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=150005764763184, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=571749269517033, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=5317907626484, as it is no longer needed
17:03 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=146005027313261, as it is no longer needed
17:03 INFO [databricks.labs.ucx.install] Deleting cluster policy
17:03 INFO [databricks.labs.ucx.install] Deleting secret scope
17:03 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw7] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
❌ test_table_migration_job_publishes_remaining_tables[regular]: AssertionError: Workflow failed: assessment (5m58.772s)
AssertionError: Workflow failed: assessment
assert False
 +  where False = validate_step('assessment')
 +    where validate_step = <databricks.labs.ucx.installer.workflows.DeployedWorkflows object at 0x7ff7c51b2f80>.validate_step
 +      where <databricks.labs.ucx.installer.workflows.DeployedWorkflows object at 0x7ff7c51b2f80> = <tests.integration.conftest.MockInstallationContext object at 0x7ff7bce7c100>.deployed_workflows
[gw5] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
17:04 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Y9tA/config.yml) doesn't exist.
17:04 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
17:04 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
17:04 INFO [databricks.labs.ucx.install] Fetching installations...
17:04 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Y9tA is corrupted. Skipping...
17:04 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
17:04 DEBUG [tests.integration.conftest] Waiting for clusters to start...
17:04 DEBUG [tests.integration.conftest] Waiting for clusters to start...
17:04 INFO [databricks.labs.ucx.install] Installing UCX v0.52.1+5320241212170420
17:04 INFO [databricks.labs.ucx.install] Creating ucx schemas...
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
17:04 INFO [databricks.labs.ucx.install] Creating dashboards...
17:04 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
17:04 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
17:04 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
17:04 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Y9tA/README for the next steps.
17:05 INFO [databricks.labs.ucx.progress.install] Installation completed successfully!
17:05 DEBUG [databricks.labs.ucx.installer.workflows] starting assessment job: https://DATABRICKS_HOST#job/300887686745978
17:05 INFO [databricks.labs.ucx.installer.workflows] Started assessment job: https://DATABRICKS_HOST#job/300887686745978/runs/689300026187749
17:05 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of assessment job: https://DATABRICKS_HOST#job/300887686745978/runs/689300026187749
17:08 INFO [databricks.labs.ucx.installer.workflows] Completed assessment job run 689300026187749 with state: RunResultState.CANCELED (Run cancelled by user)
17:08 INFO [databricks.labs.ucx.installer.workflows] Completed assessment job run 689300026187749 duration: 0:03:06.451000 (2024-12-12 17:05:45.795000+00:00 thru 2024-12-12 17:08:52.246000+00:00)
17:08 DEBUG [databricks.labs.ucx.installer.workflows] Validating assessment workflow: https://DATABRICKS_HOST#job/300887686745978
17:04 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Y9tA/config.yml) doesn't exist.
17:04 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
17:04 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
17:04 INFO [databricks.labs.ucx.install] Fetching installations...
17:04 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Y9tA is corrupted. Skipping...
17:04 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
17:04 DEBUG [tests.integration.conftest] Waiting for clusters to start...
17:04 DEBUG [tests.integration.conftest] Waiting for clusters to start...
17:04 INFO [databricks.labs.ucx.install] Installing UCX v0.52.1+5320241212170420
17:04 INFO [databricks.labs.ucx.install] Creating ucx schemas...
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
17:04 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
17:04 INFO [databricks.labs.ucx.install] Creating dashboards...
17:04 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
17:04 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
17:04 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
17:04 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
17:04 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
17:04 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Y9tA/README for the next steps.
17:05 INFO [databricks.labs.ucx.progress.install] Installation completed successfully!
17:05 DEBUG [databricks.labs.ucx.installer.workflows] starting assessment job: https://DATABRICKS_HOST#job/300887686745978
17:05 INFO [databricks.labs.ucx.installer.workflows] Started assessment job: https://DATABRICKS_HOST#job/300887686745978/runs/689300026187749
17:05 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of assessment job: https://DATABRICKS_HOST#job/300887686745978/runs/689300026187749
17:08 INFO [databricks.labs.ucx.installer.workflows] Completed assessment job run 689300026187749 with state: RunResultState.CANCELED (Run cancelled by user)
17:08 INFO [databricks.labs.ucx.installer.workflows] Completed assessment job run 689300026187749 duration: 0:03:06.451000 (2024-12-12 17:05:45.795000+00:00 thru 2024-12-12 17:08:52.246000+00:00)
17:08 DEBUG [databricks.labs.ucx.installer.workflows] Validating assessment workflow: https://DATABRICKS_HOST#job/300887686745978
17:08 INFO [databricks.labs.ucx.install] Deleting UCX v0.52.1+5320241212170420 from https://DATABRICKS_HOST
17:08 ERROR [databricks.labs.ucx.install] Check if /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Y9tA is present
[gw5] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python

Flaky tests:

  • 🤪 test_table_migration_job_refreshes_migration_status[hiveserde-migrate-external-hiveserde-tables-in-place-experimental] (23m15.078s)
  • 🤪 test_table_migration_for_managed_table[managed-migrate-tables] (23m5.46s)
  • 🤪 test_table_migration_job_refreshes_migration_status[regular-migrate-tables] (23m16.333s)
  • 🤪 test_migration_job_ext_hms[regular] (33m10.985s)

Running from acceptance #7725

Review comments (outdated, resolved) on:
- src/databricks/labs/ucx/hive_metastore/workflows.py (3 comments)
- src/databricks/labs/ucx/progress/grants.py
- src/databricks/labs/ucx/progress/history.py
- src/databricks/labs/ucx/progress/tables.py
- tests/integration/hive_metastore/test_workflows.py
- tests/unit/hive_metastore/test_table_migrate.py
It should return all the tables as a snapshot.
…ng the migration index is available during encoding.
There was a marginal benefit to ensuring the migration progress singleton could be initialized prior to loading the snapshot, but it wasn't really worth the eye-catching local.
@asnare asnare requested a review from JCZuurmond December 10, 2024 12:27
@asnare asnare removed the pr/do-not-merge this pull request is not ready to merge label Dec 10, 2024
@job_task(job_cluster="user_isolation")
def verify_progress_tracking_prerequisites(self, ctx: RuntimeContext) -> None:
"""Verify the prerequisites for running this job on the table migration cluster are fulfilled."""
ctx.verify_progress_tracking.verify(timeout=dt.timedelta(hours=1))
This forces the UCX catalog to be created before table migration, while it was not a prerequisite before.
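For context, the `verify(timeout=...)` call in the task above can be sketched as a simple polling check that either passes within the deadline or fails the task. This is a hypothetical illustration (class name, constructor, and polling behaviour are assumptions; the actual UCX `VerifyProgressTracking` implementation differs):

```python
import datetime as dt
import time
from typing import Callable


class PrerequisiteVerifier:
    """Hypothetical sketch: poll a prerequisite check until it passes or a deadline expires."""

    def __init__(self, check: Callable[[], bool], poll_interval: float = 1.0):
        self._check = check  # returns True once prerequisites (e.g. the UCX catalog) are met
        self._poll_interval = poll_interval

    def verify(self, timeout: dt.timedelta = dt.timedelta(hours=1)) -> None:
        deadline = time.monotonic() + timeout.total_seconds()
        while True:
            if self._check():
                return  # prerequisites fulfilled; let the workflow task continue
            if time.monotonic() >= deadline:
                raise TimeoutError("progress-tracking prerequisites not met in time")
            # sleep until the next poll, but never past the deadline
            time.sleep(min(self._poll_interval, max(deadline - time.monotonic(), 0.0)))
```

A workflow task would then call `verifier.verify(timeout=dt.timedelta(hours=1))` up front, so the run fails fast with a clear error if the prerequisites never appear.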

github-merge-queue bot pushed a commit that referenced this pull request Dec 13, 2024
## Changes
Exclude TACL migration from the table-migration integration tests because it was not asserted, to speed up the tests, and to reduce flakiness.

### Linked issues

Attempt to reduce flakiness blocking CI in #3239
Similar to #3437 in the sense that both PRs scope integration tests to a
smaller set of resources

### Tests

- [x] modified integration tests
@JCZuurmond JCZuurmond removed their assignment Dec 16, 2024
@JCZuurmond
Member

@asnare: When you are back, rebase with main and check if the CI still fails; the linked PRs make the tests more robust.

Labels
- enhancement: New feature or request
- feat/migration-progress: Issues related to the migration progress workflow
- migrate/external: go/uc/upgrade SYNC EXTERNAL TABLES step
- migrate/jobs: Step 5 - Upgrading Jobs for External Tables
- migrate/managed: go/uc/upgrade Upgrade Managed Tables and Jobs
- pr/do-not-merge: this pull request is not ready to merge
Projects
Status: Blocked/Hold
Development

4 participants