Add SQL Server Profiler#2151

Open
goodwillpunning wants to merge 19 commits intomainfrom
feature/add_mssql_profiler

Conversation

@goodwillpunning
Contributor

Changes

What does this PR do?

This PR adds a new profiler for Microsoft SQL Server (mssql).

Relevant implementation details

This profiler closely follows the implementation of the Azure Synapse profiler. However, this implementation provides a last_execution_time param for all mssql server queries, allowing a future scheduler component to hook in and repeat queries.
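As a rough illustration of the idea above (not the PR's actual code; the function and column names are illustrative, though `sys.dm_exec_query_stats` and its `last_execution_time` column are real SQL Server DMV features), a profiler query that accepts a watermark might look like this:

```python
from datetime import datetime
from typing import Optional


def build_activity_query(last_execution_time: Optional[datetime] = None) -> str:
    """Build a T-SQL query over sys.dm_exec_query_stats, optionally filtered
    to executions after a watermark supplied by a scheduler."""
    query = (
        "SELECT query_hash, execution_count, last_execution_time "
        "FROM sys.dm_exec_query_stats"
    )
    if last_execution_time is not None:
        # A future scheduler would pass the previous run's watermark here,
        # so each repeat run only fetches newly executed queries.
        query += (
            " WHERE last_execution_time > "
            f"'{last_execution_time.isoformat(sep=' ')}'"
        )
    return query


# Full extract on the first run, incremental on subsequent runs.
print(build_activity_query())
print(build_activity_query(datetime(2025, 11, 17, 16, 20)))
```

The point is only that every extraction query takes the optional watermark, so a scheduler can re-invoke the same extraction incrementally rather than re-pulling all history.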

Caveats/things to watch out for when reviewing:

This profiler will not support on-prem mssql servers.

Linked issues

N/A

Functionality

  • added relevant user documentation
  • added new CLI command
  • modified existing command: databricks labs lakebridge ...
  • added new profiler component
  • ... +add your own

Tests

  • manually tested
  • added unit tests
  • added integration tests

@goodwillpunning goodwillpunning requested a review from a team as a code owner November 17, 2025 16:20
@goodwillpunning goodwillpunning added the feat/profiler Issues related to profilers label Nov 17, 2025
@codecov

codecov bot commented Nov 17, 2025

Codecov Report

❌ Patch coverage is 0% with 171 lines in your changes missing coverage. Please review.
✅ Project coverage is 13.47%. Comparing base (65ce810) to head (42d0028).
⚠️ Report is 2 commits behind head on main.

Files with missing lines Patch % Lines
...bridge/resources/assessments/mssql/info_extract.py 0.00% 71 Missing ⚠️
...ge/resources/assessments/mssql/activity_extract.py 0.00% 46 Missing ⚠️
...idge/resources/assessments/mssql/common/queries.py 0.00% 44 Missing ⚠️
...ge/resources/assessments/mssql/common/connector.py 0.00% 5 Missing ⚠️
...ge/resources/assessments/mssql/common/functions.py 0.00% 4 Missing ⚠️
...tabricks/labs/lakebridge/assessments/_constants.py 0.00% 1 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##             main    #2151       +/-   ##
===========================================
- Coverage   65.23%   13.47%   -51.77%     
===========================================
  Files         100      105        +5     
  Lines        8504     8673      +169     
  Branches      875      875               
===========================================
- Hits         5548     1169     -4379     
- Misses       2769     7445     +4676     
+ Partials      187       59      -128     


@github-actions

github-actions bot commented Nov 17, 2025

❌ 129/130 passed, 8 flaky, 1 failed, 5 skipped, 30m28s total

❌ test_recon_databricks_job_succeeds: TimeoutError: timed out after 0:20:00: (20m47.393s)
TimeoutError: timed out after 0:20:00:
17:00 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_c7y70emer catalog: https://DATABRICKS_HOST/#explore/data/dummy_c7y70emer
17:00 DEBUG [databricks.labs.pytester.fixtures.baseline] added catalog fixture: CatalogInfo(browse_only=False, catalog_type=<CatalogType.MANAGED_CATALOG: 'MANAGED_CATALOG'>, comment=None, connection_name=None, created_at=1770742838553, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', effective_predictive_optimization_flag=EffectivePredictiveOptimizationFlag(value=<EnablePredictiveOptimization.DISABLE: 'DISABLE'>, inherited_from_name='primary', inherited_from_type=None), enable_predictive_optimization=<EnablePredictiveOptimization.INHERIT: 'INHERIT'>, full_name='dummy_c7y70emer', isolation_mode=<CatalogIsolationMode.OPEN: 'OPEN'>, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_c7y70emer', options=None, owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', properties={'RemoveAfter': '2026021019'}, provider_name=None, provisioning_info=None, securable_type=<SecurableType.CATALOG: 'CATALOG'>, share_name=None, storage_location=None, storage_root=None, updated_at=1770742838553, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e')
17:00 INFO [tests.integration.reconcile.conftest] Created catalog dummy_c7y70emer for recon tests
17:00 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_c7y70emer.dummy_sdnkweryn schema: https://DATABRICKS_HOST/#explore/data/dummy_c7y70emer/dummy_sdnkweryn
17:00 DEBUG [databricks.labs.pytester.fixtures.baseline] added schema fixture: SchemaInfo(browse_only=None, catalog_name='dummy_c7y70emer', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='dummy_c7y70emer.dummy_sdnkweryn', metastore_id=None, name='dummy_sdnkweryn', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
17:00 INFO [tests.integration.reconcile.conftest] Created schema dummy_sdnkweryn in catalog dummy_c7y70emer for recon tests
17:00 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_sdnkweryn volume: https://DATABRICKS_HOST/#explore/data/dummy_c7y70emer/dummy_sdnkweryn/dummy_sdnkweryn
17:00 DEBUG [databricks.labs.pytester.fixtures.baseline] added volume fixture: VolumeInfo(access_point=None, browse_only=None, catalog_name='dummy_c7y70emer', comment=None, created_at=1770742848084, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', encryption_details=None, full_name='dummy_c7y70emer.dummy_sdnkweryn.dummy_sdnkweryn', metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_sdnkweryn', owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', schema_name='dummy_sdnkweryn', storage_location='abfss://labs-CLOUD_ENV-TEST_CATALOG-container@databrickslabsstorage.dfs.core.windows.net/8952c1e3-b265-4adf-98c3-6f755e2e1453/volumes/69f8c236-2ce7-46f4-8455-e2ad2cf173ab', updated_at=1770742848084, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', volume_id='69f8c236-2ce7-46f4-8455-e2ad2cf173ab', volume_type=<VolumeType.MANAGED: 'MANAGED'>)
17:00 INFO [tests.integration.reconcile.test_recon_databricks] Using recon job overrides: ReconcileJobConfig(existing_cluster_id='DATABRICKS_CLUSTER_ID', tags={'RemoveAfter': '2026021019'})
17:00 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_c7y70emer.dummy_sdnkweryn.dummy_tjlvpkjwm schema: https://DATABRICKS_HOST/#explore/data/dummy_c7y70emer/dummy_sdnkweryn/dummy_tjlvpkjwm
17:00 DEBUG [databricks.labs.pytester.fixtures.baseline] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_c7y70emer', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_c7y70emer.dummy_sdnkweryn.dummy_tjlvpkjwm', metastore_id=None, name='dummy_tjlvpkjwm', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026021019'}, row_filter=None, schema_name='dummy_sdnkweryn', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_sdnkweryn/dummy_tjlvpkjwm', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:00 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_c7y70emer.dummy_sdnkweryn.dummy_tq6jxx7gs schema: https://DATABRICKS_HOST/#explore/data/dummy_c7y70emer/dummy_sdnkweryn/dummy_tq6jxx7gs
17:00 DEBUG [databricks.labs.pytester.fixtures.baseline] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_c7y70emer', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_c7y70emer.dummy_sdnkweryn.dummy_tq6jxx7gs', metastore_id=None, name='dummy_tq6jxx7gs', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026021019'}, row_filter=None, schema_name='dummy_sdnkweryn', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_sdnkweryn/dummy_tq6jxx7gs', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:00 INFO [tests.integration.reconcile.conftest] Created recon tables dummy_tjlvpkjwm, dummy_tq6jxx7gs in schema dummy_sdnkweryn
17:00 INFO [tests.integration.reconcile.conftest] Inserted data into table dummy_tjlvpkjwm and got response StatementStatus(error=None, state=<StatementState.SUCCEEDED: 'SUCCEEDED'>)
17:00 INFO [tests.integration.reconcile.conftest] Inserted data into table dummy_tq6jxx7gs and got response StatementStatus(error=None, state=<StatementState.SUCCEEDED: 'SUCCEEDED'>)
17:00 INFO [tests.integration.reconcile.test_recon_databricks] Setting up application context for recon tests
17:00 INFO [tests.integration.reconcile.test_recon_databricks] Installing app and recon configuration into workspace
17:00 DEBUG [databricks.labs.lakebridge.install] No existing version found in workspace; assuming fresh installation.
17:01 INFO [databricks.labs.lakebridge.install] Installing Lakebridge reconcile Metadata components.
17:01 INFO [databricks.labs.lakebridge.deployment.recon] Installing reconcile components.
17:01 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation metadata tables.
17:01 INFO [databricks.labs.lakebridge.deployment.table] Deploying table main in dummy_c7y70emer.dummy_sdnkweryn
17:01 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:01 INFO [databricks.labs.lakebridge.deployment.table] Deploying table metric in dummy_c7y70emer.dummy_sdnkweryn
17:01 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:01 INFO [databricks.labs.lakebridge.deployment.table] Deploying table detai in dummy_c7y70emer.dummy_sdnkweryn
17:01 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:01 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_metric in dummy_c7y70emer.dummy_sdnkweryn
17:01 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:01 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_detai in dummy_c7y70emer.dummy_sdnkweryn
17:01 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:01 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_rule in dummy_c7y70emer.dummy_sdnkweryn
17:01 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:01 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation dashboards.
17:01 INFO [databricks.labs.lakebridge.deployment.dashboard] Deploying dashboards from base folder /home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/resources/reconcile/dashboards
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 INFO [databricks.labs.lakebridge.deployment.dashboard] Dashboard deployed with URL: https://DATABRICKS_HOST/sql/dashboardsv3/01f106a21a4310cc9b155656ebd1e9e1
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 INFO [databricks.labs.lakebridge.deployment.dashboard] Dashboard deployed with URL: https://DATABRICKS_HOST/sql/dashboardsv3/01f106a21b861a7cb084e1ebca350cc4
17:01 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation jobs.
17:01 INFO [databricks.labs.lakebridge.deployment.job] Deploying reconciliation job.
17:01 DEBUG [databricks.labs.lakebridge.deployment.job] Applying deployment overrides: ReconcileJobConfig(existing_cluster_id='DATABRICKS_CLUSTER_ID', tags={'RemoveAfter': '2026021019'})
17:01 DEBUG [databricks.labs.lakebridge.deployment.job] Reconciliation job task cluster: existing: DATABRICKS_CLUSTER_ID or name: None
17:01 INFO [databricks.labs.lakebridge.deployment.job] Creating new job configuration for job `Reconciliation Runner`
17:01 INFO [databricks.labs.lakebridge.deployment.job] Reconciliation job deployed with job_id=766135390220606
17:01 INFO [databricks.labs.lakebridge.deployment.job] Job URL: https://DATABRICKS_HOST#job/766135390220606
17:01 INFO [databricks.labs.lakebridge.deployment.recon] Installation of reconcile components completed successfully.
17:01 INFO [tests.integration.reconcile.test_recon_databricks] Application context setup complete for recon tests
[gw7] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
17:01 DEBUG [databricks.labs.lakebridge.reconcile.runner] Reconcile job id found in the install state.
17:01 INFO [databricks.labs.lakebridge.reconcile.runner] Triggering the reconcile job with job_id: `766135390220606`
17:01 INFO [databricks.labs.lakebridge.reconcile.runner] 'RECONCILE' job started. Please check the job_url `https://DATABRICKS_HOST/jobs/766135390220606/runs/76236922761755` for the current status.
17:21 INFO [tests.integration.reconcile.test_recon_databricks] Reconcile job run had 1 tasks
17:21 INFO [tests.integration.reconcile.test_recon_databricks] Task run_reconciliation has error message: No output is available until the task begins.
17:01 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_detai in dummy_c7y70emer.dummy_sdnkweryn
17:01 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:01 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_rule in dummy_c7y70emer.dummy_sdnkweryn
17:01 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
17:01 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation dashboards.
17:01 INFO [databricks.labs.lakebridge.deployment.dashboard] Deploying dashboards from base folder /home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/resources/reconcile/dashboards
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 INFO [databricks.labs.lakebridge.deployment.dashboard] Dashboard deployed with URL: https://DATABRICKS_HOST/sql/dashboardsv3/01f106a21a4310cc9b155656ebd1e9e1
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
17:01 INFO [databricks.labs.lakebridge.deployment.dashboard] Dashboard deployed with URL: https://DATABRICKS_HOST/sql/dashboardsv3/01f106a21b861a7cb084e1ebca350cc4
17:01 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation jobs.
17:01 INFO [databricks.labs.lakebridge.deployment.job] Deploying reconciliation job.
17:01 DEBUG [databricks.labs.lakebridge.deployment.job] Applying deployment overrides: ReconcileJobConfig(existing_cluster_id='DATABRICKS_CLUSTER_ID', tags={'RemoveAfter': '2026021019'})
17:01 DEBUG [databricks.labs.lakebridge.deployment.job] Reconciliation job task cluster: existing: DATABRICKS_CLUSTER_ID or name: None
17:01 INFO [databricks.labs.lakebridge.deployment.job] Creating new job configuration for job `Reconciliation Runner`
17:01 INFO [databricks.labs.lakebridge.deployment.job] Reconciliation job deployed with job_id=766135390220606
17:01 INFO [databricks.labs.lakebridge.deployment.job] Job URL: https://DATABRICKS_HOST#job/766135390220606
17:01 INFO [databricks.labs.lakebridge.deployment.recon] Installation of reconcile components completed successfully.
17:01 INFO [tests.integration.reconcile.test_recon_databricks] Application context setup complete for recon tests
17:01 DEBUG [databricks.labs.lakebridge.reconcile.runner] Reconcile job id found in the install state.
17:01 INFO [databricks.labs.lakebridge.reconcile.runner] Triggering the reconcile job with job_id: `766135390220606`
17:01 INFO [databricks.labs.lakebridge.reconcile.runner] 'RECONCILE' job started. Please check the job_url `https://DATABRICKS_HOST/jobs/766135390220606/runs/76236922761755` for the current status.
17:21 INFO [tests.integration.reconcile.test_recon_databricks] Reconcile job run had 1 tasks
17:21 INFO [tests.integration.reconcile.test_recon_databricks] Task run_reconciliation has error message: No output is available until the task begins.
17:21 INFO [tests.integration.reconcile.test_recon_databricks] Tearing down application context for recon tests
17:21 INFO [databricks.labs.lakebridge.install] Uninstalling Lakebridge from https://DATABRICKS_HOST.
17:21 INFO [databricks.labs.lakebridge.deployment.recon] Uninstalling reconcile components.
17:21 INFO [databricks.labs.lakebridge.deployment.recon] Removing reconciliation dashboards.
17:21 INFO [databricks.labs.lakebridge.deployment.recon] Removing dashboard with id=01f106a21a4310cc9b155656ebd1e9e1.
17:21 INFO [databricks.labs.lakebridge.deployment.recon] Removing dashboard with id=01f106a21b861a7cb084e1ebca350cc4.
17:21 INFO [databricks.labs.lakebridge.deployment.recon] Removing Reconciliation Jobs.
17:21 INFO [databricks.labs.lakebridge.deployment.recon] Removing job Reconciliation Runner with job_id=766135390220606.
17:21 INFO [databricks.labs.lakebridge.install] Uninstallation completed successfully.
17:21 INFO [tests.integration.reconcile.test_recon_databricks] Application context teardown complete for recon tests
17:21 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 2 table fixtures
17:21 DEBUG [databricks.labs.pytester.fixtures.baseline] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_c7y70emer', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_c7y70emer.dummy_sdnkweryn.dummy_tjlvpkjwm', metastore_id=None, name='dummy_tjlvpkjwm', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026021019'}, row_filter=None, schema_name='dummy_sdnkweryn', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_sdnkweryn/dummy_tjlvpkjwm', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:21 DEBUG [databricks.labs.pytester.fixtures.baseline] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_c7y70emer', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_c7y70emer.dummy_sdnkweryn.dummy_tq6jxx7gs', metastore_id=None, name='dummy_tq6jxx7gs', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026021019'}, row_filter=None, schema_name='dummy_sdnkweryn', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_sdnkweryn/dummy_tq6jxx7gs', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
17:21 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 volume fixtures
17:21 DEBUG [databricks.labs.pytester.fixtures.baseline] removing volume fixture: VolumeInfo(access_point=None, browse_only=None, catalog_name='dummy_c7y70emer', comment=None, created_at=1770742848084, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', encryption_details=None, full_name='dummy_c7y70emer.dummy_sdnkweryn.dummy_sdnkweryn', metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_sdnkweryn', owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', schema_name='dummy_sdnkweryn', storage_location='abfss://labs-CLOUD_ENV-TEST_CATALOG-container@databrickslabsstorage.dfs.core.windows.net/8952c1e3-b265-4adf-98c3-6f755e2e1453/volumes/69f8c236-2ce7-46f4-8455-e2ad2cf173ab', updated_at=1770742848084, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', volume_id='69f8c236-2ce7-46f4-8455-e2ad2cf173ab', volume_type=<VolumeType.MANAGED: 'MANAGED'>)
17:21 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 schema fixtures
17:21 DEBUG [databricks.labs.pytester.fixtures.baseline] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='dummy_c7y70emer', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='dummy_c7y70emer.dummy_sdnkweryn', metastore_id=None, name='dummy_sdnkweryn', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
17:21 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 catalog fixtures
17:21 DEBUG [databricks.labs.pytester.fixtures.baseline] removing catalog fixture: CatalogInfo(browse_only=False, catalog_type=<CatalogType.MANAGED_CATALOG: 'MANAGED_CATALOG'>, comment=None, connection_name=None, created_at=1770742838553, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', effective_predictive_optimization_flag=EffectivePredictiveOptimizationFlag(value=<EnablePredictiveOptimization.DISABLE: 'DISABLE'>, inherited_from_name='primary', inherited_from_type=None), enable_predictive_optimization=<EnablePredictiveOptimization.INHERIT: 'INHERIT'>, full_name='dummy_c7y70emer', isolation_mode=<CatalogIsolationMode.OPEN: 'OPEN'>, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_c7y70emer', options=None, owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', properties={'RemoveAfter': '2026021019'}, provider_name=None, provisioning_info=None, securable_type=<SecurableType.CATALOG: 'CATALOG'>, share_name=None, storage_location=None, storage_root=None, updated_at=1770742838553, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e')
[gw7] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python

Flaky tests:

  • 🤪 test_installs_and_runs_local_bladebridge (19.962s)
  • 🤪 test_installs_and_runs_pypi_bladebridge (28.055s)
  • 🤪 test_transpiles_informatica_to_sparksql_non_interactive[True] (14.515s)
  • 🤪 test_transpiles_informatica_to_sparksql_non_interactive[False] (14.567s)
  • 🤪 test_transpiles_informatica_to_sparksql (14.22s)
  • 🤪 test_transpile_teradata_sql_non_interactive[True] (16.08s)
  • 🤪 test_transpile_teradata_sql (5.682s)
  • 🤪 test_transpile_teradata_sql_non_interactive[False] (13.101s)

Running from acceptance #3666

from databricks.labs.lakebridge.connections.database_manager import DatabaseManager


def get_sqlserver_reader(
The database manager should already have the MSSQLConnector; we should be using that.

steps:
- name: activity_extract
type: python
extract_source: src/databricks/labs/lakebridge/resources/assessments/mssql/activity_extract.py
Can't we just run SQL queries directly? Do we need a Python wrapper for the SQL Server profiler?
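One thing a Python step enables that a plain SQL step would not is the parameterisation the PR description mentions: threading a `last_execution_time` watermark through every query so a future scheduler can re-run extracts incrementally. A sketch of that pattern, with a hypothetical query and helper (the real extract scripts and their query text may differ; `sys.dm_exec_requests` is a real SQL Server DMV, used here only as an example):

```python
# Illustrative only: render an activity-extract query with a bind parameter
# for the last_execution_time watermark, pyodbc-style ("?" placeholders).
from datetime import datetime, timezone

# Hypothetical query text; the actual profiler queries live under
# resources/assessments/mssql/common/queries.py.
ACTIVITY_QUERY = """
SELECT session_id, start_time, command
FROM sys.dm_exec_requests
WHERE start_time > ?
"""


def render_activity_extract(last_execution_time: datetime) -> tuple[str, tuple]:
    """Return the query plus bind parameters for a pyodbc-style cursor,
    so a scheduler can re-invoke it with a fresh watermark each run."""
    return ACTIVITY_QUERY, (last_execution_time,)


query, params = render_activity_extract(
    datetime(2025, 11, 17, tzinfo=timezone.utc)
)
print("?" in query, len(params))  # → True 1
```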
