
Update Switch documentation for informatica and custom_etl#2310

Open
yyoli-db wants to merge 1 commit into main from switch_doc_informatica_support
Conversation

@yyoli-db (Contributor) commented Feb 25, 2026

What does this PR do?

Update switch documentation for https://github.com/databrickslabs/switch/pull/68

Relevant implementation details

Caveats/things to watch out for when reviewing:

Linked issues

Resolves #..

Functionality

  • added relevant user documentation
  • added new CLI command
  • modified existing command: databricks labs lakebridge ...
  • ... +add your own

Tests

  • manually tested
  • added unit tests
  • added integration tests


codecov bot commented Feb 25, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 66.44%. Comparing base (87c4e12) to head (e71210e).

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2310   +/-   ##
=======================================
  Coverage   66.44%   66.44%           
=======================================
  Files          99       99           
  Lines        9090     9090           
  Branches      974      974           
=======================================
  Hits         6040     6040           
  Misses       2874     2874           
  Partials      176      176           


@github-actions

✅ 143/143 passed, 7 flaky, 5 skipped, 31m45s total

Flaky tests:

  • 🤪 test_installs_and_runs_local_bladebridge (20.359s)
  • 🤪 test_installs_and_runs_pypi_bladebridge (26.168s)
  • 🤪 test_transpiles_informatica_to_sparksql_non_interactive[False] (18.703s)
  • 🤪 test_transpile_teradata_sql_non_interactive[False] (20.187s)
  • 🤪 test_transpile_teradata_sql_non_interactive[True] (22.644s)
  • 🤪 test_transpile_teradata_sql (6.551s)
  • 🤪 test_transpiles_informatica_to_sparksql_non_interactive[True] (3.848s)

Running from acceptance #3936

```yaml
sdp_language: "python"
```

Next, edit `custom_etl.yml` (located in `.lakebridge/switch/resources/builtin_prompts/` in your workspace root) to include your custom instructions. You can use the other prompt YAML files in that directory as examples.
Contributor

I'm assuming users would replace the placeholders ({ADD_YOUR_CUSTOM_DEFINITION_HERE}, {ADD_YOUR_CUSTOM_INSTRUCTIONS_HERE}, {ADD_YOUR_CUSTOM_INPUT_HERE}, {ADD_YOUR_CUSTOM_OUTPUT_HERE}) with their own content? If so, it'd be good to mention that explicitly. The current wording just says to edit custom_etl.yml but doesn't call out which placeholders to replace.
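To illustrate the reviewer's point, a filled-in `custom_etl.yml` might look like the sketch below. The key names and structure here are assumptions inferred from the placeholder names, not the file's actual schema, and the example content is invented:

```yaml
# Hypothetical sketch -- the real key names in custom_etl.yml may differ.
# Each value replaces the corresponding template placeholder.
definition: |
  # replaces {ADD_YOUR_CUSTOM_DEFINITION_HERE}
  Convert shell-based legacy ETL scripts to PySpark notebooks.
instructions: |
  # replaces {ADD_YOUR_CUSTOM_INSTRUCTIONS_HERE}
  Preserve source table names; map cron schedules to job triggers.
input: |
  # replaces {ADD_YOUR_CUSTOM_INPUT_HERE}
  Bash scripts invoking sqlplus with embedded SQL.
output: |
  # replaces {ADD_YOUR_CUSTOM_OUTPUT_HERE}
  PySpark code targeting Unity Catalog tables.
```

Spelling out the placeholder-to-content mapping this way would address the ambiguity the reviewer describes.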

### Example 5: Informatica ETL to Databricks SDP in SQL

-Convert PySpark ETL workload to Databricks Spark Declarative Pipeline in SQL:
+Convert Informatica ETL workload to Databricks Spark Declarative Pipeline in SQL:
Contributor

Just to confirm: Could you share the reasoning behind replacing the PySpark ETL example with Informatica? I'm guessing it was to keep the number of examples manageable?
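For readers unfamiliar with the conversion target named in this example, a Spark Declarative Pipeline expressed in SQL looks roughly like the following. This is a generic sketch, not output produced by Switch, and the table and column names are invented for illustration:

```sql
-- Illustrative only: an SDP-in-SQL pipeline of the kind the
-- Informatica conversion targets. Names are made up.
CREATE OR REFRESH STREAMING TABLE raw_orders
AS SELECT * FROM STREAM read_files('/Volumes/main/etl/landing/orders/');

CREATE OR REFRESH MATERIALIZED VIEW daily_order_totals
AS SELECT order_date, SUM(amount) AS total_amount
FROM raw_orders
GROUP BY order_date;
```

The streaming table ingests raw files incrementally, and the materialized view defines a derived aggregate that the pipeline keeps up to date.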
