
Release v0.7.5#229

Open
gueniai wants to merge 1 commit into main from prepare/0.7.5

Conversation


@gueniai (Collaborator) commented Mar 23, 2026


* Explicit support and testing against Python 3.13 and 3.14 ([#228](#228)). The project's configuration and workflows have been updated to improve testing, dependency management, and support for the latest Python versions. The acceptance workflow now runs integration tests for draft pull requests, and the GitHub Actions workflows use the latest action versions (`actions/checkout@v6`, `actions/setup-python@v6`) with Python upgraded to 3.14. The environment variables `HATCH_VERBOSE` (set to 2) and `HATCH_VERSION` (pinning Hatch to 1.16.5) have been added, and the development tooling, including mypy, pylint, and pytest, has been refreshed. The test configuration now covers integration tests and matrix testing across multiple Python versions. Code updates include revised type hints, the removal of unnecessary type parameters, and a new `_not_none` function to handle potential `None` values.
* Fixed issue 216 - name is required in serving config ([#224](#224)). The library has been updated for consistency with the actual API behavior. The `_make_workspace_client` function now passes the `scopes` parameter as the string `"all-apis"`, matching the `ClientCredentials` signature. The `EndpointCoreConfigInput` now receives the required `name` parameter, and `served_entities` is used instead of the deprecated `served_models`. The `create` function builds the endpoint with these inputs and returns a `ServingEndpointDetailed` object whose `pending_config` includes both `served_models` and `served_entities` for compatibility. Several test functions have been updated to read the `entity_name` and `entity_version` attributes from the `served_entities` list, verifying the behavior of `make_serving_endpoint` under the new configuration.
* [make_job] Allow passing serverless environments ([#227](#227)). The `make_job` function now accepts an optional `environments` keyword argument: a list of serverless environments, which is required for running Spark Python tasks and can override the notebook environment for Databricks Notebook tasks. It can be combined with the existing keyword arguments to create a job with a specific set of tasks and environments. Existing behavior is unchanged when the argument is omitted, as verified by updated test cases.
@gueniai gueniai requested a review from nfx as a code owner March 23, 2026 18:00
@github-actions

✅ 40/40 passed, 6 skipped, 14m59s total

Running from acceptance #241

@asnare (Contributor) left a comment


Sigh, the tests are all failing due to the SDK issue I mentioned in standup: Config from the SDK now makes a network call to enrich itself based on workspace metadata. For unit tests this fails because the host doesn't exist, and the initialiser goes into a fruitless retry-loop until the test timeout is hit.

