Releases · databrickslabs/pytester
v0.2.4
- Fixed PyPI metadata (#54). In this commit, the PyPI metadata for the pytester project has been updated with the new repository location at https://github.com/databrickslabs/pytester. The URLs for issues and source have been changed to point to the new repository, with the `issues` URL now directing to https://github.com/databrickslabs/pytester/issues and the `source` URL to https://github.com/databrickslabs/pytester. Furthermore, the versioning tool `hatch` has been configured to manage the version number in the `src/databricks/labs/pytester/about.py` file. This ensures accurate and consistent versioning for the pytester project moving forward.
- Improved `make_group`/`make_acc_group` fixture consistency (#50). This PR introduces improvements to the `make_group` and `make_acc_group` fixtures, designed for managing Databricks workspace groups. The enhancements include a double-check approach that ensures group visibility by requiring the group to be retrievable via both `.get()` and `.list()` calls. This mitigates, but does not entirely eliminate, consistency issues with the APIs used for managing groups. The `wait_for_provisioning` argument has been removed and replaced with an internal wait mechanism; the argument is still accepted but triggers a deprecation warning. Internal unit-test plumbing has been updated to use mock fixtures tailored to each test, making the double-check implementation testable. New and updated unit tests are included in the `test_iam.py` file, along with a new `_setup_groups_api` function, which mocks specific clients to ensure groups are visible once created. These changes improve consistency and reliability when working with Databricks workspace groups, making the project easier to adopt.
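The double-check visibility wait can be sketched as a small retry loop. The `FakeGroupsApi` below is a hypothetical in-memory stand-in for the real SDK clients (not the library's actual API), used only to illustrate the pattern of requiring both `.get()` and `.list()` to agree:

```python
import time

class GroupNotVisibleError(RuntimeError):
    """Raised when a group never becomes visible via both calls."""

def wait_until_visible(api, group_id, timeout=10.0, interval=0.1):
    """Require the group to be retrievable via BOTH .get() and .list()
    before considering it provisioned (the double-check approach)."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        seen_by_get = api.get(group_id) is not None
        seen_by_list = any(g["id"] == group_id for g in api.list())
        if seen_by_get and seen_by_list:
            return True
        time.sleep(interval)
    raise GroupNotVisibleError(group_id)

class FakeGroupsApi:
    """In-memory stand-in: .list() lags behind .get() for a few calls,
    mimicking eventual consistency between the two endpoints."""
    def __init__(self):
        self._groups = {}
        self._list_lag = 2  # .list() misses the group for 2 calls

    def create(self, group_id):
        self._groups[group_id] = {"id": group_id}

    def get(self, group_id):
        return self._groups.get(group_id)

    def list(self):
        if self._list_lag > 0:
            self._list_lag -= 1
            return []
        return list(self._groups.values())

api = FakeGroupsApi()
api.create("g-123")
assert wait_until_visible(api, "g-123")
```

As the changelog notes, a wait like this mitigates but cannot fully eliminate the inconsistency: a group that is visible now may still disappear from one endpoint momentarily.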
v0.2.3
- Support providing name in `make_catalog` fixture (#52). The `make_catalog` fixture in our open-source library has been updated to allow users to specify a name for the catalog using a new `name` parameter. Previously the catalog was given a random name; now users have more control and customization over catalog names in their tests. This change includes updates to the docstring and new unit tests ensuring the fixture behaves as expected with the new parameter. Additionally, the underlying `call_stateful` function was updated to expect a callable that returns a generator of callables, enabling support for providing a name. The `test_make_catalog_creates_catalog_with_name` and `test_make_catalog` tests verify the behavior of the fixture with the new `name` parameter.
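The "callable returning a generator of callables" shape that `call_stateful` now expects can be illustrated with a minimal, self-contained sketch. `catalog_factory`, the default name, and the return shapes here are illustrative assumptions, not the library's actual implementation:

```python
def call_stateful(factory, **kwargs):
    """Drive a factory that returns a generator of callables: the
    yielded callable creates the resource, and the generator's
    finally-block removes it when the generator is closed."""
    gen = factory()
    create = next(gen)          # first yield: the create callable
    resource = create(**kwargs)
    return resource, gen.close  # gen.close() triggers the teardown

def catalog_factory():
    created = []

    def create(name=None):
        # Random name unless the caller provides one (the #52 behavior).
        name = name or "dummy-random"
        created.append(name)
        return {"name": name}

    try:
        yield create
    finally:
        created.clear()         # teardown: drop created catalogs

catalog, cleanup = call_stateful(catalog_factory, name="my_catalog")
assert catalog["name"] == "my_catalog"
cleanup()
```

The `try`/`finally` around the `yield` is what makes `gen.close()` run the teardown, mirroring how pytest runs the code after a fixture's `yield`.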
Contributors: @asnare, @JCZuurmond
v0.2.2
- Use watchdog timeout in catalog properties (#48). This pull request introduces a new `RemoveAfter` property for catalogs, which allows marking them for skipping by the watchdog. This addresses a gap in the current implementation, which did not explicitly indicate when catalogs were in use. The new property specifies the time from which objects can be purged. A corresponding fixture, `watchdog_remove_after`, has been added to the list of available fixtures, and the `make_catalog` fixture has been updated to set this new property. Additionally, a timeout mechanism for catalogs has been implemented, which improves the system's efficiency and safety by marking catalogs as in use. A test for the `make_catalog` function ensures that the `RemoveAfter` entry is correctly added to the catalog properties, although the specific call parameters for the `catalogs.create` method cannot be accurately determined in the test.
- Use tags instead of name suffix for queries (#47). This release updates the naming conventions for queries in the testing library for Databricks, improving readability and comprehension. The previous implementation used name suffixes, which have been replaced with watchdog query tags. The `watchdog_purge_suffix` fixture has been renamed to `watchdog_remove_after`, and the new `make_query` fixture has been added to the documentation. In addition, the `make_query` and `create` functions now accept an optional `tags` argument, and the query name is generated with a unique identifier. If `tags` are provided, the `RemoveAfter` tag is added. The `original_query_tag` is no longer hardcoded in the `create` function and has been removed. These changes improve the overall user experience and maintainability of the project.
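A `RemoveAfter` marker like the one described above can be derived from a purge timeout. The timeout value and the hour-resolution timestamp format below are assumptions for illustration, not the project's actual constants:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical purge timeout; the real value lives in the watchdog fixture.
TEST_RESOURCE_PURGE_TIMEOUT = timedelta(hours=1)

def remove_after_property(now=None):
    """Build a properties/tags entry carrying the time from which
    the watchdog may purge the object."""
    now = now or datetime.now(timezone.utc)
    purge_at = now + TEST_RESOURCE_PURGE_TIMEOUT
    # Assumed hour-resolution format, e.g. "2024010112".
    return {"RemoveAfter": purge_at.strftime("%Y%m%d%H")}

props = remove_after_property(datetime(2024, 1, 1, 11, 30, tzinfo=timezone.utc))
assert props == {"RemoveAfter": "2024010112"}
```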
Contributors: @ericvergnaud
v0.2.1
- Moved remaining UCX integration tests and fixtures (#45). In this release, we have moved the remaining UCX integration tests and fixtures into the project. An `Installation` section in the README.md file provides instructions on how to add `databricks-labs-pytester` as a test-time dependency when using `hatch` as the build system. Additionally, we have added the `make_feature_table` fixture, which creates a Databricks feature table and cleans it up after the test, taking optional parameters for customization. We have also modified the `mypy` configuration in the `pyproject.toml` file to allow untyped imports during type checking. In the `compute.py` file, the `make_job` fixture now returns a function that creates a `databricks.sdk.service.jobs.Job` instance, and the `create` function returns the `databricks.sdk.service.jobs.Job` instance directly. The new `make_feature_table` fixture in the plugin file simulates the lifecycle of a feature table in the machine-learning service, with functions to generate a unique name and to create and remove the feature table. In the `test_catalog.py` file, we have cleaned up the file and ensured proper logging of test events and errors. Overall, these changes refactor and expand functionality, improving logging and debugging capabilities for adopters of the project.
- [internal] Port over existing UCX integration tests (#44). Three new integration tests have been added to verify the `RemoveAfter` property for tables and schemas. The `test_remove_after_property_table` and `test_remove_after_property_schema` tests create new tables and schemas, respectively, and check that the `RemoveAfter` property is included in their properties. However, these tests are still marked as `TODO` due to existing issues with the `tables.get` and `schemas.get` functions. In addition, existing UCX integration tests have been ported over, including new functions for testing the removal of resources based on the `RemoveAfter` tag. These tests are located in the `tests/integration/fixtures/test_compute.py` file and cover the removal of various resource types, including jobs, clusters, warehouses, and instance pools. The tests ensure that the time until purge is less than the `TEST_RESOURCE_PURGE_TIMEOUT` value plus one hour, and import the `datetime` module and the `TEST_RESOURCE_PURGE_TIMEOUT` constant from the `watchdog` fixture module, as well as the `logging` and `databricks.sdk.service.iam` modules.
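The "time until purge is less than the timeout plus one hour" check can be sketched as follows; the timeout value and the timestamp format are assumptions made for this illustration:

```python
from datetime import datetime, timedelta, timezone

TEST_RESOURCE_PURGE_TIMEOUT = timedelta(hours=1)  # assumed value

def purge_is_within_window(remove_after, now):
    """Check that the resource's purge time is no further away than
    TEST_RESOURCE_PURGE_TIMEOUT plus a one-hour grace period."""
    purge_at = datetime.strptime(remove_after, "%Y%m%d%H").replace(tzinfo=timezone.utc)
    return purge_at - now < TEST_RESOURCE_PURGE_TIMEOUT + timedelta(hours=1)

now = datetime(2024, 1, 1, 11, 0, tzinfo=timezone.utc)
assert purge_is_within_window("2024010112", now)      # 1h away: within window
assert not purge_is_within_window("2024010114", now)  # 3h away: too far
```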
Contributors: @nfx
v0.2.0
- Added `acc` and `make_acc_group` fixtures (#42). In this release, we have added two new fixtures, `acc` and `make_acc_group`, to the open-source library. The `acc` fixture provides a Databricks `AccountClient` object for use in tests; it can interact with the Databricks account API and automatically determines the account host from the `DATABRICKS_HOST` environment variable. The `make_acc_group` fixture is used for managing Databricks account groups, creating them with specified members and roles and automatically deleting them after the test completes. This fixture mirrors the behavior of the `make_group` fixture but interacts with the account client instead of the workspace client. These fixtures enable more comprehensive integration tests for the `acc` object and its various methods, enhancing the testing and management of Databricks account groups.
Contributors: @nfx
v0.1.1
- Fixed nightly CI builds (#40). In this release, we have removed the `no-cheat` GitHub Actions workflow that checked for disabled pylint directives in new code. We have also updated the pytest requirement to `~8.3.3` and added badges for Python version support and lines of code to the README file. The `permissions.py` file in the `databricks/labs/pytester/fixtures` directory has been updated to fix nightly CI builds by improving import statements and updating types: the `SqlPermissionLevel` class is now imported from the `databricks.sdk.service.sql` module, and an existing test case has been updated to use this permission level for SQL-specific queries. Additionally, we have relaxed the version constraints for three dependencies in the `pyproject.toml` file to allow more flexibility in selecting compatible library versions. These changes simplify the project's GitHub Actions workflows, reduce maintenance overhead, and enhance the testing process and code quality.
Contributors: @nfx
v0.1.0
- Added Databricks Connect fixture. A new fixture named `spark` has been added to the codebase, providing a Databricks Connect Spark session for testing purposes. The fixture requires the `databricks-connect` package to be installed and takes a `WorkspaceClient` object as an argument. It first checks whether a `cluster_id` is present in the environment; if not, it skips the test with a message. The fixture then ensures that the cluster is running and attempts to import the `DatabricksSession` class from the `databricks.connect` module; if the import fails, it skips the test with a message. This new fixture makes testing Databricks Connect functionality easier, reducing the boilerplate code required to set up a Spark session within tests. Additionally, a new `is_in_debug` fixture has been added, although no further documentation or usage examples are provided for it.
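The gating logic of the `spark` fixture can be sketched as below. This is an approximation of the behavior described above, not the fixture's actual code: `skip` stands in for `pytest.skip`, and `DATABRICKS_CLUSTER_ID` is an assumed environment-variable name.

```python
import os

def connect_session_or_skip(skip):
    """Sketch: skip when no cluster is configured or when
    databricks-connect is not installed; otherwise build a session."""
    cluster_id = os.environ.get("DATABRICKS_CLUSTER_ID")
    if not cluster_id:
        skip("no cluster_id in the environment; skipping Databricks Connect test")
        return None
    try:
        from databricks.connect import DatabricksSession
    except ImportError:
        skip("databricks-connect is not installed")
        return None
    # ...ensure the cluster is running, then build and return the session...
    return DatabricksSession.builder.getOrCreate()
```

In a real test suite, `pytest.skip` raises, so the `return None` lines never execute; they are kept only so the sketch also works with a non-raising `skip` callback.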
- Added `make_*_permissions` fixtures. In this release, we have added new fixtures to the pytester plugin for managing permissions in Databricks: `make_alert_permissions`, `make_authorization_permissions`, `make_cluster_permissions`, `make_cluster_policy_permissions`, `make_dashboard_permissions`, `make_directory_permissions`, `make_instance_pool_permissions`, `make_job_permissions`, `make_notebook_permissions`, `make_pipeline_permissions`, `make_query_permissions`, `make_registered_model_permissions`, `make_repository_permissions`, `make_serving_endpoint_permissions`, `make_warehouse_permissions`, `make_workspace_file_permissions`, and `make_workspace_file_path_permissions`. These fixtures simplify testing functionality that requires managing permissions for various Databricks resources such as alerts, authorization, clusters, cluster policies, dashboards, directories, instance pools, jobs, notebooks, pipelines, queries, registered models, repositories, serving endpoints, warehouses, and workspace files. Additionally, a new `make_notebook_permissions` fixture has been introduced in the `test_permissions.py` file for integration tests, allowing more comprehensive testing of the IAM system's behavior when handling notebook permissions.
- Added `make_catalog` fixture. A new fixture, `make_catalog`, has been added to the codebase to facilitate testing with specific catalogs, ensuring isolation and reproducibility. This fixture creates a catalog, returns its information, and removes the catalog after the test is complete. It can be used in conjunction with other fixtures such as `ws`, `sql_backend`, and `make_random`. The fixture is used in the updated `test_catalog_fixture` integration test function, which now includes new arguments `make_catalog`, `make_schema`, and `make_table`. These fixtures create catalog, schema, and table objects, enabling more comprehensive testing of the catalog, schema, and table creation functionality. Please note that catalogs created using this fixture are not currently protected from deletion by the watchdog.
- Added `make_catalog`, `make_schema`, and `make_table` fixtures (#33). In this release, we have updated the `databricks-labs-blueprint` package dependency to `databricks-labs-lsql~=0.10` and added several fixtures to improve the reliability and maintainability of the test suite. Three new fixtures, `make_catalog`, `make_schema`, and `make_table`, create and manage test catalogs, schemas, and tables, respectively. These fixtures enable the creation of arbitrary test data and simplify testing by allowing predictable and consistent setup and teardown of test data for integration tests. Additionally, we have added several debugging fixtures, including `debug_env_name`, `debug_env`, `env_or_skip`, and `sql_backend`, to aid in testing Databricks features related to SQL, environments, and more. The `make_udf` fixture has also been added for testing user-defined functions in Databricks. These new fixtures help verify the project's functionality and make the tests more maintainable and easier to understand.
- Added `make_cluster` documentation. The `make_cluster` fixture has been updated with new functionality and improvements. It now creates a Databricks cluster with specified configurations, waits for it to start, and cleans it up after the test, returning a function to create clusters. The `cluster_id` attribute is accessible from the returned object. The fixture accepts several keyword arguments: `single_node` to create a single-node cluster, `cluster_name` to specify a cluster name, `spark_version` to set the Spark version, and `autotermination_minutes` to determine when the cluster should be automatically terminated. The `ws` and `make_random` parameters have been removed. The commit also introduces a new test function, `test_cluster`, that creates a single-node cluster and outputs a message indicating the creation. Documentation for the `make_cluster` function has been added, and the `make_cluster_policy` function remains unchanged.
- Added `make_experiment` fixture. In this release, we introduce the `make_experiment` fixture in the `databricks.labs.pytester.fixtures.ml` module, facilitating the creation and cleanup of Databricks Experiments for testing purposes. This fixture accepts optional `path` and `experiment_name` parameters and returns a `databricks.sdk.service.ml.CreateExperimentResponse` object. Additionally, `make_experiment_permissions` has been added for managing experiment permissions. In the `permissions.py` file, the `_make_permissions_factory` function replaces the previous `_make_redash_permissions_factory`, enhancing the code's maintainability and extensibility. Furthermore, a `make_experiment` fixture has been added to the `plugin.py` file for creating experiments with custom names and descriptions. Lastly, a `test_experiments` function has been included in the `tests/integration/fixtures` directory, using the `make_group`, `make_experiment`, and `make_experiment_permissions` fixtures to create experiments and assign group permissions.
- Added `make_instance_pool` documentation. In this release, the `make_instance_pool` fixture has been updated with added documentation, and the usage example has been slightly modified. The fixture now accepts optional keyword arguments for the instance pool name and node type ID, with default values for each. The `make_random` fixture is still required for generating unique names. Additionally, the `log_workspace_link` function has been updated to accept a new `anchor` parameter controlling the inclusion of an anchor (`#`) in the generated URL. New test functions `test_instance_pool` and `test_cluster_policy` have been added to enhance the integration testing of the compute system, providing more comprehensive coverage for instance pools and cluster policies. Lastly, three test functions, `test_cluster`, `test_instance_pool`, and `test_job`, have been removed, although their setup functions are retained, indicating a possible streamlining of the codebase.
- Added `make_job` documentation. The `make_job` fixture has been updated with additional arguments and improved documentation. It now accepts `notebook_path`, `name`, `spark_conf`, and `libraries` as optional keyword arguments, and any additional arguments are passed through to the `WorkspaceClient.jobs.create` method. If no `notebook_path` or `tasks` argument is provided, a random notebook is created and a single notebook task is run using the latest Spark version and a single-worker cluster. The fixture has been improved to manage Databricks jobs and clean them up after testing. Additionally, documentation has been added for the `make_job` function and the `test_job` function in the test fixtures file. The `test_job` function, which created a job and logged its creation, has been removed, while the `test_cluster` and `test_pipeline` functions remain unchanged. The `os` module is no longer imported in this file.
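The defaulting behavior described for `make_job` can be sketched as a plain settings-builder. The helper name, task fields, and notebook path below are illustrative assumptions, not the fixture's real internals:

```python
def make_disposable_notebook():
    # Placeholder for the fixture step that creates a random notebook.
    return "/Workspace/Users/tests/dummy-notebook"

def job_settings(notebook_path=None, tasks=None, **kwargs):
    """If neither notebook_path nor tasks is given, fall back to a
    freshly created notebook run as a single notebook task on a
    single-worker cluster (mirroring the defaulting described above)."""
    if notebook_path is None and tasks is None:
        notebook_path = make_disposable_notebook()
    if tasks is None:
        tasks = [{
            "task_key": "main",
            "notebook_task": {"notebook_path": notebook_path},
            "new_cluster": {"num_workers": 1},  # single-worker cluster
        }]
    # Everything else is passed straight through, as with jobs.create.
    return {"tasks": tasks, **kwargs}

settings = job_settings(name="dummy-job")
assert settings["tasks"][0]["notebook_task"]["notebook_path"].endswith("dummy-notebook")
```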
- Added `make_model` fixture. A new pytest fixture, `make_model`, has been added to the codebase for the open-source library. This fixture facilitates the creation and automatic cleanup of Databricks Models during tests, returning a `GetModelResponse` object. The optional `model_name` parameter allows for customization, with a default value of `dummy-*`. The `make_model` fixture can be used in conjunction with other fixtures such as `ws`, `make_random`, and `make_registered_model_permissions`, streamlining the testing of model-related functionality. Additionally, a new test function, `test_models`, has been introduced, using the `make_model`, `make_group`, and `make_registered_model_permissions` fixtures to test model management within the system. This new feature makes it easier to create, configure, and manage models and related resources during test execution.
- Added `make_pipeline` fixture. A new fixture named `make_pipeline` has been added to the project, which facilitates the creation and cleanup of a Delta Live Tables pipeline after testing. This fixture is added to the `compute.py` file and takes optional keyword arguments such as `name`, `libraries`, and `clusters`. If these arguments are not provided, it generates a random name, creates a disposable notebook with random libraries, and creates a single-node cluster with 16GB memory and local disk. The fixture returns a function to create pipelines, resulting in a `CreatePipelineResponse` instance. Additionally, a new integration test has been added to exercise this fixture, logging information about the created pipeline for debugging and inspection purposes. T...