
Conversation

@jsquaredosquared jsquaredosquared commented Dec 25, 2025

feat: add ability to use full compile and fit APIs

Reference Issues/PRs

Partially addresses this issue.

What does this implement/fix? Explain your changes.

For purposes such as validation, it is helpful to be able to use the full Keras compile and fit APIs from the deep learning estimators in aeon.

This pull request modifies the constructors and the invocations of the compile and fit methods so that all arguments from the Keras API can be used.

Does your contribution introduce a new dependency? If yes, which one?

No new dependencies.

Any other comments?

Extending this to the clusterers and forecasters would require some additional work.

PR checklist

For all contributions
  • I've added myself to the list of contributors. Alternatively, you can use the @all-contributors bot to do this for you after the PR has been merged.
  • The PR title starts with either [ENH], [MNT], [DOC], [BUG], [REF], [DEP] or [GOV] indicating whether the PR topic is related to enhancement, maintenance, documentation, bugs, refactoring, deprecation or governance.
For new estimators and functions
  • I've added the estimator/function to the online API documentation.
  • (OPTIONAL) I've added myself as a __maintainer__ at the top of relevant files and want to be contacted regarding its maintenance. Unmaintained files may be removed. This is for the full file, and you should not add yourself if you are just making minor changes or do not want to help maintain its contents.
For developers with write access
  • (OPTIONAL) I've updated aeon's CODEOWNERS to receive notifications about future changes to these files.

feat: add ability to use full compile and fit APIs (resnet classifier)
@aeon-actions-bot aeon-actions-bot bot added classification Classification package enhancement New feature, improvement request or other non-bug code enhancement regression Regression package labels Dec 25, 2025
@aeon-actions-bot
Contributor

Thank you for contributing to aeon

I have added the following labels to this PR based on the title: [ enhancement ].
I have added the following labels to this PR based on the changes made: [ classification, regression ]. Feel free to change these if they do not properly represent the PR.

The Checks tab will show the status of our automated tests. You can click on individual test runs in the tab or "Details" in the panel below to see more information if there is a failure.

If our pre-commit code quality check fails, any trivial fixes will automatically be pushed to your PR unless it is a draft.

Don't hesitate to ask questions on the aeon Slack channel if you have any.

PR CI actions

These checkboxes will add labels to enable/disable CI functionality for this PR. This may not take effect immediately, and a new commit may be required to run the new configuration.

  • Run pre-commit checks for all files
  • Run mypy typecheck tests
  • Run all pytest tests and configurations
  • Run all notebook example tests
  • Run numba-disabled codecov tests
  • Stop automatic pre-commit fixes (always disabled for drafts)
  • Disable numba cache loading
  • Regenerate expected results for testing
  • Push an empty commit to re-run CI checks

@jsquaredosquared jsquaredosquared changed the title [ENH]: extend dl classfiers/regressors to use full keras compile/fit API [ENH] extend dl classfiers/regressors to use full keras compile/fit API Dec 25, 2025
@satwiksps
Contributor

Hi @jsquaredosquared, great work on this!

I checked the CI logs, and the failures are due to two main issues:

  1. Mutation Error: scikit-learn forbids modifying parameters passed to __init__ (like self.compile_args) inside the fit method. The line self.compile_args = {} ... changes the object's state, which causes the check_fit_updates_state_and_cloning test to fail.
  2. TypeError: Since compile_args defaults to None, unpacking it via **self.compile_args raises a TypeError.
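The second failure can be reproduced in isolation with plain Python, without Keras at all. This is just a minimal sketch: `compile_model` is a hypothetical stand-in for `model.compile`, not an aeon or Keras function.

```python
def compile_model(**kwargs):
    """Hypothetical stand-in for model.compile; just collects keyword args."""
    return kwargs

compile_args = None

# Unpacking None with ** raises TypeError: argument after ** must be a mapping
try:
    compile_model(**compile_args)
    raised = False
except TypeError:
    raised = True
print(raised)  # True

# Safe pattern: fall back to an empty dict before unpacking
safe_args = compile_args if compile_args is not None else {}
print(compile_model(**safe_args))  # {}
```

The same fallback works for `fit_args`, which is why the helper below assigns `c_args` and `f_args` to local variables before unpacking.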

Instead of fixing this in every single subclass (which creates a lot of code duplication), I think it would be a good idea to move the logic into a helper method in BaseDeepClassifier and BaseDeepRegressor. This fixes the mutation bug by using local variables and keeps the codebase clean.

Proposed changes for aeon/classification/deep_learning/base.py (and similarly for Regressor):

1. Update __init__:

@abstractmethod
def __init__(self, ..., compile_args=None, fit_args=None):
   # ...
   self.compile_args = compile_args
   self.fit_args = fit_args
   # ...

2. Add the safe helper method:

def _fit_keras_model(self, model, X, y, batch_size, epochs, verbose, callbacks):
    # Use a local variable so self.compile_args is never mutated
    c_args = self.compile_args if self.compile_args is not None else {}
    # Validation for compile_args...
    # ...
    model.compile(..., **c_args)
    # Use a local variable for fit_args as well
    f_args = self.fit_args if self.fit_args is not None else {}
    # Validation for fit_args...
    # ...
    self.history = model.fit(
        X,
        y,
        batch_size=batch_size,
        epochs=epochs,
        verbose=verbose,
        callbacks=callbacks,
        **f_args,
    )
    return self.history

Then, you can simply call self._fit_keras_model(...) in the _fit method of the subclasses. This should clear up the CI errors!
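The state-preservation point can also be illustrated in isolation. The class names below are hypothetical and use no Keras; they only contrast rebinding an `__init__` parameter inside `fit` with the local-variable fix:

```python
class BadEstimator:
    def __init__(self, compile_args=None):
        self.compile_args = compile_args

    def fit(self):
        # Anti-pattern: rebinding an __init__ parameter mutates estimator state,
        # which scikit-learn style checks will flag
        if self.compile_args is None:
            self.compile_args = {}
        return self

class GoodEstimator:
    def __init__(self, compile_args=None):
        self.compile_args = compile_args

    def fit(self):
        # Fix: a local variable; self.compile_args stays exactly as passed
        c_args = self.compile_args if self.compile_args is not None else {}
        _ = c_args  # would be forwarded to model.compile(**c_args)
        return self

bad = BadEstimator().fit()
good = GoodEstimator().fit()
print(bad.compile_args)   # {} -- state changed, the check fails
print(good.compile_args)  # None -- unchanged, the check passes
```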

@jsquaredosquared
Author

jsquaredosquared commented Dec 25, 2025

Thanks for taking a look.

There is a lot of duplication in the model building and fitting. Perhaps a refactor is in order. For now I will just copy and paste because a full-on refactor seems daunting and might need a PR of its own 🥲 .

@jsquaredosquared jsquaredosquared changed the title [ENH] extend dl classfiers/regressors to use full keras compile/fit API [ENH] extend dl classifiers/regressors to use full keras compile/fit API Dec 25, 2025