merge_ebm produces broken classifiers #576

Open
@DerWeh

Description

The result of merge_ebms is not a valid classifier, as attributes are missing. Its repr raises an AttributeError, which makes it horrible to debug.

The following is a minimal reproducible example:

from sklearn.datasets import load_iris
from interpret.glassbox import ExplainableBoostingClassifier, merge_ebms

X, y = load_iris(return_X_y=True)

clf1 = ExplainableBoostingClassifier(interactions=0, outer_bags=2)
clf1.fit(X, y)
clf2 = ExplainableBoostingClassifier(interactions=0, outer_bags=2)
clf2.fit(X, y)
clf = merge_ebms([clf1, clf2])
repr(clf)

Which results in the error AttributeError: 'ExplainableBoostingClassifier' object has no attribute 'cyclic_progress'.

A hotfix is simply copying the attributes of the first classifier in the list:

# Copy any hyperparameter missing on the merged classifier from the first one
for attr, val in clf1.get_params(deep=False).items():
    if not hasattr(clf, attr):
        setattr(clf, attr, val)

The question is what the desired strategy for fixing this error should be. (I haven't delved into merge_ebms to know what's going on.) Another option would be to set the parameters on which all classifiers agree and set all other parameters to None. This avoids the immediate AttributeError but might cause problems later, in case None is not a valid parameter value.
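For illustration, the consensus strategy could look something like the sketch below. The helper `merge_agreed_params` is hypothetical (not part of interpret); plain dicts stand in for the output of `get_params(deep=False)`:

```python
def merge_agreed_params(param_dicts):
    """Keep a parameter's value only if all classifiers agree on it;
    otherwise fall back to None."""
    merged = {}
    for key in param_dicts[0]:
        values = [d.get(key) for d in param_dicts]
        if all(v == values[0] for v in values[1:]):
            merged[key] = values[0]
        else:
            merged[key] = None  # disagreement: no obvious single value
    return merged


# Two classifiers agreeing on outer_bags but not on interactions:
params = merge_agreed_params(
    [
        {"interactions": 0, "outer_bags": 2},
        {"interactions": 5, "outer_bags": 2},
    ]
)
print(params)  # {'interactions': None, 'outer_bags': 2}
```

The merged dict could then be applied to the result of merge_ebms via `clf.set_params(**params)`, though as noted above, None may not be accepted for every parameter.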
