MetaLearners and Classification #1016

@MattDBailey

Description

By default, the MetaLearners call .predict() within .effect() (through const_marginal_effect()). This is appropriate for regressors, but for classification, shouldn't it be using .predict_proba() to compute the taus? Otherwise the taus can only take values in {-1, 0, 1}, since .predict() applies a base threshold of 0.5.

def const_marginal_effect(self, X):
    """Calculate the constant marignal treatment effect on a vector of features for each sample.

    Parameters
    ----------
    X : matrix, shape (m × d_x)
        Matrix of features for each sample.

    Returns
    -------
    τ_hat : matrix, shape (m, d_y, d_t)
        Constant marginal CATE of each treatment on each outcome for each sample X[i].
        Note that when Y is a vector rather than a 2-dimensional array,
        the corresponding singleton dimensions in the output will be collapsed
    """
    # Check inputs
    if 'X' in self._gen_allowed_missing_vars():
        force_all_finite = 'allow-nan'
    else:
        force_all_finite = False
    X = check_array(X, force_all_finite=force_all_finite)
    taus = []
    for ind in range(self._d_t[0]):
        taus.append(self.models[ind + 1].predict(X) - self.models[0].predict(X))
    taus = np.column_stack(taus).reshape((-1,) + self._d_t + self._d_y)  # shape (m, d_t, d_y)
    if self._d_y:
        taus = transpose(taus, (0, 2, 1))  # shape (m, d_y, d_t)
    return taus
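For illustration, here is a minimal sketch of the proposed fix: difference the positive-class probabilities from `predict_proba` when the fitted base learners are classifiers, falling back to `predict` for regressors. The helper name `pointwise_effect` and the dummy classifier are hypothetical, not part of the EconML API.

```python
import numpy as np

def pointwise_effect(model_t, model_0, X):
    """Per-sample effect estimate between a treated and a baseline model.

    For classifiers, difference P(y=1 | X) from predict_proba; hard
    .predict() labels would restrict the effects to {-1, 0, 1}.
    """
    if hasattr(model_t, "predict_proba") and hasattr(model_0, "predict_proba"):
        return model_t.predict_proba(X)[:, 1] - model_0.predict_proba(X)[:, 1]
    # Regressor (or classifier without predict_proba): use point predictions.
    return model_t.predict(X) - model_0.predict(X)

# Toy classifier that always predicts the positive class with probability p,
# used only to demonstrate the difference between the two behaviors.
class ConstantClassifier:
    def __init__(self, p):
        self.p = p
    def predict_proba(self, X):
        pos = np.full(len(X), self.p)
        return np.column_stack([1 - pos, pos])
    def predict(self, X):
        return (np.full(len(X), self.p) >= 0.5).astype(int)

X = np.zeros((3, 2))
# Probability-based effect is 0.7 - 0.6 = 0.1 per sample, whereas hard
# labels would give 1 - 1 = 0 everywhere, masking the effect entirely.
effect = pointwise_effect(ConstantClassifier(0.7), ConstantClassifier(0.6), X)
```

This mirrors the loop in `const_marginal_effect` above: each `self.models[ind + 1].predict(X) - self.models[0].predict(X)` term would become a probability difference when the outcome model is a classifier.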
