By default the MetaLearners call .predict() within .effect() (through const_marginal_effect()). This is appropriate for regressors, but for classification, shouldn't it be using .predict_proba() to compute the taus? Otherwise the taus can only take values in {-1, 0, 1}, since .predict() applies a fixed 0.5 threshold.
def const_marginal_effect(self, X):
"""Calculate the constant marignal treatment effect on a vector of features for each sample.
Parameters
----------
X : matrix, shape (m × d_x)
Matrix of features for each sample.
Returns
-------
τ_hat : matrix, shape (m, d_y, d_t)
Constant marginal CATE of each treatment on each outcome for each sample X[i].
Note that when Y is a vector rather than a 2-dimensional array,
the corresponding singleton dimensions in the output will be collapsed
"""
# Check inputs
if 'X' in self._gen_allowed_missing_vars():
force_all_finite = 'allow-nan'
else:
force_all_finite = False
X = check_array(X, force_all_finite=force_all_finite)
taus = []
for ind in range(self._d_t[0]):
taus.append(self.models[ind + 1].predict(X) - self.models[0].predict(X))
taus = np.column_stack(taus).reshape((-1,) + self._d_t + self._d_y)  # shape (m, d_t, d_y)
if self._d_y:
    taus = transpose(taus, (0, 2, 1))  # shape (m, d_y, d_t)
return taus
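To make the difference concrete, here is a minimal, self-contained sketch (not the library's implementation; the classifiers and data are made up for illustration) contrasting the taus produced by `.predict()` with those produced by `.predict_proba()` for two classification base learners:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic data: binary outcomes under control (y0) and treatment (y1)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y0 = (X[:, 0] + rng.normal(size=100) > 0).astype(int)
y1 = (X[:, 0] + 0.5 + rng.normal(size=100) > 0).astype(int)

model0 = LogisticRegression().fit(X, y0)  # stands in for self.models[0]
model1 = LogisticRegression().fit(X, y1)  # stands in for self.models[1]

# Using .predict(): each model returns a hard 0/1 label (0.5 threshold),
# so the per-sample tau is restricted to {-1, 0, 1}.
taus_hard = model1.predict(X) - model0.predict(X)

# Using .predict_proba(): the tau is a continuous difference in the
# predicted probability of the positive class, in [-1, 1].
taus_soft = model1.predict_proba(X)[:, 1] - model0.predict_proba(X)[:, 1]
```

With the hard predictions, averaging taus over samples loses all information about effects smaller than the thresholding granularity, which is the behavior the issue points out.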