In binary classification, sufficient dimension reduction (SDR) often suffers from the low resolution of binary responses. For example, sliced inverse regression can estimate at most one basis vector of the central subspace. In this article, a new class of SDR algorithms for binary classification is proposed based on weighted learning. To this end, we establish that the gradient of the decision function is unbiased for SDR whenever the loss function of the classifier is Fisher consistent. This naturally leads to a working matrix whose leading eigenvectors estimate a basis of the central subspace for the binary response. The performance of the proposed method is evaluated on both simulated and real data examples.
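To illustrate the idea, the following is a minimal sketch (not the paper's weighted-learning algorithm) of a gradient-based working matrix: fit a classifier with a Fisher-consistent loss (here, the hinge loss of a kernel SVM via scikit-learn, an assumed dependency), numerically differentiate its decision function at each sample point, average the outer products of the gradients, and take the leading eigenvectors as the estimated basis of the central subspace. All specific choices (the single-index data model, the RBF kernel, the finite-difference step) are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, p = 500, 5

# Illustrative single-index model: the central subspace is spanned by beta.
beta = np.zeros(p)
beta[0] = 1.0
X = rng.standard_normal((n, p))
prob = 1.0 / (1.0 + np.exp(-2.0 * X @ beta))
y = (rng.random(n) < prob).astype(int)

# A classifier whose loss (hinge) is Fisher consistent.
clf = SVC(kernel="rbf", C=1.0).fit(X, y)

# Numerical gradients of the decision function at each sample point.
eps = 1e-4
grads = np.empty((n, p))
for j in range(p):
    e = np.zeros(p)
    e[j] = eps
    grads[:, j] = (clf.decision_function(X + e) -
                   clf.decision_function(X - e)) / (2 * eps)

# Working matrix: average outer product of the gradients.
M = grads.T @ grads / n

# Its leading eigenvectors estimate a basis of the central subspace.
eigvals, eigvecs = np.linalg.eigh(M)
b_hat = eigvecs[:, -1]

# Alignment with the true direction (up to sign), both unit vectors.
alignment = abs(b_hat @ beta)
```

In this toy example the central subspace is one-dimensional, so only the top eigenvector is retained; in general, the number of retained eigenvectors would match the structural dimension of the central subspace.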