Self-normalization has been celebrated for its ability to avoid direct estimation of the nuisance asymptotic variance and for its versatility in handling the mean and other quantities. However, the self-normalizer in its conventional form uses recursive estimators in only one direction, and thus may exhibit a certain degree of asymmetry in time. We consider a novel approach to generalizing the conventional self-normalizer so that it involves recursive estimators in both directions. For the generalized class of self-normalizers, we explore its time-symmetric subspace and a data-driven weight choice that corresponds to confidence intervals of minimal length. We study the asymptotic behavior of this data-driven weight choice through the von Mises expansion with Gâteaux derivatives, and find an interesting dichotomy between linear and nonlinear quantities. This is joint work with Liliya Lavitas.
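To fix ideas, here is a minimal sketch for the mean case, assuming the conventional self-normalizer built from forward recursive means (cf. Lobato 2001; Shao 2010); the two-directional form below, its weight $w$, and the backward means $\tilde{X}_t$ are illustrative notation of ours and need not match the talk's exact construction. For observations $X_1, \dots, X_n$ with mean $\mu$ and forward recursive means $\bar{X}_t = t^{-1} \sum_{i=1}^{t} X_i$, the conventional self-normalized statistic is

% Conventional (one-directional) self-normalizer; cf. Lobato (2001), Shao (2010).
\[
T_n = \frac{\sqrt{n}\,(\bar{X}_n - \mu)}{\sqrt{V_n}},
\qquad
V_n = \frac{1}{n^2} \sum_{t=1}^{n} t^2 \bigl(\bar{X}_t - \bar{X}_n\bigr)^2 .
\]

A two-directional generalization could also draw on the backward recursive means $\tilde{X}_t = (n - t + 1)^{-1} \sum_{i=t}^{n} X_i$, combined through a weight $w \in [0, 1]$:

% Hedged sketch: the weight w and the backward means \tilde{X}_t are our
% illustrative notation, not necessarily the talk's formulation.
\[
V_n(w) = \frac{1}{n^2} \sum_{t=1}^{n}
\Bigl\{ w\, t^2 \bigl(\bar{X}_t - \bar{X}_n\bigr)^2
+ (1 - w)\,(n - t + 1)^2 \bigl(\tilde{X}_t - \bar{X}_n\bigr)^2 \Bigr\} .
\]

In this sketch, $w = 1$ recovers the conventional self-normalizer, $w = 1/2$ gives one natural time-symmetric choice, and the data-driven weight mentioned above would be the $w$ whose resulting confidence interval has minimal length.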