Abstract:
|
Greedy search (GS), or exhaustive search, plays a critical role in recursive partitioning and its various extensions. We examine an alternative method, termed smooth sigmoid surrogate (SSS), which approximates the indicator threshold function involved in recursive partitioning with a smooth sigmoid function. In many scenarios, the discrete greedy search for the best cutoff point can then be reformulated as a one-dimensional smooth optimization problem. The proposed method dramatically reduces the computational cost relative to GS. Moreover, it is more effective in detecting weaker signals and less prone to the inherent end-cut preference problem. In addition, SSS helps address the variable selection bias problem by appealing to a parametric nonlinear model. Extensive simulation studies and real data examples are provided to evaluate the method and illustrate its usage.
|
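
The core idea described above, replacing the hard indicator I(x > c) with a sigmoid so the split criterion becomes a smooth function of the cutoff c, can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the simulated data, the sigmoid scale parameter `s`, and the use of `scipy.optimize.minimize_scalar` are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simulated regression data with a true change point at x = 0.6 (assumed setup).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = (x > 0.6).astype(float) + rng.normal(0.0, 0.3, 200)

def sss_loss(c, s=0.05):
    """Smoothed within-node sum of squares for cutoff c.

    The sigmoid weight w approximates the indicator I(x > c); s controls
    how sharply it approaches the step function.
    """
    w = 1.0 / (1.0 + np.exp(-(x - c) / s))        # soft membership in right node
    n_r = w.sum() + 1e-12
    n_l = (1.0 - w).sum() + 1e-12
    mu_r = (w * y).sum() / n_r                     # weighted node means
    mu_l = ((1.0 - w) * y).sum() / n_l
    return (w * (y - mu_r) ** 2 + (1.0 - w) * (y - mu_l) ** 2).sum()

# One-dimensional smooth optimization over the cutoff, in place of
# the discrete greedy search over all observed split points.
res = minimize_scalar(sss_loss, bounds=(x.min(), x.max()), method="bounded")
best_cutoff = res.x
```

With the setup above, the recovered `best_cutoff` lands near the true change point at 0.6, while requiring only a handful of smooth-objective evaluations rather than a pass over every candidate split.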