Keywords: iterated filtering, pseudo-proximal map, SMC, inference, network analysis
In the big data era, we are often faced with the challenges of high-dimensional data and complex models, for which the likelihood is either intractable or prohibitively expensive to compute. As a result, simulation-based inference has drawn much attention, since it is currently the only viable approach to many real-world problems. Iterated filtering [10, 9] enables simulation-based inference via model perturbations and gradient approximation through sequential Monte Carlo filtering. Using iterated filtering as an approximation of the forward step of the proximal gradient method, Guo maximizes the likelihood function by iterating the pseudo-proximal map. In this paper, we improve on this idea by accelerating the process with an additional momentum term. We show that, under a suitable perturbation policy, the proposed framework converges at an optimal rate for both convex and non-convex likelihood functions. We demonstrate the efficiency of the algorithm on a toy model and on a challenging model of a biological network, showing substantial improvement over standard approaches.
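To make the iteration scheme concrete, the following is a minimal sketch of iterating a proximal map with a Nesterov-style momentum term. The paper's pseudo-proximal map is approximated by iterated filtering via SMC; as a stand-in, this sketch uses the exact proximal map of a hypothetical quadratic negative log-likelihood f(θ) = ½‖Aθ − b‖², purely for illustration. The names `prox_step` and `accelerated_pseudo_prox` are illustrative, not from the paper.

```python
import numpy as np

def prox_step(theta, lam, A, b):
    """Exact proximal map of the toy quadratic objective:
    argmin_x 0.5*||A x - b||^2 + (1/(2*lam)) * ||x - theta||^2.
    (In the paper this map would be approximated by iterated filtering.)
    """
    n = theta.size
    M = A.T @ A + np.eye(n) / lam
    return np.linalg.solve(M, A.T @ b + theta / lam)

def accelerated_pseudo_prox(theta0, lam, A, b, n_iter=50):
    """Iterate the proximal map, accelerated by a momentum term
    (Nesterov/FISTA-style extrapolation between successive iterates)."""
    theta_prev = theta0.copy()
    y = theta0.copy()          # extrapolated point fed to the proximal map
    t = 1.0                    # momentum weight sequence
    for _ in range(n_iter):
        theta = prox_step(y, lam, A, b)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = theta + ((t - 1.0) / t_next) * (theta - theta_prev)
        theta_prev, t = theta, t_next
    return theta_prev

# Synthetic problem: recover theta_star from noiseless observations.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
theta_star = rng.standard_normal(5)
b = A @ theta_star
est = accelerated_pseudo_prox(np.zeros(5), lam=1.0, A=A, b=b)
print(np.allclose(est, theta_star, atol=1e-4))
```

The momentum (extrapolation) step is what distinguishes this from plainly iterating the proximal map, and is the source of the accelerated rate claimed in the abstract for the convex case.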