Doubly intractable distributions arise in many settings, for example in Markov models for point processes and exponential random graph models for networks. Bayesian inference for these models is challenging because they involve intractable normalizing "constants" that are actually functions of the parameters of interest. Although several computational methods have been developed for these models, each suffers from issues that make it computationally burdensome or even infeasible for many problems. We propose a novel algorithm that provides computational gains over existing methods by replacing Monte Carlo approximations to the normalizing function with a Gaussian process-based approximation, and we provide theoretical justification for this approach. For the class of models to which it applies, our algorithm shows dramatic gains in computational efficiency over existing methods, running between 6 and 120 times faster in our examples. We illustrate our methods on simulated data and on two real data examples: an exponential random graph model and a Markov point process.
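The core idea can be sketched as follows. This is a minimal toy illustration (not the paper's implementation): instead of recomputing a Monte Carlo estimate of the intractable log normalizing function at every parameter value visited by an MCMC sampler, one fits a Gaussian process to a small set of precomputed evaluations and interpolates. Here `true_log_z` is a hypothetical smooth stand-in for an expensive importance-sampling estimate of log Z(theta).

```python
import numpy as np

def true_log_z(theta):
    # Hypothetical stand-in for an intractable log normalizing function
    # that would normally require a costly Monte Carlo estimate.
    return np.log1p(theta**2) + 0.5 * theta

def rbf_kernel(a, b, length=1.0, scale=1.0):
    # Squared-exponential covariance between parameter values a and b.
    d = a[:, None] - b[None, :]
    return scale * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(0)

# A small design of parameter values with noisy "Monte Carlo" estimates.
theta_train = np.linspace(-2.0, 2.0, 15)
y_train = true_log_z(theta_train) + rng.normal(0.0, 0.01, size=15)

# GP regression posterior mean: cheap interpolation of log Z at new
# parameter values, replacing repeated Monte Carlo approximation.
K = rbf_kernel(theta_train, theta_train) + 1e-4 * np.eye(15)
theta_new = np.linspace(-1.9, 1.9, 50)
K_star = rbf_kernel(theta_new, theta_train)
log_z_hat = K_star @ np.linalg.solve(K, y_train)

err = float(np.max(np.abs(log_z_hat - true_log_z(theta_new))))
print("max interpolation error:", err)
```

The interpolated surface is then queried inside the sampler at negligible cost, which is where the reported speedups come from; kernel choice and design-point placement here are illustrative assumptions only.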