In recent years, there has been an explosion of object data that lie on curved sample spaces. While statistical research on directional and low-dimensional shape data began with parametric procedures, the development of nonparametric theory on manifolds largely displaced parametric methods. Recently, though, researchers have increasingly utilized probability distributions to (a) perform inference when nonparametric procedures are not suitable, (b) develop more sophisticated Bayesian approaches than were previously available, and (c) carry out probabilistic learning procedures. Despite this growing reliance on distributional models, no general methods have existed for checking the distributional assumptions these procedures require, because classical goodness-of-fit tests rely primarily on cumulative distribution functions, which are not well-defined on many object data sample spaces. In this talk, we present methods that utilize the geometry of the sample space to assess goodness-of-fit for object data using nearest neighbor graphs and the concept of energy distance. We compare these approaches through both theoretical results and simulation studies, and we illustrate their use on various forms of object data.
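To make the energy-distance idea concrete, the following is a minimal sketch (not the speakers' implementation) of a Monte Carlo goodness-of-fit test for data on the unit sphere: the observed sample is compared to draws from the hypothesized model via an energy statistic built on geodesic (great-circle) distance, and a p-value is obtained by recomputing the statistic on repeated model-vs-model draws. All function names here (`energy_stat`, `energy_gof_pvalue`, etc.) are hypothetical, and the spherical setting is just one example of an object data space.

```python
import numpy as np

def geodesic_dist(X, Y):
    """Pairwise great-circle distances between rows of X and Y (unit vectors)."""
    G = np.clip(X @ Y.T, -1.0, 1.0)  # clip guards arccos against rounding
    return np.arccos(G)

def energy_stat(X, Y):
    """V-statistic form of the energy distance: 2 E d(X,Y) - E d(X,X') - E d(Y,Y')."""
    return (2.0 * geodesic_dist(X, Y).mean()
            - geodesic_dist(X, X).mean()
            - geodesic_dist(Y, Y).mean())

def sample_uniform_sphere(n, d, rng):
    """Draw n points uniformly on the unit sphere in R^d (normalized Gaussians)."""
    Z = rng.standard_normal((n, d))
    return Z / np.linalg.norm(Z, axis=1, keepdims=True)

def energy_gof_pvalue(X, sampler, n_boot=200, seed=0):
    """Monte Carlo p-value: compare the observed statistic to model-vs-model draws.

    `sampler(n, rng)` must return n draws from the hypothesized distribution.
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    obs = energy_stat(X, sampler(n, rng))
    null = np.array([energy_stat(sampler(n, rng), sampler(n, rng))
                     for _ in range(n_boot)])
    # add-one correction keeps the p-value strictly positive
    return (1 + np.sum(null >= obs)) / (1 + n_boot)
```

In this sketch a uniform sample tested against the uniform model yields a large p-value, while a sample concentrated near one pole is rejected; swapping in another metric and model sampler adapts the test to other sample spaces.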