Abstract:
A new statistical challenge of the 21st century is how to deal coherently and effectively with inference using massive models, often informed by massive data. These models are frequently themselves networks of smaller component models, each describing the stochastic development within a different domain. For example, nuclear emergency response countermeasures are currently informed by probabilistic models of the nuclear plant from which a radiation leak may take place, a dispersal module describing how the radiation might spread geographically, and so on, finally leading to models describing the potential human health risks from exposure along these pathways. Assuming the user of this composite system is an expected utility maximizer, it is possible to focus inference only on the variables that appear in the utility function. This dramatically reduces the dimension of the space over which inference takes place. If, in addition, we are prepared to make further assumptions about the algebraic form of the user's utility function, then we can show that inference need only process the outputs of certain moments from the various modules, making such decision support feasible, transparent and fast.
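The moment-propagation idea can be illustrated with a minimal sketch. Everything below is hypothetical: the module names, the linear transfer forms, and all numbers are invented for illustration, not taken from any actual emergency-response system. The one genuine point it demonstrates is that for a quadratic utility, the expected utility depends on the chained modules only through the first two moments of their outputs.

```python
# Hypothetical chain of component modules, each passing only the
# first two moments (mean, variance) of its output downstream.

def plant_module():
    # Hypothetical source term: moments of the radiation released.
    return 5.0, 1.0  # (mean, variance)

def dispersal_module(mean_in, var_in):
    # Hypothetical linear transfer: exposure = k * release + noise.
    k, noise_var = 0.3, 0.05
    return k * mean_in, k**2 * var_in + noise_var

def health_module(mean_in, var_in):
    # Hypothetical linear dose-response: risk = r * exposure.
    r = 0.1
    return r * mean_in, r**2 * var_in

def expected_utility(mean, var, a=1.0, b=-2.0, c=-0.5):
    # For a quadratic utility U(y) = a + b*y + c*y^2,
    # E[U(Y)] = a + b*E[Y] + c*(Var(Y) + E[Y]^2),
    # so only the first two moments of Y are ever needed.
    return a + b * mean + c * (var + mean**2)

# Propagate moments through the network of modules, then evaluate.
m, v = plant_module()
m, v = dispersal_module(m, v)
m, v = health_module(m, v)
print(expected_utility(m, v))
```

No full joint distribution over the high-dimensional internal states of the modules is ever computed: each module summarises its output by two numbers, which is what makes the composite inference fast and transparent.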
Copyright © American Statistical Association.