Abstract #301657

This is the preliminary program for the 2003 Joint Statistical Meetings in San Francisco, California. Currently included are the "technical" program (the schedule of invited, topic-contributed, regular contributed, and poster sessions); Continuing Education courses (August 2-5, 2003); and committee and business meetings. This online program will be updated frequently to reflect the most current revisions.


The views expressed here are those of the individual authors
and not necessarily those of the ASA or its board, officers, or staff.





JSM 2003 Abstract #301657
Activity Number: 294
Type: Contributed
Date/Time: Tuesday, August 5, 2003, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Computing
Title: Neural Networks and Global Optimization
Author(s): Wade Brorsen*+ and Lonnie Hamm
Companies: Oklahoma State University and Oklahoma State University
Address: Dept. of Agricultural Economics, Stillwater, OK 74078-6026
Keywords: evolutionary algorithms; neural networks; stochastic global optimization; simulated annealing
Abstract:

Training a neural network is a difficult optimization problem because the objective function has numerous local minima. As an alternative to local search algorithms, many global search algorithms have been used to train neural networks. However, local search algorithms use computational resources more efficiently, so numerous random restarts of a local algorithm may be more effective than a global algorithm at obtaining a low value of the objective function. This study uses Monte Carlo simulations to compare the efficiency of a local search algorithm against nine stochastic global algorithms: two simulated annealing algorithms, one simple random stochastic algorithm, one genetic algorithm, and five evolutionary strategy algorithms. The computational requirements of the global algorithms are several times higher than those of the local algorithm, and there is little or no gain in using the global algorithms to train neural networks.
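The comparison the abstract describes can be illustrated with a minimal sketch: random restarts of a plain gradient-descent local search versus a simple simulated annealing algorithm, both minimizing a toy multimodal objective that stands in for a neural-network training loss. This is not the authors' code or their actual experiment; the objective function, learning rate, and cooling schedule are illustrative assumptions.

```python
import math
import random

def loss(w):
    # Toy multimodal objective with many local minima (a stand-in for a
    # neural-network training loss; global minimum 0 at w = [0, ..., 0]).
    return sum(x**2 + 2 * math.sin(5 * x)**2 for x in w)

def local_search(w, steps=300, lr=0.01, h=1e-5):
    # Plain gradient descent with forward-difference numerical gradients.
    w = list(w)
    for _ in range(steps):
        g = []
        for i in range(len(w)):
            wp = list(w)
            wp[i] += h
            g.append((loss(wp) - loss(w)) / h)
        w = [wi - lr * gi for wi, gi in zip(w, g)]
    return w

def random_restarts(n_restarts=20, dim=3, seed=0):
    # Run the local search from many random starting points and keep the
    # best result -- the strategy the abstract weighs against global search.
    rng = random.Random(seed)
    best, best_val = None, float("inf")
    for _ in range(n_restarts):
        w0 = [rng.uniform(-3, 3) for _ in range(dim)]
        w = local_search(w0)
        v = loss(w)
        if v < best_val:
            best, best_val = w, v
    return best, best_val

def simulated_annealing(dim=3, iters=5000, seed=0):
    # Basic simulated annealing with Gaussian proposals and a 1/t cooling
    # schedule (one of many possible global-search baselines).
    rng = random.Random(seed)
    w = [rng.uniform(-3, 3) for _ in range(dim)]
    v = loss(w)
    best, best_val = list(w), v
    for t in range(1, iters + 1):
        temp = 1.0 / t
        cand = [wi + rng.gauss(0, 0.2) for wi in w]
        cv = loss(cand)
        # Accept improvements always; accept uphill moves with a
        # probability that shrinks as the temperature falls.
        if cv < v or rng.random() < math.exp(-(cv - v) / temp):
            w, v = cand, cv
            if v < best_val:
                best, best_val = list(w), v
    return best, best_val
```

Counting loss evaluations in a sketch like this makes the abstract's point concrete: each restart costs only a few hundred gradient steps, so many restarts can fit in the budget a single long annealing run consumes.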


  • Authors who are presenting talks have a * after their name.
  • The address information is for the authors who have a + after their name.


For information, contact meetings@amstat.org or phone (703) 684-1221. If you have questions about the Continuing Education program, please contact the Education Department.
Revised March 2003