National University of Singapore
Journal of Statistics Education Volume 15, Number 2 (2007), http://ww2.amstat.org/publications/jse/v15n2/chu.html
Copyright © 2007 by Singfat Chu all rights reserved. This text may be freely shared among individuals, but it may not be republished in any medium without express written consent from the author and advance notification of the editor.
Key Words: ARIMA, Logistic Regression, Pedagogy
Lovallo and Kahneman (2003), the latter the 2002 Nobel laureate in Economics, highlighted the preponderance of over-optimistic forecasts made by executives, which they attribute to cognitive biases and to organizational pressures. They recommend that more realistic forecasts can and should be produced by supplementing “inside view” forecasts with “outside view”, or statistical, forecasts.
The marketplace demand for graduates with analytical skills provides further impetus for exposing students to the salient features of statistical forecasting models pertinent to the business world, namely the calibration of trends, seasonal indices and extraneous or “causal” effects. Many user-friendly software packages are available nowadays to facilitate such analyses. Trade publications, such as the MS/OR magazine, regularly report on the state of the art in forecasting packages (e.g. Yurkiewicz (2003)).
These market opportunities and software availability motivate us to equip our students, who will be future managers and entrepreneurs, with sound business forecasting skills. The purpose of this paper is to report on a dedicated business forecasting course that was added to my institution’s undergraduate curriculum in August 2002 and offered to some 60 students each semester.
The paper is organized as follows. Section 2 surveys the literature on forecasting education and the pedagogy recommended throughout the “Making Statistics More Effective in Schools of Business” (MSMESB) conferences. This paves the way for Section 3, which outlines the philosophy underlying the course design. Section 4 focuses on the initiatives that were implemented in the business forecasting course. The paper concludes with a discussion of the experiences of the instructor and the students in this course.
Such pedagogical shortcomings were among those that prompted the organization of a series of “Making Statistics More Effective in Schools of Business” (MSMESB) conferences beginning in 1986. Love and Hildebrand (2002) succinctly summarize the discussions at these conferences in a set of 10 best pedagogical practices:
In line with the first MSMESB recommendation “Practice What We Preach: Remember Our Customers”, I made it a point to focus on applications, case studies and projects pertinent to the business world. For instance, I once provided a consultancy dataset (transformed for confidentiality purposes) as part of a case study to forecast the spot price of a commodity using historical shipment and inventory statistics. The efficacy of cases and projects towards learning is well documented in the literature (e.g. Love (1998, 2000) and Roback (2003)). In a business environment, they integrate:
A thorough treatment of regression (including indicator variables, interaction, model selection, residual analysis, data transformation and interpretation of parameters) and time series models (e.g. moving average, exponential smoothing, ARIMA) constitutes the core of a forecasting course. My students had basic exposure to regression and smoothing concepts in their Statistics and Operations Management courses. Nevertheless, I started with a refresher on simple linear regression and simple exponential smoothing before extending the ideas to multiple regression and Winters’ exponential smoothing. ARIMA models would be new to them. In the next section, I provide details on how I motivated ARIMA models as well as a new addition to the syllabus, namely Logistic Regression. The latter would be a valuable forecasting tool for Marketing majors as it helps them investigate the factors that drive consumer choices.
Teamwork was the device used to implement the ensemble of MSMESB recommendations 3 to 10. Each semester, I split each of my 2 sections into 8 teams each comprising 3 – 4 students. Each team was responsible for (a) one topic presentation (b) 3 case studies assigned as homework and (c) a capstone project. These activities would address the business realism and communication shortcomings noted by Hanke (1994). The topic presentation was an innovation. It is described in detail in the next section.
The final 2 class meetings were dedicated to the presentation of the capstone projects. Each team was allocated 40 minutes for the presentation followed by a Q&A session. In each meeting, 3 teams were each required to highlight a different pillar of the course, namely regression, time series and logistic regression models, while the fourth and last team was free to showcase any forecasting tool. Teams selected their areas on a first-come-first-served basis 2 weeks before presentation. Constraining the projects to the 3 pillars of the course allowed the students to refresh ideas, clear up doubts in the Q&A sessions and thereby prepare effectively for the final exam. Each team either collected primary data or downloaded data from the Internet, organized and analyzed the business-related data, interpreted the findings and submitted recommendations to Management in simple, non-technical language.
Examples of project presentations include regression models to explain salaries in the Logistics industry, the prices of hotel rooms and airplane e-tickets; time series models to investigate trends and seasonalities in container throughput at the world’s busiest port (Singapore), water consumption, tourist arrivals, and postal deliveries (interestingly, the students have inferred that email has hardly dented the volume of snail mail); logistic regression models to explain the adoption of a service provider (e.g. credit card) or the choice of a particular brand (e.g. cellular phone, beverage etc.). The quality of the projects has risen from semester to semester as teams tried to do better than past projects that I posted on the course website.
The novelty in the team activities resided in the topic presentation. For this, each team searched for materials over the Internet and pertinent journals available in our digital library (Harvard Business Review, Interfaces, Journal of Business Forecasting, MS/OR etc.) and presented them for about 30 minutes at the beginning of each class. I chose the topics, allocated them randomly to the teams on Day 1 of the course and told them to discover and present the pertinent information without any assistance on my part. The topics covered subjective and quantitative forecasting tools and the latest industry practices e.g. Delphi Forecasting Method, Bass Diffusion of Innovations Model, Technical Analysis, Neural Networks, Collaborative Forecasting, etc.
This active learning exercise generated 3 benefits: shared class leadership, students being held accountable for their own learning, and diversity in the activities during the weekly 3-hour class. The teams knew that their presentations had to be original, as I had archived past presentations on my computer. Also, I declared the topics examinable on the final exam. Accordingly, the students researched their topics thoroughly. The non-presenting students also had an incentive to ask questions to clear their doubts. Teams instinctively prepared well when their grades depended substantially (I awarded 30% of the grade to Q&A performance) on how they answered a holistic range of questions from their peers and the instructor.
I insisted that all the team members share equally in the presentation with each speaking for about 5 minutes. This was ample time to gauge the level of participation of each speaker in the preparation and coordination of the presentation. In the better-organized teams, I noticed that the members tended to anticipate questions and would handle them collectively rather than on an individual basis.
As I had 2 sections and hence 2 presentations on the same topic, I posted both on the course webpage to further promote collaborative learning across the sections. Personally, the presentations allowed me to infer how students wanted materials to be presented to them. For example, I noticed that teams put in substantial effort to find local, timely or familiar applications to motivate the utility of a forecasting tool prior to detailing its concepts. They rehearsed their presentations and managed to fit them into the allocated time, something that lecturers do not always succeed at.
Demonstrating that models already discussed in the course were actually special cases of the ARIMA model also helped in its motivation. For instance, Hanke and Wichern (2005, p 431, Q7) supply a dataset and a guided exercise to help the students infer that a linear trend model is actually a special case of the ARIMA (0,1,1) model. This follows from differencing the linear trend model Y(t) = β0 + β1·t + ε(t):

Y(t) − Y(t−1) = β1 + ε(t) − ε(t−1),

which has the ARIMA (0,1,1) form with constant term β1 and moving-average coefficient θ = 1.
Hanke and Wichern (2005, p 424) further demonstrate that another special case of the ARIMA (0,1,1) model is the simple exponential smoothing model. Students may be further tasked to establish how the quadratic trend model or Winters’ Exponential Smoothing Model relates to higher order ARIMA models.
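The linear-trend connection can be checked numerically: differencing Y(t) = β0 + β1·t + ε(t) yields β1 + ε(t) − ε(t−1), the ARIMA (0,1,1) structure with constant β1 and θ = 1. A minimal sketch (the parameter values are illustrative, not from the exercise's dataset):

```python
import random

# Simulate a linear trend plus noise and verify that its first differences
# equal b1 + e_t - e_{t-1}, i.e. an ARIMA(0,1,1) structure with theta = 1.
# b0, b1 and the noise scale are illustrative values.
random.seed(42)
b0, b1, n = 10.0, 2.5, 200
e = [random.gauss(0, 1) for _ in range(n)]
y = [b0 + b1 * t + e[t] for t in range(n)]

diffs = [y[t] - y[t - 1] for t in range(1, n)]
# Each difference should equal b1 + e_t - e_{t-1} up to rounding error.
check = all(abs(diffs[t - 1] - (b1 + e[t] - e[t - 1])) < 1e-9
            for t in range(1, n))
```

Students can run a similar check on the guided-exercise dataset before fitting the two models with their software package.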
Going through the above eased the introduction of a challenging forecasting tool and made the students realize the utility of ARIMA models in situations beyond first-order serial dependence. I was particularly delighted when two teams confidently and satisfactorily used ARIMA models for their capstone projects in the first semester of academic year 2003-4.
I intentionally carried out a linear regression on binary response data to discuss its shortcomings and thereby motivate the need for an S-curve logistic model. I mentioned the concepts underlying maximum likelihood estimation without getting into the details. More importantly, I stressed the interpretation of the odds (i.e. the ratio of success to failure probabilities) and log odds output by computer software. Students were reminded to select their explanatory variables carefully because logistic regression models can also be afflicted by such problems as multicollinearity and serial correlation.
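The shortcoming of the linear fit can be shown in a few lines. The sketch below uses hypothetical data (not from the course): a least-squares line fitted to a binary response extrapolates outside [0, 1], whereas the logistic curve p(x) = 1 / (1 + exp(−(a + b·x))) always returns a valid probability, and the odds have the interpretation stressed in class:

```python
import math

# Hypothetical binary outcome (e.g. "adopted the service") against a predictor.
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [0, 0, 0, 0, 1, 1, 1, 1]

# Ordinary least squares slope and intercept for the linear probability model.
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
     / sum((xi - xbar) ** 2 for xi in x))
a = ybar - b * xbar

linear_pred_at_10 = a + b * 10                           # exceeds 1: not a probability
logistic_pred_at_10 = 1 / (1 + math.exp(-(a + b * 10)))  # stays inside (0, 1)

# Odds and log odds for a probability p, as interpreted in class.
p = 0.8
odds = p / (1 - p)        # success is four times as likely as failure
log_odds = math.log(odds)
```

(For simplicity the logistic curve above reuses the OLS coefficients; in practice a and b would come from maximum likelihood estimation, as noted in the text.)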
I found many timely forecasting applications in the media and on the Internet. For example, my students appreciated newspaper reports on the purchase of weather forecasts towards such activities as inventory planning at Wal-Mart and the trading of commodity futures. Prior to the Athens 2004 Olympics, several forecasts of medal tallies appeared on the Internet (e.g. search “Regression Olympic Medal” on Google). The “fun” regression models posted there provided interesting material for class discussion. Another prediction that interested my students concerned the Nov 2004 US presidential elections.
When the SARS (Severe Acute Respiratory Syndrome) outbreak affected many Asian countries in 2003, the developers of a credit-card-sized frontal digital thermometer, which could display frontal temperatures within 15 seconds (compared with 1 to 2 minutes for analogue thermometers), used linear regression to benchmark its measurements against standards based on oral temperatures.
Consultancy reports on 3G mobile phone adoption and real estate valuation offered the students interesting windows into professional forecasting practices.
My relatively small sections facilitated the pedagogy. Larger classes would present harder-to-resolve challenges given limited time, manpower and resources. For instance, students expect timely feedback, say 1 week after a case submission. As the teams approached the problems from different angles or used different portions of the dataset, I had to understand their rationale, provide constructive feedback on what they did well and where they went wrong, and benchmark their efforts and results transparently relative to their peers.
Since its first offering in 2002, the course has attracted an average rating between 4.2 and 4.4 out of 5 on the students’ evaluation of 7 equally weighted aspects of pedagogy. The ratings pertain to the enhancement of their thinking skills, the timeliness and usefulness of feedback, the approachability of the instructor, the impetus given to their research, the increase in subject interest, the degree of business realism and the integration of the subject within the entire curriculum. Tellingly, the average course ratings have been about 10%-15% higher than the averages for the other business courses offered at the same level. Furthermore, 40% or more of the students nominated the course for innovative teaching each semester. These outcomes contributed in part to the author receiving the 2004 Faculty Outstanding Educator Award.
The Certificate of Entitlement (COE) scheme started in August 1990. It uses a uniform price mechanism to allocate rights to put new vehicles on the road. It works as follows: suppose the policy makers decide that 1000 new vehicles can be put on the road each month. The public is invited to bid for these rights. At the close of the auction, the 1000th highest bid is deemed the clearing price. This means that the top 1000 bidders are each allocated a right to vehicular ownership, and they all pay a uniform price equal to the last (or 1000th) successful bid.
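The uniform-price rule described above can be sketched in a few lines. The bids and the quota below are hypothetical values for illustration, not actual COE data:

```python
def coe_clearing_price(bids, quota):
    """Uniform-price auction: the quota-th highest bid sets the clearing
    price, and every successful bidder pays that same price."""
    ranked = sorted(bids, reverse=True)
    winners = ranked[:quota]
    clearing_price = winners[-1]  # the last (quota-th) successful bid
    return winners, clearing_price

# Seven hypothetical bids competing for a quota of 5 rights.
bids = [52000, 48000, 51000, 47000, 50000, 49500, 46000]
winners, price = coe_clearing_price(bids, quota=5)
# The top 5 bidders win; all pay the 5th highest bid of 48000.
```

Note that under this rule most winners pay less than they bid, which is one reason the mechanism generates interesting bidding behaviour to model.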
From August 1990 to June 2001, the COE auction was held once a month and it employed a closed-bid format (i.e. bids were secret and only disclosed at the close of the auction). Between July 2001 and March 2002, the auction was held twice a month alternating between closed-bid and open-bid formats. Under the open-bid format, a bidder could check the current clearing price and change his/her own bid on a dedicated website during the auction window which spanned 3 days. From April 2003 onwards, the biweekly auction has followed the open-bid format exclusively.
In the dataset (COE.xls), I provide data on the current and lagged clearing prices, the number of bidders and the quota of rights for each auction. I also provide a dummy variable to indicate closed or open bid formats. The dataset can be employed to illustrate simple as well as more intricate forecasting models. For example, students can consider the suitability of the Winters Exponential Smoothing model or the ARIMA model for the exclusively closed or open bid sub-periods (i.e. August 1990-June 2001 or from April 2003 onwards).
Figure 1: COE clearing prices from August 1990 to Jan 2007
***: June 2001 to Mar 2002: Alternating biweekly closed and open bid formats
Figure 2: # Bids and Quota for each COE auction
Analyzing the entire time series is more intricate due to the change in auction format from closed to open bid. A suitable approach is to employ a regression model with explanatory variables such as preceding price, number of bidders, quota available, an auction format dummy and interaction terms to infer changes in bidding behaviour (i.e. are the regression coefficients for number of bidders and quota the same for closed and open bid auction formats?).
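The regression just described hinges on how the dummy and interaction columns are constructed. A minimal sketch of one row of the design matrix, with illustrative variable names and values (not from COE.xls):

```python
def design_row(lag_price, bidders, quota, open_bid):
    """One row of the design matrix for regressing the clearing price on
    the preceding price, demand, supply, an auction-format dummy and
    dummy interactions that test for a change in bidding behaviour."""
    d = 1 if open_bid else 0
    return [
        1,              # intercept
        lag_price,      # preceding clearing price
        bidders,        # number of bids (demand)
        quota,          # rights available (supply)
        d,              # auction-format dummy (1 = open bid)
        d * lag_price,  # interaction: does price sensitivity change?
        d * bidders,    # interaction: does demand sensitivity change?
        d * quota,      # interaction: does supply sensitivity change?
    ]

closed = design_row(30000, 1200, 900, open_bid=False)
opened = design_row(30000, 1200, 900, open_bid=True)
# Under the closed-bid format the dummy and interaction columns are zero,
# so the base coefficients apply; unchanged behaviour across formats means
# the interaction coefficients are jointly indistinguishable from zero.
```

Testing the interaction coefficients jointly (e.g. with a partial F-test) then answers the question posed in the text.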
Residual analysis reveals 2 problems, namely (1) fanning-out of the residuals (heteroscedasticity) and (2) first-order autocorrelation. Students will have to be guided through remedial actions such as log transformations and the Cochrane-Orcutt procedure. Interestingly, the conclusion from this analysis is that the change in auction format from closed to open bid did not affect the sensitivity of the COE price to the preceding price, the number of bids or the quota. These findings provide rich discussion material for the students as they integrate concepts such as price, supply and demand, and differences in auction mechanisms that they may have come across in Economics or Strategy courses.
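The quasi-differencing step at the heart of the Cochrane-Orcutt remedy for first-order autocorrelation is simple to show. A minimal sketch with hypothetical numbers (in practice rho is estimated from the OLS residuals, and every regressor is transformed the same way before refitting):

```python
def cochrane_orcutt_transform(series, rho):
    """Quasi-difference a series: z*_t = z_t - rho * z_{t-1}.
    Applied to the response and each regressor, this removes AR(1)
    structure from the errors so OLS can be refitted validly."""
    return [series[t] - rho * series[t - 1] for t in range(1, len(series))]

y = [10.0, 12.0, 13.5, 15.0, 17.0]   # illustrative response values
rho = 0.6                            # would normally be estimated, not assumed
y_star = cochrane_orcutt_transform(y, rho)
```

Iterating (re-estimating rho from the new residuals and re-transforming) until rho stabilizes gives the full Cochrane-Orcutt procedure.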
Albright, S. C., Winston, W. L. and Zappe, C. J. (2002), “Managerial Statistics”, Duxbury, California, USA.
Duran, J. A., and Flores, B. E. (1998), “Forecasting Practices in Mexican Companies,” Interfaces, 28, 56-62.
Hanke, J. E. (1984), “Forecasting in Business Schools: a Survey,” Journal of Forecasting, 3, 229-234.
Hanke, J. E. (1989), “Forecasting in Business Schools: a Follow-Up Survey,” International Journal of Forecasting, 5, 259-262.
Hanke, J. E., and Weigand, P. (1994), “What are Business Schools Doing to Educate Forecasters?,” Journal of Business Forecasting Methods & Systems, 13(3), 10-12.
Hanke, J. E., and Wichern, D. W. (2005), “Business Forecasting”, 8th edition, Pearson Prentice-Hall, New Jersey.
Hays, J. M. (2003), “Forecasting Computer Usage,” Journal of Statistics Education [Online], 11(1). ww2.amstat.org/publications/jse/v11n1/datasets.hays.html
Klassen, R. D., and Flores, B. E. (2001), “Forecasting Practices of Canadian Firms: Survey Results and Comparisons,” International Journal of Production Economics, 70, 163-174.
Lovallo, D., and Kahneman, D. (2003), “Delusions of Success: How Optimism Undermines Executives’ Decisions,” Harvard Business Review, 81(7), 56-63.
Love, T. E. (1998), “A Project-Driven Second Course,” Journal of Statistics Education [Online], 6(1). ww2.amstat.org/publications/jse/v6n1/love.html
Love, T. E. (2000), “A Different Approach to Project Assessment,” Journal of Statistics Education [Online], 8(1). ww2.amstat.org/publications/jse/secure/v8n1/love.cfm
Love, T. E., and Hildebrand, D. K. (2002), “Statistics Education and the Making of Statistics More Effective in Schools of Business Conferences”, The American Statistician, 56(2), 107-112.
Mady, M. T. (2000), “Sales Forecasting Practices of Egyptian Public Enterprises: Survey Evidence,” International Journal of Forecasting, 16, 359-368.
Roback, P. J. (2003), “Teaching an Advanced Methods Course to a Mixed Audience,” Journal of Statistics Education [Online], 11(2). ww2.amstat.org/publications/jse/v11n2/roback.html
Sanders, N. R., and Manrodt, K. B. (1994), “Forecasting Practices in United States Corporations – Survey Results,” Interfaces, 24, 92-100.
Sanders, N. R., and Manrodt, K. B. (2003a), “The Efficacy of Using Judgmental versus Quantitative Forecasting Methods in Practice,” OMEGA, 31, 511-522.
Sanders, N. R., and Manrodt, K. B. (2003b), “Forecasting Software in Practice: Use, Satisfaction and Performance,” Interfaces, 33(5), 90-93.
Winklhofer, H., and Diamantopoulos, A. (2002), “A Comparison of Export Sales Forecasting Practices among UK firms,” Industrial Marketing Management, 31, 479-490.
Yurkiewicz, J. (2003) “Forecasting Software Survey: Predicting Which Product is Right for You,” MS/OR Today, http://www.lionhrtpub.com/orms/orms-2-03/frsurvey.html
NUS Business School