The Effect of a Student-Designed Data Collection Project on Attitudes Toward Statistics

Lisa J. Carnell
High Point University

Journal of Statistics Education Volume 16, Number 1 (2008), jse.amstat.org/v16n1/carnell.html

Copyright © 2008 by Lisa J. Carnell, all rights reserved. This text may be freely shared among individuals, but it may not be republished in any medium without express written consent from the author and advance notification of the editor.

Key Words: Attitudinal scale; Statistics education; Project-based learning; Comparative study.

Abstract

Students often enter an introductory statistics class with less than positive attitudes about the subject. They tend to believe statistics is difficult and irrelevant to their lives. Observational evidence from previous studies suggests that including projects in a statistics course may enhance students’ attitudes toward statistics. This study examines the relationship between inclusion of a student-designed data collection project in an introductory statistics course and six components of students’ attitudes toward statistics. The sample consisted of 42 college students enrolled in an introductory statistics course. Comparisons of those who completed the student-designed data collection project (n = 24) and those who did not complete the project (n = 18) suggest that inclusion of a project may not significantly impact students’ attitudes toward statistics. However, these findings must be viewed as only a preliminary step in the study of the effect of projects on attitudes toward statistics.

1. Introduction

For many students, the prospect of taking an introductory statistics class is daunting. Kirk (2002) reported that students believe an introductory statistics course to be demanding, to involve lots of math, and to be irrelevant to their career goals. Many students who complete an introductory statistics course have negative perceptions of the course and are dissatisfied with the experience (Garfield 1997). In addition, many nonstatistics majors in algebra-based introductory statistics courses suffer from statistics anxiety (Bradstreet 1996). These attitudes are problematic given that some colleges and universities now use an introductory statistics course as a primary way to meet the general education requirement for mathematics.

What are the desired outcomes in an introductory statistics course? Garfield, Hogg, Schau, and Whittinghill (2002) reported that these outcomes fell into three categories oriented around students’ learning, students’ persistence, and students’ attitudes and beliefs. While learning outcomes are the ones most generally considered, "the other outcomes … are also important to consider as they will greatly affect whether or not our students are able to appropriately use statistical skills, ideas, and techniques. Therefore, our courses should attempt to build strong positive attitudes toward statistics and reinforce students’ use of statistics in the real world to increase their chances of using statistics after they leave our courses" (Examining Student Outcomes, ¶ 1).

Snee (1993) advocated changes in the instructional delivery system for statistics education, citing a widespread lack of understanding of, and consequently a lack of appreciation for, statistical thinking. Snee suggested that experiential learning, which includes working with real data and having students work on subject matter in which they take a personal interest, is vital to improving students’ attitudes toward statistics.

Hogg (1991) argued that introductory statistics courses do not place enough emphasis on teamwork and that students should generate their own data for analysis. He stated that having students participate in the entire statistical process, from choosing the question and generating the data, through the analysis, to communicating the findings, was beneficial to students.

Others have also reported the efficacy of using projects to enhance attitude and achievement in statistics. Garfield (1993) reported that using cooperative learning activities to help students learn may enhance motivation and improve student attitudes toward the subject being studied. Potthast (1999) reported that using a series of small-group cooperative learning experiences increased students’ scores on tests as compared to a group not using the cooperative learning format. Smith (1998) reported that using team projects in an introductory statistics class produced survey information from students suggesting positive student attitudes and an increased sense of relevance for the course.

Overall, the literature suggests that using projects of some type in an introductory statistics class may positively influence learning of statistics (Shaughnessy 1977; Garfield 1995; Bradstreet 1996; Moore 1997). However, the projects that students in these studies were asked to do were often designed, at least in part, by the instructor; usually, the instructor chose at least the topic for the project. Smith (1998) did allow students, with prior instructor approval, to modify or replace a given project. However, he reported that students seldom asked to make these changes.

In addition, much of the literature which discusses student attitudes toward statistics when an activity-based approach is used provides information about students’ attitudes toward statistics based on student survey comments rather than evidence of attitude change based on experimental results (Smith 1998; Kirk 2002). Potthast (1999) did include information from the subscales of the Survey of Attitudes Toward Statistics, SATS© (Schau, Stevens, Dauphinee, and Del Vecchio 1995), in comparing her two groups, but the purpose of using the scale was to explain the difference in attitudes toward statistics between the two groups prior to the implementation of cooperative learning activities rather than to look at differences in attitude at the conclusion of the cooperative learning activities. Harlow, Burkholder, and Morrow (2002) used empirical methods to determine the effect of multiple learning enhancements, including an applied project, on attitudes in a quantitative methods course. However, they reported that their results were limited by the use of only a single survey item to measure the effect of the applied project, which compromised the reliability of the measure. In addition, it was not entirely clear who chose the project topics, only that the topics were of interest to the students.

The aim of this research was to determine, using empirical methods, whether students’ attitudes toward statistics were improved if the students in an introductory statistics class designed and implemented their own projects based on topics of personal interest, as is suggested by the literature. In addition, if that positive effect were found, this study was designed to provide a first step in understanding which components of attitude toward statistics might be positively impacted by student-designed projects.

A quantitative study of the effect of a student-designed data collection project on students’ attitudes toward statistics was performed using the instrument Survey of Attitudes Toward Statistics – 36© (SATS-36), an update of the Survey of Attitudes Toward Statistics – 28, formerly known as the SATS (Schau et al. 1995; Schau 2003b), to measure attitudes systematically. The results of this study may provide a better understanding of influences on student attitudes toward statistics and may enhance the teaching and learning of statistics.

2. Method

2.1 Participants

Students in two sections of a one-semester introductory statistics course at a four-year liberal arts university in the southeast United States were the subjects in this study. The treatment group, which completed the project, contained 24 students: 14 females and 10 males. The control group, which did not complete the project, contained 18 students: 6 females and 12 males. Each section was composed of a mixture of freshmen, sophomores, juniors, and seniors representing a variety of majors. Table 1 provides a summary of study participants by gender and class year. The students self-selected into the two sections with no prior knowledge of whether or not a project would be included. Over the course of the semester, each section met for two 75-minute class meetings per week for 14 weeks. The treatment group met at 9:30 in the morning on Tuesdays and Thursdays. The control group met at 2:00 in the afternoon on Mondays and Wednesdays. Each section was taught by the same instructor.


Table 1. Summary of Study Participants By Gender and Class Year.

Class Year              Project              No Project
                      Male   Female        Male   Female
Freshman                1      0             2      0
Sophomore               4      9             1      2
Junior                  5      2             4      3
Senior                  0      3             5      1


2.2 Instrument

The Survey of Attitudes Toward Statistics - 36© (Schau 2003b) is an instrument designed to assess the dimensions composing attitudes toward statistics: value, difficulty, interest, affect, cognitive competence, and effort. The survey consists of six subscales (see Table 2) addressing these components of students’ attitudes toward statistics. Respondents rate their level of agreement with each statement on a 7-point Likert-type scale (strongly disagree: 1, neither agree nor disagree: 4, strongly agree: 7). The score for each subscale is determined by adding the item values for that subscale and dividing by the number of items in it. Some items are written in the affirmative and some in the negative; negatively worded items are reverse-coded before subscale scores are computed so that a higher response always indicates a more positive attitude.
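As an illustration of this scoring scheme, the following sketch (in Python) shows how one might compute a subscale score by reverse-coding negatively worded items and averaging. The item numbers and the negative-item set used here are hypothetical placeholders, not the published SATS-36© scoring key.

# A sketch of the subscale scoring described above. The item-to-subscale
# assignment and the set of negatively worded items below are hypothetical
# placeholders; the actual SATS-36 scoring key is available from the
# instrument's author.

def reverse_code(response, scale_max=7):
    """Reverse a 1..scale_max response (e.g., 2 becomes 6 on a 7-point scale)."""
    return scale_max + 1 - response

def subscale_score(responses, item_ids, negative_items):
    """Average the items of one subscale, reverse-coding negative items.

    responses: dict mapping item id -> raw 1-7 response
    item_ids: ids of the items belonging to this subscale
    negative_items: set of item ids that are negatively worded
    """
    values = [reverse_code(responses[i]) if i in negative_items else responses[i]
              for i in item_ids]
    return sum(values) / len(values)

# Example: a hypothetical 4-item subscale in which item 3 is negatively worded.
raw = {1: 5, 2: 6, 3: 2, 4: 4}
print(subscale_score(raw, item_ids=[1, 2, 3, 4], negative_items={3}))  # 5.25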

The Survey of Attitudes Toward Statistics - 36©, SATS-36, is a recent extension of an earlier version of the instrument, which contained four subscales (value, difficulty, affect, and cognitive competence). Schau et al. (1995) described the development and validation of the original instrument. Subsequent validation research (Hilton, Schau, and Olsen 2004) indicated the four-factor structure adequately described responses to the original SATS instrument. In SATS-36, two additional subscales were added: interest and effort. Tempelaar, Van Der Loeff, and Gijselaers (2007) have used confirmatory factor analysis to determine that the six-factor model fits the data well and that the two new subscales, interest and effort, are valuable additions to the original instrument.


Table 2. Survey of Attitudes Toward Statistics - 36© Subscales

Subscale               No. of Items   Sample Items
Value                        9        Statistical skills will make me more employable.
Difficulty                   7        Statistics formulas are easy to understand.
Interest                     4        I am interested in understanding statistical information.
Affect                       6        I like statistics.
Cognitive Competence         6        I can learn statistics.
Effort                       4        I worked hard in my statistics course.

Note. Sample items are from Survey of Attitudes Toward Statistics – 36 by Candace Schau.
Available from CS Consultants, LLC website, www.evaluationandstatistics.com.
Copyright 2003 by Candace Schau. Reprinted with permission.


2.3 Procedure

The optimum experimental design would randomly assign students to each class section. In practice, however, educational researchers typically have only already existing classes available for experimentation. Therefore, a quasi-experimental design was used. A pre- and posttest design was used to test for differences in attitudes after imposition of the treatment.

To determine if the students in the two sections of introductory statistics were similar with respect to attitudes toward statistics prior to the imposition of the treatment, on the first day of class, each student in each section completed the Survey of Attitudes Toward Statistics - 36© (Schau 2003b).

The survey was administered by an assistant to the instructor who then coded the data so that the responses of individual students would not be known to the instructor. After completion of the survey, the course requirements were discussed in each section of the class. The instructor was the same for both sections and made every effort to provide the same instruction, activities, assignments, and assessments to all students as part of the quasi-experimental design in order to keep the two groups’ classroom environments as alike as possible. However, one section completed a statistics project (the treatment) and the other section did not. Students did not know in advance that the two sections would have different requirements, and students in the two sections were not blocked from interacting with each other.

The project consisted of formulating a research question of interest to the student, designing and implementing a data collection strategy to answer the question, analyzing the collected data using techniques learned in the course, writing a report describing the findings, and making a brief oral report to the class on the results of the project. Students were allowed to work alone or in self-selected project teams of up to four members. To ensure that everyone on a project team participated, students self-reported on their own contributions to the project and on the contributions of the other group members. The grade for the project counted as a test grade in the course. Students had approximately six weeks to complete the project.

The project progressed in two phases. In phase I, the students formed their project teams (if applicable), and each team decided on a question of interest. Each team then made a proposal to the instructor in which they explained the significance of their question, how they intended to gather data to answer it, and what possible problems they anticipated in completing the project. Appendix A contains the proposal form the students filled out. These proposals were evaluated by the instructor and discussed with the students. No data collection could begin until the proposal had been approved by the instructor. In phase II, the students collected and analyzed data to answer their questions, wrote reports based on guidelines given by the instructor, and made oral presentations to the class. Appendix B contains the instructions students were given for writing their reports. The instructor provided support and answered questions as needed throughout the process.

On the last day of class, each student in both sections once again completed the Survey of Attitudes Toward Statistics - 36©. The survey was administered by an assistant to the instructor who then coded the data so that the responses of individual students would not be known to the instructor.

3. Results

Table 3 shows means, standard deviations, and t-statistics for testing whether the two groups differed initially on any of the six subscales of the pre-treatment administration of SATS-36©. There were no statistically significant differences (p > 0.05) in mean responses for the two groups with respect to the six subscales. The two groups appeared to be sufficiently similar in attitudes toward statistics as measured by these subscales prior to the imposition of the treatment to allow the study to continue.

Cronbach’s alpha was calculated for the six components of the attitudinal scale. Table 4 compares the values from this study to values reported by Schau (2003a; personal communication, March 12, 2007) and Tempelaar et al. (2007). These alpha levels indicate good internal consistencies within attitude components and thus adequate internal reliability of the instrument. The consistency of these alpha levels with those reported in the literature also indicates external reliability.
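For readers who wish to reproduce this kind of reliability check on their own data, a minimal sketch of the standard Cronbach’s alpha formula is given below; it is a generic textbook computation, not code from this study, and it assumes the items of one subscale are stored as a respondents-by-items NumPy array with no missing responses. The demonstration values are invented.

# A generic Cronbach's alpha computation (textbook formula, not code from the
# study). Assumes the items belonging to one subscale are stored as a
# respondents-by-items NumPy array with no missing responses.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = items of one subscale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Demonstration with invented responses for a 4-item subscale:
demo = np.array([[5, 6, 5, 4],
                 [3, 3, 4, 3],
                 [6, 7, 6, 6],
                 [4, 5, 4, 5]])
print(round(cronbach_alpha(demo), 2))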


Table 3. Mean Pre-treatment Responses on Attitude Subscales by Group

                           Project(a)           No Project(b)
Subscale                   M       SD           M       SD          t       p(c)
Value                      4.97    0.79         4.40    1.32        1.69    0.104
Difficulty                 3.84    0.44         3.34    0.96        2.04    0.053
Interest                   4.58    0.97         4.11    1.78        1.02    0.319
Affect                     4.38    1.09         4.11    1.34        0.70    0.486
Cognitive Competence       5.19    0.83         4.84    1.51        0.89    0.381
Effort                     5.84    1.02         5.64    1.20        0.60    0.554

Note: Responses were made using a 7-point scale (1 = strongly disagree,
4 = neither agree nor disagree, 7 = strongly agree).

(a) n = 24
(b) n = 18
(c) Two-tailed p-value for t-test of equality of means


Table 4. Comparison of Cronbach’s α by Subscale

Subscale                This Study    Schau, Tempelaar
Value                      0.88         0.74 - 0.90
Difficulty                 0.79         0.64 - 0.81
Interest                   0.90         0.80 - 0.92
Affect                     0.81         0.80 - 0.89
Cognitive Competence       0.85         0.77 - 0.88
Effort                     0.79         0.76 - 0.91


Table 5 shows means, standard deviations, and t-statistics for testing whether the two groups differed on any of the six subscales of the post-treatment administration of SATS-36©. There were no statistically significant differences (p > 0.05) in mean responses for the two groups with respect to the six subscales. Two students in the treatment group did not complete the course. Their early departures were not related to the treatment.

Next, for each subject, a difference score (posttest – pretest) was computed for each of the six subscales. Then a mean difference score was computed for the project group and the no project group for each of the six subscales. A positive mean difference score indicated an improvement on the attitudinal subscale from the beginning of the course to the end of the course; a negative mean difference score indicated a decline in attitude for that subscale component. An independent samples t-test was performed to test the equality of the mean differences for the project group and the no project group for each of the six subscales. Table 6 contains the means, standard deviations, and t-statistics for testing whether the mean difference scores for the two groups are equal. There were no statistically significant differences (p > 0.05) in mean difference scores for the two groups with respect to the six subscales.
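A minimal sketch of this difference-score comparison, assuming the subscale scores for each group are available as NumPy arrays (the study’s data files are not published), might look as follows.

# A sketch (not the study's code) of the difference-score analysis: compute
# posttest - pretest for each student on one subscale, then compare the two
# groups with an independent-samples t-test. Arrays are aligned by student
# within each group.
import numpy as np
from scipy import stats

def diff_score_test(pre_project, post_project, pre_control, post_control):
    """Each argument is a 1-D array of one group's scores on a single subscale."""
    d_project = np.asarray(post_project) - np.asarray(pre_project)
    d_control = np.asarray(post_control) - np.asarray(pre_control)
    # scipy's ttest_ind returns a two-sided p-value; Table 6 reports one-tailed
    # values, which would be half of this when the observed difference lies in
    # the hypothesized direction.
    t, p = stats.ttest_ind(d_project, d_control)
    return d_project.mean(), d_control.mean(), t, p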


Table 5. Mean Post-treatment Responses on Attitude Subscales by Group

                           Project(a)           No Project(b)
Subscale                   M       SD           M       SD           t        p(c)
Value                      4.90    1.12         4.27    1.29         1.661    0.105
Difficulty                 4.11    0.97         3.59    1.09         1.624    0.113
Interest                   3.94    1.16         3.19    1.76         1.548    0.133
Affect                     4.50    1.48         4.13    1.61         0.758    0.453
Cognitive Competence       5.04    1.41         5.20    1.47        -0.364    0.718
Effort                     5.14    1.66         5.19    1.38        -0.119    0.906

Note: Responses were made using a 7-point scale (1 = strongly disagree,
4 = neither agree nor disagree, 7 = strongly agree).

(a) n = 22
(b) n = 18
(c) Two-tailed p-value for t-test of equality of means


To further explore the data, one-sample t-tests were performed to determine if the mean difference scores (posttest – pretest) were significantly different from zero for the six subscales when the project group and the no project group were considered separately. Table 7 contains these test results. These results indicate that the project group had a significant decrease on the attitude subscales of interest and effort (p < .01). The no project group had a significant decrease on the attitude subscale of interest (p < .05). To explore whether the perception of difficulty of the course had changed over the semester for everyone in the study, a matched-pairs t-test was performed on the difficulty subscale (posttest – pretest) to determine if the mean difference was significantly different from zero. The results were not statistically significant (p > .09), indicating no change in the subjects’ perceptions of difficulty from the beginning of the study to the end. Since gender may affect attitudes about statistics, the pretests, posttests, and difference scores for the six subscales were analyzed using independent samples t-tests to determine if males and females demonstrated different attitudes toward statistics. There were no significant gender differences found on the six subscales (p > .07).
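The follow-up tests described in this paragraph correspond to standard one-sample and matched-pairs t-tests; a brief sketch using SciPy, with placeholder arrays rather than the study data, is shown below.

# A sketch of the follow-up tests described above, using SciPy with
# placeholder arrays: a one-sample t-test of whether a group's mean
# difference score departs from zero, and a matched-pairs t-test on the
# difficulty subscale using all students' paired pre/post scores.
import numpy as np
from scipy import stats

def diff_from_zero(diff_scores):
    """One-sample t-test: is the mean (posttest - pretest) score different from 0?"""
    return stats.ttest_1samp(np.asarray(diff_scores), popmean=0.0)

def paired_difficulty_test(pre_difficulty, post_difficulty):
    """Matched-pairs t-test on the difficulty subscale (posttest vs. pretest)."""
    return stats.ttest_rel(post_difficulty, pre_difficulty)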

4. Discussion

At the beginning of this study, the two groups of students appeared to be comparable in attitudes toward statistics on the six subscales of SATS-36©. At the conclusion of the study, the group that did the student-designed data collection project did not exhibit more positive responses on the six subscales than the group that did not do the project. The data do not provide evidence that having students choose their own topic, then design and implement a project, enhances students’ levels of individual interest in statistics, enhances their perceptions of the usefulness, worth, and relevance of statistics in their personal and professional lives, or lowers their perception of the difficulty of the subject. These findings are not in agreement with Snee’s (1993) hypothesis that having students work on subject matter in which they take a personal interest will improve attitudes toward statistics. These findings are also in disagreement with Smith’s (1998) student survey results indicating that projects enhance students’ attitudes toward statistics and increase students’ perception of the relevance of the course.


Table 6. Mean Difference Scores on Attitude Subscales by Group

                           Project(a)           No Project(b)
Subscale                   M       SD           M       SD           t       p(c)
Value                     -0.03    0.98        -0.12    1.05         0.31    0.381
Difficulty                 0.29    0.83         0.26    1.16         0.10    0.459
Interest                  -0.68    1.14        -0.92    1.61         0.54    0.297
Affect                     0.09    1.29         0.02    1.44         0.17    0.435
Cognitive Competence      -0.17    1.18         0.53    1.75        -1.50    0.071
Effort                    -0.76    1.22        -0.44    1.34        -0.78    0.219

Note: Responses were made using a 7-point scale (1 = strongly disagree,
4 = neither agree nor disagree, 7 = strongly agree).

(a) n = 22
(b) n = 18
(c) One-tailed p-value for t-test of equality of means


Table 7. One-Sample Significance Test of Mean Difference Scores on Attitude Subscales by Group

                           Project(a)            No Project(b)
Subscale                   t        p(c)         t        p(c)
Value                     -0.12     0.905       -0.50     0.625
Difficulty                 1.56     0.133        0.90     0.380
Interest                  -2.81     0.010       -2.41     0.027
Affect                     0.33     0.744        0.06     0.956
Cognitive Competence      -0.67     0.512        1.28     0.216
Effort                    -2.94     0.008       -1.41     0.178

(a) n = 22
(b) n = 18
(c) Two-tailed p-value for t-test of mean difference equal to zero


Students’ feelings about statistics (affect), their attitudes about their knowledge and skills as applied to statistics (cognitive competence), and the amount of effort they expended to learn statistics (effort) also were not affected by inclusion of a project. One possible explanation is that a single project may not have been sufficient to shift global attitudes toward statistics as measured by the affect subscale. Similarly, it may take more than one experience with a project to positively impact attitudes about one’s cognitive competence as it pertains to statistics. Other factors, such as test performance and previous experiences in quantitative classes, could influence perceptions of cognitive competence and may outweigh any positive effects from a single project. With respect to the findings on the effort subscale, students in both groups already indicated on the pretest a fairly positive predisposition to expend effort to learn statistics; whether or not a project was included in the class, they were prepared to work, and that did not change over the course of the semester.

Interestingly, when each group’s scores were analyzed separately for the six subscales, each group showed a significant decrease on the interest subscale over the course of the study. The students had less individual interest in statistics at the end of the course than at the beginning, whether they did the project or not. The change in interest in the project group is in the opposite direction of what the observational evidence suggests should happen when students do projects, but it is in line with the negative perceptions that many students have upon completion of a statistics course. Perhaps to know statistics is not to love statistics, or perhaps the course turned out to be more difficult than the students thought. However, analysis of the difference in difficulty subscale scores indicated no difference in the perception of difficulty from the beginning of the course until the end. The project group also showed a significant decrease on the effort subscale over the course of the study. The decrease on the effort subscale may indicate that as students looked back over the effort expended during the semester while completing the posttest, they realized they did not put as much work into the class as they had anticipated at the beginning of the semester, or that it was not necessary to put as much work into the course as originally planned. It should be noted that the interest and effort subscales are the newest additions to SATS-36©, and there is less validation research available for these subscales than for the other four.

The presence of confounding variables may have impacted the outcome of this study. Time of day of the class meeting may have influenced results. One class met in the morning while the other class met in the afternoon. The grouping of students may have affected student responses. Some students chose to work on their projects individually, and some students worked in groups. Also, the prior math classes that the students in each class had taken were not considered. Students with a more extensive mathematics background might feel differently about statistics than students with a more limited mathematics background. In addition, the two classes were not gender balanced. One class was approximately 58% female, and the other class was approximately 33% female. However, no statistically significant difference was found between responses for males and females on the pretest, posttest or in the difference scores for any of the six subscales. Also, since both classes were taught by the same instructor, there is the possibility of the introduction of unintentional bias.

Another concern about this study which must be addressed is the issue of sample size. The sample sizes of the control group (n = 18) and the treatment group (n = 22) at the end of the study were determined by the sizes of the two classes used in the study and could not be manipulated by the researcher. In the presence of negative findings, as was the case here, the question is whether the sample size was sufficiently large to detect a meaningful difference, if one existed, between the two groups. Post hoc power analysis indicated that there was at least 80% power to detect a 1-point difference between the project group and the no project group on the attitude subscales of value and difficulty. There was at least 70% power to detect a 1-point difference between the project group and the no project group in the more variable subscales of interest, affect, and effort, and there was 65% power to detect a 1-point difference between the groups on the cognitive competence subscale.
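The paper does not give the details of the power computation. One common approach, sketched below with statsmodels and illustrative standard deviations (placeholders, not values from the paper), is to convert a 1-point difference into a Cohen’s d using the pooled standard deviation and evaluate the power of a two-sample t-test at the achieved sample sizes.

# A sketch of a post hoc power calculation for a two-sample t-test with the
# study's final group sizes (n = 22 and n = 18). The exact method used in the
# paper is not described; this version converts a 1-point difference on a
# subscale into Cohen's d using the pooled standard deviation. The SDs passed
# in the example are illustrative placeholders, not values from the paper.
import numpy as np
from statsmodels.stats.power import TTestIndPower

def power_for_one_point(sd1, sd2, n1=22, n2=18, alpha=0.05):
    pooled_sd = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    effect_size = 1.0 / pooled_sd  # a 1-point difference expressed in SD units
    return TTestIndPower().power(effect_size=effect_size, nobs1=n1,
                                 ratio=n2 / n1, alpha=alpha)

# Example with placeholder standard deviations for the two groups:
print(round(power_for_one_point(sd1=1.0, sd2=1.2), 2))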

From reading descriptive accounts, one would expect a somewhat marked difference in the attitudes toward statistics of the two groups in this study. However, this empirical research did not find those differences. Perhaps students with the strongest positive reactions to classroom projects are more likely to share those responses with a teacher than those who have more neutral or negative responses. The relative anonymity of a survey which is administered to all students, not just the most vocal, may provide a more representative description of students’ attitudes.

Since this study is one of the first empirical attempts to quantify the impact of projects on students’ attitudes toward statistics, it raises many issues. Certainly it indicates that some caution must be used in assuming that the inclusion of any project will automatically result in improved attitudes. However, the fact that this one study did not find improvements in attitudes does not mean that projects do not enhance attitudes. More studies need to be done examining different structures of classroom projects. There are many questions left to be explored in order to optimize whatever positive effects on attitude may emerge from using projects in the classroom. How many projects should be done? What is the optimum format for these projects to take? Is it better to have a series of projects that are all student-designed, or should there be a mix of student-designed and instructor-designed projects? Are students more directly impacted when working on group projects or individual projects? Should projects be long-term or short-term?

As educators debate the place of introductory statistics courses in the undergraduate curriculum, and some schools begin to include statistics as the primary course for satisfying the general education requirement for mathematics, it is important to understand what can be done to enhance students’ attitudes toward statistics. Further research is necessary to discern what, if any, positive impact projects may provide in order for students to see the utility of statistics outside the classroom and thus improve the perception of relevance of the content area.


Appendix A: Project Proposal

STS 220 Project Proposal

Proposal Due Date ____________

Project Due Date ____________

The purpose of this project is to give students the opportunity to conduct a statistical investigation in an area of interest. This project counts as a test grade. Students may begin data collection after the topic of the investigation has been approved by the instructor. This project may be done alone or in groups of up to size 4. Only one proposal form per group will be turned in.

1. List the names of everyone in your group. ________________________

2. What is your question of interest? ____________________________________

3. What is the significance of the study? (Why do it?)

4. Will you be doing a survey or an experiment to answer your question? __________

If a survey, name your population and describe how you will select your sample. List all the questions you will ask your subjects. Use extra paper if necessary.

If an experiment, name your population, describe how you will select your sample, describe your treatments, explanatory variable, and response variable, and explain how you will assign subjects to treatments. Use extra paper if necessary.

5. How will you analyze your data when you get it? You will need some graphical analysis and some numerical analysis.

6. What are some problems you might face in completing this assignment?

Notes:

1. All final projects must be word-processed and double-spaced. This proposal may be handwritten if done neatly.

2. Organization, grammar, spelling, and overall neatness of presentation of the final project will be considered and graded.

3. Grade deductions will be made for late projects.

4. This is a major project. Provide sufficient written documentation for what you do. Most students write too little rather than too much.

5. You will be given instructions at a later date for the format of the final written project.

6. Each group will make a 5-minute oral presentation to the class of the results of their project on the project due date.


Appendix B: Project Write-up Instructions

STS 220 Project Write-up

Below are the instructions for preparing your final paper based on your statistics project. Please follow these guidelines carefully. Failure to do so may result in grade deductions.

Your paper should be organized as follows (but do not number the sections; instead use headings and paragraphs):

I. Title page
   a. Title of paper
   b. Names of people in group

II. Introduction (1 page)
   a. State your question
   b. Explain why knowing the answer to your question is important. Who would use the information and how?

III. Methodology (1-3 pages)
   a. What is your population?
   b. How did you choose your sample? Is it really random? If not, why not?
   c. How did you gather information from your sample? If you used a survey, include the survey questions. If you did an experiment, describe the experiment.

IV. Results (1-3 pages)
   a. Give the results of your data analysis, including graphs and numerical analysis, within the body of the paper (not attached at the end).
   b. Identify any limitations that might contaminate results or lead to nongeneralizability of results (e.g., a nonrandom sample).

V. Conclusion (1 page)
   Use your results to support the answer you give to your original question.

Notes:

1. All projects must be word-processed and double-spaced.

2. Grammar, spelling, syntax, and organization will be graded as well as the content of the paper.

3. Most students write too little. Give lots of details in your paper. Have someone proofread your paper before you turn it in.

4. Each group will make a 5-minute presentation to the class about the results of their project.

5. Project papers will be due on ___________. There will be grade deductions for late work.


Acknowledgements


I would like to thank the reviewers for their insightful comments. I would also like to thank Dr. Anita Bowman for her technical assistance with this project and Dr. Grace Kissling for her assistance in critiquing this manuscript.


References

Bradstreet, T. (1996), "Teaching Introductory Statistics Courses So That Nonstatisticians Experience Statistical Reasoning," The American Statistician, 50(1), 69-78.

Garfield, J. (1993), "Teaching Statistics Using Small-Group Cooperative Learning," Journal of Statistics Education [Online], 1(1). http://jse.amstat.org/v1n1/garfield.html

Garfield, J. (1995), "How Students Learn Statistics," International Statistical Review, 63 (1), 25-34.

Garfield, J. (1997), "Discussion" [Discussion of the journal article "New Pedagogy and New Content: The Case of Statistics"], International Statistical Review, 65(2), 137–141.

Garfield, J., Hogg, B., Schau, C., and Whittinghill, D. (2002), "First Courses in Statistical Science: The Status of Educational Reform Efforts," Journal of Statistics Education [Online], 10(2). http://jse.amstat.org/v10n2/garfield.html

Harlow, L., Burkholder, G., and Morrow, J. (2002), "Evaluating Attitudes, Skill, and Performance in a Learning-Enhanced Quantitative Methods Course: A Structural Modeling Approach," Structural Equation Modeling, 9(3), 413-430.

Hilton, S., Schau, C., and Olsen, J. (2004), "Survey of Attitudes Toward Statistics: Factor Structure Invariance by Gender and by Administration Time," Structural Equation Modeling, 11(1), 92-109.

Hogg, R. V. (1991), "Statistical Education: Improvements Are Badly Needed," The American Statistician, 45, 342-343.

Kirk, R. E. (2002), "Teaching Introductory Statistics: Some Things I Have Learned," paper presented at the Annual Conference of the American Psychological Association, Chicago, IL. (ERIC Document Reproduction Service No. ED 473 611)

Moore, D. (1997), "New Pedagogy and New Content: The Case of Statistics," International Statistical Review, 65(2), 123–165.

Potthast, M. J. (1999), "Outcomes of Using Small-Group Cooperative Learning Experiences in Introductory Statistics Courses," College Student Journal, 33(1), 34–42.

Schau, C., Stevens, J., Dauphinee, T. L., and Del Vecchio, A. (1995), "The Development and Validation of the Survey of Attitudes Toward Statistics," Educational and Psychological Measurement, 55, 868-875.

Schau, C. (2003a), "Students’ Attitudes: The ‘Other’ Important Outcome in Statistics Education," Paper presented at The Joint Statistical Meetings, San Francisco, CA.

Schau, C. (2003b), Survey of Attitudes Toward Statistics - 36. Available from CS Consultants, LLC, www.evaluationandstatistics.com.

Shaughnessy, J. M. (1977), "Misconceptions of Probability: An Experiment with Small-Group Activity-Based Model Building Approach to Introductory Probability at the College Level," Educational Studies in Mathematics, 8, 285-315.

Smith, G. (1998), "Learning Statistics By Doing Statistics," Journal of Statistics Education [Online], 6(3). http://jse.amstat.org/v6n3/smith.html

Snee, R. (1993), "What’s Missing in Statistical Education," The American Statistician, 47(2), 149-154.

Tempelaar, D., Van Der Loeff, S., and Gijselaers, W. (2007), "A Structural Equation Model Analyzing the Relationship of Students’ Attitudes Toward Statistics, Prior Reasoning Abilities and Course Performance," unpublished manuscript, Maastricht University, The Netherlands.


Lisa J. Carnell
Department of Mathematics and Computer Science
High Point University
High Point, NC 27262
U.S.A.
lcarnell@highpoint.edu
