Abstract/Details

Measurements of student understanding on complex scientific reasoning problems


Abstract (summary)

While there has been much discussion of the cognitive processes underlying effective scientific teaching, less is known about the nature of student responses on assessments targeting processes of scientific reasoning specific to biology content. This study used multiple-choice (m-c) and short-answer essay responses to evaluate student progress in higher-order reasoning skills. In a pilot investigation of student responses on a non-content-based test of scientific thinking, some students showed a pre-post gain on the m-c version of the test while showing no gain on a short-answer essay version of the same questions. This result led to a subsequent research project focused on differences between alternate versions of tests of scientific reasoning. Using m-c and written responses from biology tests targeting the skills of (1) reasoning with a model and (2) designing controlled experiments, test score frequencies, factor analyses, and regression models were examined to explore test format differences. Understanding these format differences is important for developing practical ways to identify student gains in scientific reasoning.

The overall results suggested test format differences. Factor analysis revealed three interpretable factors: m-c format, genetics content, and model-based reasoning. Frequency distributions on the m-c and open-explanation portions of the hybrid items revealed that many students answered the m-c portion of an item correctly but gave inadequate explanations. In other instances, students answered the m-c portion incorrectly yet gave an adequate explanation, or answered it correctly while providing a poor explanation. When the test scores were used to predict independent student measures (VSAT, MSAT, high school grade point average, and final course grade), they accounted for close to zero percent of the variance. Overall, these results point to the importance of using multiple methods of testing and of further research and development in the assessment of scientific reasoning.

Indexing (details)


Subject
Educational evaluation;
Science education
Classification
0288: Educational evaluation
0714: Science education
Identifier / keyword
Education; Multiple-choice; Open-response; Scientific reasoning
Title
Measurements of student understanding on complex scientific reasoning problems
Author
Izumi, Alisa Sau-Lin
Number of pages
172
Publication year
2004
Degree date
2004
School code
0118
Source
DAI-A 65/01, Dissertation Abstracts International
Place of publication
Ann Arbor
Country of publication
United States
Advisor
Clement, John
University/institution
University of Massachusetts Amherst
University location
United States -- Massachusetts
Degree
Ed.D.
Source type
Dissertations & Theses
Language
English
Document type
Dissertation/Thesis
Dissertation/thesis number
3118308
ProQuest document ID
305175355
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.
Document URL
http://search.proquest.com/docview/305175355