A Sample of Classroom Assessment Instruments
Attitude Inventories
Biology
Biology Self-Efficacy Instrument for Nonmajors (BSEIN). The Biology Self-Efficacy Instrument for
Nonmajors is a 23-item scale that asks students to rate on a five-point scale their level of confidence in three
key areas: 1) in writing and critiquing biological ideas through the use of laboratory reports; 2) in general
skills used in a typical biology course (e.g., analyzing data, asking meaningful questions); and 3) in their
ability to apply biological concepts to everyday life.
Baldwin, J.A., Ebert-May, D., & Burns, D.J. (1999). The development of a college biology self-efficacy
instrument for nonmajors. Science Education, 83, 397-408.
Biology Attitude Scale. The biology attitude scale investigates students' attitudes toward biology with 14
Likert-like questions (five points, strongly agree to strongly disagree) and eight "semantic differential" scale
questions. An example of a Likert-like question used is: "Biology is fascinating and fun." An example of a
"semantic differential question used is: "Biology is Cruel A B C D E Kind" (students are asked to circle the
letter that reflects their view of where biology fits relative to the end points).
Russell, J. & Hollander, S. (1975). A biology attitude scale. The American Biology Teacher, 37 (5),
270-273.
Chemistry
Chemistry Attitudes and Experiences Questionnaire (CAEQ). The CAEQ is a closed-ended survey that asks
students to rate their attitudes about chemistry (22 items), their self-efficacy with respect to chemistry (20
items), and their learning experiences in their chemistry courses (35 items). The article below includes the
CAEQ and describes the instrument development and testing.
Coll, R.K., Dalgety, J., & Salter, D. (2002). The development of the Chemistry Attitudes and
Experiences Questionnaire (CAEQ). Chemistry Education: Research and Practice in Europe,
3(1),19-32.
Engineering
Pittsburgh Freshman Engineering Attitudes Survey (PFEAS). The Pittsburgh Freshman Engineering
Attitudes Survey (PFEAS) is a 50-item multiple-choice survey that gathers information about incoming
engineering students' attitudes in 13 key areas.
URL: http://www.engr.pitt.edu/%7Eoutcomes/#
Besterfield-Sacre, M.E., Atman, C.J., & Shuman, L.J. (1998). Student Attitudes Assessment, Journal of
Engineering Education, 87(2), 133-141.
Mathematics
Questionnaire on attitudes about mathematics. This 81-item questionnaire (with 70 closed and 11 open
items) asks students to answer questions regarding the elements that contribute to success or failure in
mathematics; perceptions of mathematics, English, and social science (for comparison); how mathematics is
taught; the nature of proofs, reasoning, and constructions; motivation; and personal performance. The article
does not discuss how the questionnaire was developed or how it was tested for validity/reliability.
Schoenfeld, A.H. (1989). Exploration of students' mathematical beliefs and behavior. Journal for
Research in Mathematics Education, 20 (4), 338-355.
Modified Fennema-Sherman Scale. Elizabeth Fennema and Julia A. Sherman constructed an attitude scale in
the early 1970s that captures students' views on mathematics and on gender and mathematics. The
scale consists of four subscales (i.e., sets of questions intended to reflect a concept, such as confidence): a
confidence scale, a usefulness scale, a scale that measures mathematics as a male domain, and a teacher
perception scale. Each subscale consists of 12 items: six measure a positive attitude and six
measure a negative attitude. The URL below provides the Fennema-Sherman Scale and an adaptation of the
scale to science.
URL: http://www.woodrow.org/teachers/math/gender/08scale.html
Fennema, E., & Sherman, J. (1978). Sex-related differences in mathematics achievement and related
factors: A further study. Journal for Research in Mathematics Education, 9(3), 189-203.
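A minimal Python sketch of how such a reverse-scored Likert subscale could be totaled; it assumes five-point responses coded 1-5, and the item groupings used here are hypothetical placeholders rather than the published scale keys.

# Hypothetical scoring sketch for one 12-item Fennema-Sherman subscale (e.g., confidence).
# Assumes responses are coded 1-5; the six negatively worded items are reverse-scored
# (1<->5, 2<->4) before summing. Item numbers below are illustrative only.
def score_subscale(responses, positive_items, negative_items):
    """responses: dict mapping item number to a 1-5 rating."""
    total = sum(responses[i] for i in positive_items)          # positive items count as-is
    total += sum(6 - responses[i] for i in negative_items)     # reverse-score negative items
    return total

# Example: a student who answered "4" to every item (items 1-6 positive, 7-12 negative).
responses = {i: 4 for i in range(1, 13)}
print(score_subscale(responses, range(1, 7), range(7, 13)))   # -> 36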
Physics
Maryland Physics Expectation (MPEX) Survey. The Maryland Physics Expectation (MPEX) survey has been
developed by the Maryland Physics Education Research Group (PERG) as part of a project to study the
attitudes, beliefs, and expectations of students that have an effect on what they learn in an introductory
calculus-based physics course. Students are asked to agree or disagree on a five-point scale with 34
statements about how they see physics and how they think about their work in their physics course.
URL: http://www.physics.umd.edu/rgroups/ripe/perg/expects/mpex.htm
Redish, E.F., Steinberg, R.N., & Saul, J.M. (1998). Student Expectations in Introductory Physics. Am. J.
Phys., 66, 212-224. <http://www.physics.umd.edu/rgroups/ripe/papers/expects/expects1.htm>
Science
Epistemological Beliefs Assessment for Physical Science (EBAPS) Instrument. The EBAPS is a forced-choice instrument designed to assess students' epistemologies (i.e., their views about the nature of knowledge
and learning) in the physical sciences. EBAPS has 30 items and most students need 15-22 minutes to
complete the EBAPS.
URL: http://www2.physics.umd.edu/~elby/EBAPS/home.htm .
"The Idea behind EBAPS." 12 December 2001.
<http://www2.physics.umd.edu/~elby/EBAPS/idea.htm>.
Views About Sciences Survey (VASS). The Views About Sciences Survey (VASS) is designed to survey
student views about knowing and learning science, and to assess the relation of these views to student
understanding of science and course achievement (Grades 8-16).
URL: http://modeling.la.asu.edu/R&E/Research.html.
Halloun, Ibrahim. Views About Sciences Survey – VASS P20 Synopsis. 12 December 2001.
http://www.inco.com.lb/halloun/VASS20Snps.PDF
Views on Science-Technology-Society (VOSTS). The Views on Science-Technology-Society (VOSTS) is a
survey that investigates students' perceptions of the social nature of science and how science is conducted.
The VOSTS has 114 multiple-choice questions; the response options are grounded in the
literature and in an extended process of analyzing student comments, creating potential responses, and
verifying those responses with students. (The question in the figure below is illustrative of the kinds of
questions VOSTS asks.) The validity of the survey arises from its development, which is grounded in
student views (Aikenhead & Ryan, 1992).
URL: http://www.usask.ca/education/people/aikenhead/.
Aikenhead, G., & Ryan, A. (1992). The development of a new instrument: 'Views on ScienceTechnology-Society' (VOSTS). Science Education, 76, 477-492.
Scientific Attitude Inventory II. The SAI II asks students to rate on a five-point scale their level of agreement
with 40 statements about the nature of science and attitudes toward science, both positive and negative.
An example statement is: "The laws and/or theories of science are approximations of truth and are subject to
change."
Moore, R.W. and Hill Foy, R.L. (1997). The scientific attitude inventory: A revision (SAI II).
Journal of Research in Science Teaching, 34(4), 327-336.
Modified Nature of Scientific Knowledge Scale (MNSKS). The Nature of Scientific Knowledge Scale
(NSKS) was revised to create the MNSKS. The MNSKS asks students to rate on a five-point scale their level
of agreement with 32 statements about four dimensions of the nature of science (creativity of
science, development of science, testability of science, and unity of science). As an example, one
statement from the MNSKS is "A scientific theory and a work of art are similar in that they both
represent creative work of people." The MNSKS does not appear in the article.
Meichtry, Y. (1992). Influencing student understanding of the nature of science: Data from a case of
curriculum development. Journal of Research in Science Teaching, 29(4), 389-407.
Views on the Nature of Science Questionnaire (Forms A-C). The VNOS-A, B, and C ask students to write
responses to a series of between 6 and 10 questions (depending upon the form). An example question is, "Is
there a difference between a scientific theory and a scientific law? Illustrate your answer with an example."
Abd-El-Khalick, F., Lederman, N.G., Bell, R.L., & Schwartz, R. (2001). Views of nature of science
questionnaire (VNOS): Toward valid and meaningful assessment of learners’ conceptions of nature
of science. Paper presented at the annual meeting of the Association for the Education of Teachers in
Science, Costa Mesa, CA.
URL: http://www.flaguide.org/tools/diagnostic/views_of_nature_questionnaire.php
Statistics
Survey of Attitudes Toward Statistics (SATS). The Survey of Attitudes Toward Statistics (SATS) contains 28
seven-point Likert items designed to assess four components of students' attitudes toward statistics.
URL: http://www.unm.edu/~cschau/infopage.htm.
Schau, C., Stevens, J., Dauphinee, T., & Del Vecchio, A. (1995). The development and validation of the
Survey of Attitudes Toward Statistics. Educational & Psychological Measurement, 55 (5), 868-876.
Writing
Views about Writing Survey (VAWS). The VAWS is a 26-item survey that asks students to rate on an
eight-point scale their attitudes and beliefs about writing and composition. The structure is based on the
"contrasting alternative design" developed to measure students' attitudes toward physics (Halloun &
Hestenes, 1996). The paper that presents VAWS data does not describe testing or validation of the VAWS
instrument (Rhoads et al., 1998); the authors do indicate that they will conduct validation via focus group
interviews with students. The instrument was used to measure change in attitudes toward writing of students
enrolled in an English class geared for students in an engineering learning community (50 students)
and of students in a traditional English class (155 students).
Rhoads, T.R., Duerden, S.J., & Garland, J. (1998) Views about Writing Survey - A New Writing
Attitudinal Survey Applied to Engineering Students. Paper presented at the Frontiers In Education
Conference, Tempe, AZ.
Halloun, I., & Hestenes, D. (1996). Views about Sciences Survey: VASS. Paper presented at the Annual
Meeting of the National Association for Research in Science Teaching, St. Louis, MO.
http://fie.engrng.pitt.edu/fie98/papers/1220.pdf
Classroom Environment
Learning Context Questionnaire. This 50-item questionnaire investigates students' level of agreement with
statements describing their views of themselves and their education (e.g., "I enjoy courses most
when they are more theoretical.").
Griffith, J. V. & Chapman, D. W. (1982). LCQ: Learning context questionnaire. Davidson, North
Carolina: Davidson College.
Constructivist Learning Environment Survey (CLES). The CLES is a 42-item questionnaire that asks
students to rate on a five-point scale where and how they learn science; the extent to which their classroom
environment enables them to present their own views, manage their own learning, and talk with other
students about science; and their interest and motivation in the classroom.
URL: http://www.letus.northwestern.edu/msta/surveys/source-documents/cles.html
Taylor, P.C., Fraser, B.J., and White, L.R. (1994). CLES: An instrument for monitoring the development
of constructivist learning environments. Paper presented at the annual meeting of the American
Educational Research Association, New Orleans. Available at:
http://www.letus.northwestern.edu/msta/surveys/source-documents/index.html
Learning Approach Questionnaire (LAQ). This 49-item questionnaire was developed to measure students’
“preferences of using learning strategies that are consistent with either rote or meaningful learning
orientations” (Bretz, 1994). Students are asked to rate their level of agreement/disagreement with 44
statements; the remaining 5 items ask for demographic information. An example item is: “I try to relate new
material, as I am reading it, to what I already know on that topic.”
Bretz, S.L. (1994). Learning strategies and their influence upon students’ conceptions of science
literacy and meaningful learning: the case of a college chemistry course for nonscience majors.
Dissertation: Cornell University.
Metacognition
Metacognition Awareness Inventory (MAI). The Metacognition Awareness Inventory (MAI) is a 52-item test
intended to measure metacognitive awareness. Students are asked to rate statements on a 7-point Likert-like
scale (1 = not at all true of me, to 7 = very true of me). The MAI includes two factors: knowledge about
cognition and regulation of cognition.
Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational
Psychology, 19, 460-475.
Motivated Strategies for Learning Questionnaire (MSLQ). The Motivated Strategies for Learning
Questionnaire (MSLQ) is an 81-item self-report instrument that has two sections: a motivation section (31
items) and a learning strategies section (50 items). Students are asked to rate statements on a 7-point
Likert-like scale (1 = not at all true of me, to 7 = very true of me). The learning strategies scales incorporate
the metacognition construct and seek to measure rehearsal, elaboration, organization, critical thinking,
metacognitive self-regulation, effort regulation, peer learning, and help seeking.
Pintrich P., Smith D., Garcia T., & McKeachie W. (1991) A Manual for the Use of the Motivated
Strategies for Learning Questionnaire (Technical Report 91-B-004). The Regents of The University
of Michigan.
Duncan, T.G., & McKeachie, W.J. (2005). The making of the Motivated Strategies for Learning
Questionnaire. Educational Psychologist, 40(2), 117-128.
Motivation
Achievement Goal Questionnaire (AGQ). The Achievement Goal Questionnaire (AGQ) has 12 items that
assess motivation. Three items assess mastery-approach goals, three assess mastery-avoidance goals, three
assess performance-approach goals, and three assess performance-avoidance goals. Students are asked to rate
statements on a 7-point Likert-like scale (1 = not at all true of me, to 7 = very true of me).
Elliot, A., & McGregor, H.A. (2001). A 2x2 achievement goal framework. Journal of Personality and
Social Psychology, 80, 501-519.
Concept Maps
Biology
Using concept maps to assess declarative knowledge. The authors of the paper below used concept maps to
assess lower-order knowledge and compared the concept maps against multiple-choice tests, finding high
reliability and inter-rater agreement. Students were provided with a small number of terms (5-7) and had to
link them in appropriate ways. For example, to "replace" a multiple-choice item which asked, "Which of the
following is not a type of protozoan? a) paramecium; b) amoeba; c) Euglena; d) bacteria" (d is correct), a
student would be provided with the terms (kingdom, Protozoa, Monera, Euglena, paramecium, amoeba,
bacteria) and asked to draw a concept map using these terms. Three scoring rubrics were used to measure
understanding and had high inter-rater agreement. Scoring rubric A assigned a "0" if students used all
pertinent answers and the stem (question) on their map (and assigned a "-1" otherwise). Scoring rubric B
assessed whether students linked relationships to the appropriate kingdoms in their map: a "+1" was given if
they correctly linked bacteria to kingdom Monera (and not Protozoa); a "–1" was given if they linked it
incorrectly; and a "0" was given if the terms were not on the map. Finally, scoring rubric C was used to assess
misinformation: a "–1" was given if the student linked anything incorrectly to the kingdoms, and a "0" was
given otherwise.
Rice, D.C., Ryan, J.M., & Samson, S.M. (1998). Using concept maps to assess student learning in the
science classroom: must different methods compete? Journal of Research in Science Teaching,
35(10), 1103-1127.
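To make the three rubrics concrete, here is a rough Python sketch of the scoring logic as described above, using a simplified data model (a student's map is treated as a set of terms plus a set of organism-kingdom links); the data structures and function names are illustrative, not taken from the article.

# Simplified model: terms_used is a set of term strings; links is a set of
# (organism, kingdom) pairs drawn by the student.
CORRECT_LINKS = {("Euglena", "Protozoa"), ("paramecium", "Protozoa"),
                 ("amoeba", "Protozoa"), ("bacteria", "Monera")}
PERTINENT_TERMS = {"kingdom", "Protozoa", "Monera", "Euglena",
                   "paramecium", "amoeba", "bacteria"}

def rubric_a(terms_used):
    # 0 if all pertinent terms (including the stem concept) appear on the map, -1 otherwise
    return 0 if PERTINENT_TERMS <= terms_used else -1

def rubric_b(links, organism="bacteria", kingdom="Monera"):
    # +1 if the organism is linked only to the correct kingdom, -1 if linked incorrectly,
    # 0 if the terms are not linked on the map
    linked = {k for o, k in links if o == organism}
    if not linked:
        return 0
    return 1 if linked == {kingdom} else -1

def rubric_c(links):
    # -1 if anything is linked to the wrong kingdom, 0 otherwise
    return -1 if any(link not in CORRECT_LINKS for link in links) else 0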
Using concept mapping to identify students' misconceptions in biology. This article presents numerous
student-generated concept maps and the misconceptions that they reveal.
Lavoie, D.E. (1997). Using a modified concept mapping strategy to identify students' alternative scientific
understandings of biology. Paper presentation at the 1997 Annual Meeting of the National Association
for Research in Science Teaching, Chicago, Illinois. Accessible at:
http://www.educ.sfu.ca/narstsite/conference/97conference/lavoie.pdf
Concept maps as a diagnostic tool. The researchers in the study below investigated the sensitivity of
student-generated concept maps as a method for documenting changes in student understanding. The
researchers used an experimental pre/post design. Students in an elementary science methods course were
randomly divided into two groups. Both groups took a "Life Zones in the Ocean" multiple-choice exam and
constructed concept maps on the topic at the start of the experiment. The experimental group received 45
minutes of computer instruction on the marine life zones; the control group received similar instruction on an
unrelated topic. Following this instruction, both groups again took the "Life Zones in the Ocean"
multiple-choice exam and constructed concept maps on the topic. The researchers found that the
experimental students' concept maps were much more detailed than the control students'. Example
student-generated concept maps are provided, as is an explanation of the scoring rubric used.
Wallace, J.D., & Mintzes, J.J. (1990). The concept map as a research tool: exploring conceptual change
in biology. Journal of Research in Science Teaching, 27 (10), 1033-1052.
Concept Tests
Astronomy
ConcepTests for Astronomy. A library of ConcepTests for astronomy is available at the URL below.
Users and contributors need a login ID and password and can access this library by contacting Paul Green
(pgreen@cfa.harvard.edu).
URL: http://hea-www.harvard.edu/~pgreen/educ/ConcepTests.html
Green, P. (2002, to be published). Peer Instruction for Astronomy. Prentice Hall: Upper Saddle River,
NJ
Chemistry
Journal of Chemical Education On-Line Library of Conceptual Questions (CQs). The American Chemical
Society's Division of Chemical Education offers a library of Conceptual Questions (or CQs) in their Journal
of Chemical Education OnLine that would be useful as ConcepTest questions. The site offers questions on a
variety of topics and techniques for creating CQs, and it accepts submissions of new CQs.
URL: http://jchemed.chem.wisc.edu/JCEWWW/Features/CQandChP/CQs/CQIntro.html
Chemistry ConcepTests. The ConcepTests website offers a range of ConcepTest questions on a variety of
topics. Readers can download sets of questions on various chemistry topics in Adobe Acrobat (*.pdf) and
Microsoft Word (*.doc) format.
URL: http://www.chem.wisc.edu/~concept/
Landis, C.R., Ellis, A.B., Lisensky, G.C., Lorenz, J.K., Meeker, K., Wamser, C.C. Chemistry
ConcepTests: A Pathway to Interactive Classrooms, Prentice Hall, Inc., in preparation. Preprints of
this booklet are available; email concept@chem.wisc.edu for more information.
ConcepTests for General Chemistry. This site offers additional ConcepTests (without answers) for classroom
use. URL: http://people.brandeis.edu/~herzfeld/conceptests.html
Mathematics
Better file cabinet. This site has a searchable database of questions covering topics in calculus classes.
URL: http://betterfilecabinet.com/
Physics
Project Galileo: Your gateway to innovations in science education. The Project Galileo site has a variety of
teaching materials that are based on research in science education. The site, maintained by Harvard
Professor Eric Mazur and colleagues, includes an extensive set of resources for teaching with ConcepTests in
astronomy, biology, chemistry, and physics. Physics ConcepTests are stored in a database, from which users
can select sets of ConcepTests for use in their classes.
URL: http://galileo.harvard.edu
Mazur, E. (1997) Peer Instruction: A User's Manual; Prentice Hall: Upper Saddle River, NJ.
Conceptual Diagnostic Tests
Astronomy
Astronomy Diagnostic Test (ADT). The ADT consists of 33 questions; the first 21 questions make up the
content portion, and the final 12 questions collect demographic information. The test can be accessed
at the URL below. This site also offers comparative data by institution on the ADT questions and links to
research articles on the development of the ADT.
URL: http://solar.physics.montana.edu/aae/adt/
Lunar Phases Concept Inventory. The LPCI was developed based upon interviews with 14 undergraduate
students about their conceptual understanding of lunar phases. The LPCI consists of 14 multiple-choice
items.
Lindell, R.S. (2001). Enhancing College Students' Understanding of Lunar Phases. Unpublished doctoral
dissertation, University of Nebraska – Lincoln.
Biology
Conceptual Inventory of Natural Selection (CINS). The Conceptual Inventory of Natural Selection (CINS)
was developed to identify students' knowledge about natural selection, as well as their misconceptions. The
20-item multiple-choice instrument asks students to reflect on scenarios regarding Galapagos finches,
Venezuelan guppies, and Canary Island lizards, and to select the options that best reflect what an evolutionary
biologist would select.
Anderson, D.L., Fisher, K.M., & Norman, G.J. (2002). Development and evaluation of the conceptual
inventory of natural selection. Journal of Research in Science Teaching, 39, 952-978. Available on-line at:
http://scires.educ.msu.edu/EnvironmentalLiteracy/sgnatsel/conc_inv_natsel.pdf .
URL: http://www.biologylessons.sdsu.edu/CINS6_03.pdf
Chemistry
Chemical Concepts Inventory. The Chemical Concepts Inventory (CCI) can be used to identify chemistry
misconceptions held by students. The inventory is a multiple-choice instrument composed of non-mathematical conceptual questions (22 questions total). The questions are based on commonly observed
student misconceptions about topics generally covered in the first semester of a college chemistry course. A
question from the inventory appears in the figure below.
URL:
http://jchemed.chem.wisc.edu/JCEWWW/Features/CQandChP/CQs/ConceptsInventory/CCIIntro.ht
ml
Mulford, D.R. (1996). An Inventory for Measuring College Students' Level Of Misconceptions in First
Semester Chemistry. Unpublished doctoral dissertation, Purdue University. Available at:
http://faculty.pepperdine.edu/dmulford/thesis/title.html
Earth Science
Test on students' alternative conceptions of earth and space. This article lists common misconceptions
identified by a survey of students' alternative conceptions of earth and space. The survey is described as
having 18 multiple-choice questions, but the instrument itself is not included in the article.
Schoon, K.J. (1992). Students' alternative conceptions of earth and space. Journal of Geological
Education, 40, 209-214.
Engineering
The Foundation Coalition offers concept inventories spanning a wide variety of areas in
engineering: chemistry, circuits, computer engineering, dynamics, electromagnetics, electronics, fluid
mechanics, heat transfer, materials, material strength, signals and systems, and thermodynamics.
Information about these instruments can be accessed at:
http://www.foundationcoalition.org/home/keycomponents/assessment_evaluation.html
Environmental Science
Test on the Greenhouse Effect, Ozone Depletion, and Acid Rain. This diagnostic test asks students to report
"yes," "no," or "don't know" to a set of 29 research-based misconception questions about the greenhouse
effect, ozone depletion, and acid rain. The survey was adapted from a questionnaire developed by Jane
Dove (1996).
Khalid, T. (1999). The Study of Pre-Service Teachers' Alternative Conceptions Regarding Three
Ecological Issues. Paper presented at the Annual Meeting of the National Association for Research
in Science Teaching, Boston, Massachusetts.
<http://www.educ.sfu.ca/narstsite/conference/khalid/khalid.html>
Dove, J. (1996). Student teacher understanding of the greenhouse effect, ozone layer depletion and acid
rain. Environmental Education Research, 2, 89-100.
Mathematics
Basic Skills Diagnostic Test (BSDT). The Basic Skills Diagnostic Test (BSDT) is a 24-item free-response
test of basic mathematics understanding. Eleven questions (two with two parts) assess pre-algebra topics:
division, multiplication, and ordering of numbers represented as fractions and as decimals; the concepts of
area and volume; and proportional reasoning. Eight questions assess basic algebra: the concept of a variable,
order of operations, simplifying expressions (particularly inappropriate cancellation), and linear equations.
Three questions assess intermediate-level topics: the concept of a function, the graph of a function, and the
concept of a logarithm. The tool can be sent to colleges and universities by contacting the author, Jerome
Epstein ((718) 260-3572, jepstein@duke.poly.edu).
Epstein, J. "What is the real level of our students? or What do diagnostic tests really measure?" to be
published.
Physics
Force Concept Inventory (FCI) Instrument. The Force Concept Inventory (FCI) instrument is designed to
assess student understanding of the most basic concepts in Newtonian physics. This forced-choice
instrument has 30 questions and looks at six areas of understanding: kinematics, Newton's First, Second, and
Third Laws, the superposition principle, and types of forces (such as gravitation, friction). Each question
offers only one correct Newtonian solution, with common-sense distractors (incorrect possible answers) that
are based upon students' misconceptions about that topic, identified through interviews.
URL: http://modeling.asu.edu/R&E/Research.html.
Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force Concept Inventory. The Physics Teacher, 30,
141-158.
The Mechanics Baseline Test (MBT). The Mechanics Baseline Test (MBT) instrument is an advanced
companion to the Force Concept Inventory (FCI). The forced-choice MBT has 26 questions that were based
upon interviews with students about their misconceptions on basic topics in Newtonian mechanics. The test
covers concepts in kinematics (linear and curvilinear motion), basic principles (Newton's First, Second, and
Third Laws, superposition principle, energy conservation, impulse-momentum, and work) and special forces
(gravity and friction).
URL: http://modeling.asu.edu/R&E/Research.html.
Hestenes, D. & Wells, M. (1992). A Mechanics Baseline Test. The Physics Teacher, 30, 159-165.
Conceptual Survey of Electricity and Magnetism (CSEM). The 32-item multiple-choice CSEM was
developed to assess students' knowledge about electricity and magnetism.
Maloney, D.P., O'Kuma, T.L., Hieggelke, C.J., & Heuvelen, A.V. (2001). Surveying students'
conceptual knowledge of electricity and magnetism. American Journal of Physics, 69 (S1), S12-S23.
Other diagnostic tests. Physicists are quite active in the development of diagnostic tests. A large number
can be found through North Carolina State University's Physics Education Research group at
http://www.ncsu.edu/per/ and specifically at their test instruments page at www.ncsu.edu/per/TestInfo.html.
There are many other Physics Education Research groups around the country.
Known misconceptions in physics. Additionally, the "Student Difficulties in Physics Information Center"
lists common misconceptions students have in areas of physics, as well as the research articles that have
investigated/uncovered those misconceptions.
URL: http://www.physics.montana.edu/physed/misconceptions/index.html
Reasoning
Classroom Test of Scientific Reasoning (revised). This 12-item test asks about conservation of weight/volume,
proportional thinking, probabilistic/proportional thinking, combinatorial thinking, and correlational thinking.
Scores on the test classify students into three groups: empirical/inductive reasoners, transitional reasoners, and
hypothetical/deductive reasoners.
Contact Anton Lawson directly for the tool at: anton1@asu.edu
Lawson, A.E. (1987a). Classroom test of scientific reasoning: revised pencil-paper edition. Arizona State
University.
Interviews
Biology
Interviews of students' conceptions of genetics. The article provides an interview protocol, as well as
excerpts from interviews, covering the concepts of inheritance, genetic structure, genetic processes, and extension.
Venville, G.J., & Treagust, D.F. (1998). Exploring conceptual change in genetics using a
multidimensional interpretive framework. Journal of Research in Science Teaching, 35(9), 1031-1055.
Chemistry
Interviews of students' conceptions of equilibrium and fundamental thermodynamics. The article provides a
set of interview questions surrounding a chemical reaction and the codes used to score those interviews. The
interviews provided information on the type and extent of student misconceptions regarding the first and
second laws of thermodynamics, among other topics.
Thomas, P.L. & Schwenz, R.W. (1998). College physical chemistry students' conceptions of
equilibrium and fundamental thermodynamics. Journal of Research in Science Teaching, 35 (10),
1151-1160.
Performance Assessment
Critical Thinking
Performance Assessments of Critical Thinking. The article below describes the development and testing of
four critical thinking performance tasks, and provides a rubric for scoring the tasks. The rubric used had high
inter-rater reliability, with 95% of papers having exact or close agreement on scores between raters.
Correlations of the tasks against various closed-ended critical thinking tests were low (0.25-0.30); however,
high school teachers who used the tasks found them to provide good information on students' skills.
Christen, R., Angermeyer, J., Davison, M., & Anderson, K. (1994). Performance assessment of critical
thinking: the reflective judgment approach. Research/Practice, 2(1)
Scoring Rubrics
Rubrics for Oral Presentations
Rubric for Oral Presentation. This scoring rubric is an adaptation of several evaluation instruments that have
demonstrated strong inter-rater reliability and validity (Carlson and Smith-Howell, 1995).
Carlson, R.E., & Smith-Howell, D. (1995). Classroom Public Speaking Assessment: Reliability and
Validity of Selected Evaluation Instruments. Communication Education, 44, 87-97.
Quigley, B.L. (1998) Designing and Grading Oral Communication Assignments. New Directions for
Teaching and Learning, 74, 41-49.
Rubrics for Written Presentations
"Presentation Sandwich" Checklist. The "Presentation Sandwich" checklist is the mnemonic used in an
introductory engineering course to describe the presentation requirements for all technical work. The
checklist has two major sections: Section I concerns the Presentation Sandwich itself and Section II concerns
Graphical Work. Section I has three sub-sections (A, B, and C), one concerning problem context, one
concerning the technical work (i.e., the "sandwich filling"), and one concerning the discussion. Students who
do not meet all of the yes/no criteria are required to re-submit their work. Space is left for noting
exemplary work. The checklist is provided to students prior to their assignment.
McNeill, B., Bellamy, L., & Burrows, V. (1999). A Quality Based Assessment Process for Student
Work Products. ASEE Journal of Engineering Education, 88(4).
http://ceaspub.eas.asu.edu/mcneill/word_documents/papers/using_checklists_v10.doc
Scoring Rubric for Evaluating Writing.
Smith, K.A. (1998) Grading Cooperative Projects. New Directions for Teaching and Learning, 74, 41-49.
Scoring Rubric for an Assignment that Examines a Consumer Product. The rubric shown here supports the
assessment of student reports on a product (like a consumer report). These tools were originally created by
pharmacy faculty at Eastern Illinois University to structure and grade students' reports about a health-care
product.
Hobson, E.H. (1998) Designing and Grading Written Assignments. New Directions for Teaching and
Learning, 74, 51-57.
Scoring Rubric for Evaluating a Persuasive Argument Paper. This rubric evaluates a paper that is intended
to present a persuasive argument on some topic. The rubric is divided into four categories (organization,
content, usage, and mechanics), each of which is weighted to reflect the relative importance of that category.
Johnson, D.W., & Johnson, R.T. (1996). Meaningful and Manageable Assessment Through Cooperative
Learning. Edina, MN: Interaction Book Company.
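As a small illustration of how such a weighted rubric yields a single paper score, here is a Python sketch assuming each category is rated on a common scale; the weights below are placeholders, since the published rubric sets its own.

# Combine the four category ratings into one weighted score.
# Weights are illustrative only and should sum to 1.0.
WEIGHTS = {"organization": 0.25, "content": 0.40, "usage": 0.20, "mechanics": 0.15}

def weighted_score(ratings, weights=WEIGHTS):
    """ratings: dict of {category: rating on a common scale, e.g. 1-5}."""
    return sum(weights[cat] * rating for cat, rating in ratings.items())

print(weighted_score({"organization": 4, "content": 5, "usage": 3, "mechanics": 4}))  # -> 4.2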
Computer Modeling Checklist. The "Computer Modeling" checklist is used for computer modeling
assignments. This checklist has five major sections. Section I concerns the correctness of the model. Section
II defines what is expected when charts are present. Section III states how spreadsheet models must be
presented. Section IV is concerned with the written responses given to the questions posed in the assignment.
McNeill, B., Bellamy, L. & Burrows, V. (1999). A Quality Based Assessment Process for Student Work
Products. ASEE Journal of Engineering Education, 88(4).
http://ceaspub.eas.asu.edu/mcneill/word_documents/papers/using_checklists_v10.doc
Scoring rubrics for projects that model a real-life situation. Charles Emenaker (University of Cincinnati,
Raymond Walters College) outlines two scoring rubrics for projects that require students to model a real-life
situation mathematically and to interpret and present their results. One rubric sums the score from four
categories: problem understanding, problem plan, problem solution and problem presentation. The other
rubric uses a "holistic" four-point scoring system and can be used for problems that require a less-detailed
assessment.
Emenaker, Charles E. Assessing modeling projects in calculus and pre-calculus: Two approaches.
Supporting Assessment in Undergraduate Mathematics website. Accessed on 4/3/02 at
http://www.maa.org/SAUM/maanotes49/116.html
Rubrics for Problem Solving
Rubrics for Assessing Skill in Addressing Open-Ended Problems. This rubric for the assessment of students'
skills in addressing open-ended problems is founded on empirical data on the developmental process in
problem solving. This four-step process is outlined in Figure 1 below. Essentially, these steps are: Step 1 –
Identify the problem, relevant information, and uncertainties; Step 2 – Explore interpretations and
connections; Step 3 – Prioritize alternatives and communicate conclusions; and Step 4 – Integrate, monitor,
and refine strategies for re-addressing the problem (Lynch and Wolcott, 2001). The rubric measures
students' skill level against each of the four steps outlined above.
URL: http://www.wolcottlynch.com/index_files/page0002.html
Lynch, C.L. & Wolcott, S.K. (2001). Helping your students Develop Critical Thinking skills. IDEA
Paper, 37. [On-line]. Available: http://www.idea.ksu.edu/papers/Idea_Paper_37.pdf
Scoring Rubric for Evaluating an Inquiry Project. This rubric can be used to evaluate an inquiry project:
namely, a project in which students formulate and pursue questions using multiple sources of information
and multiple investigative methods. This rubric is based on the research on the nature of inquiry in the
learning of professional practitioners and is intended to give weight to the critical processes of investigation
while preventing the uncontrolled factors of working in a real-world environment from counting against the
student.
Busching, B. (1998). Grading Inquiry Projects. New Directions for Teaching and Learning, 74, 89-96.
Scoring Rubric for Writing in a Mathematics Class. Annalisa Crannell (at Franklin & Marshall College) has
developed writing assignments and a rubric for grading them in her mathematics classes.
Rubric URL: http://server1.fandm.edu/departments/Mathematics/writing_in_math/checklist.html
Assignment URL:
http://www.fandm.edu/DEPARTMENTS/MATHEMATICS/writing_in_math/writing_index.html
Crannell, Annalisa. (1994). How to Grade 300 Mathematical Essays and Survive to tell the Tale.
PRIMUS, 4(3), 193-201.
Rubrics for evaluating questions students ask. This rubric evaluates the number and type of questions
students pose in thinking about a problem-based case study situation. Students were asked to read a
scenario and then to write down every question they had about the scenario. The set of questions students
posed were evaluated along three dimensions: Orientation (i.e., whether the question is descriptive or asks
about potential solutions/treatments); Relation to Case Study (i.e., whether the question could be answered
by the case study); and Complexity (i.e., whether the question required a response involving higher-order
thinking). A metric was assigned to the questions that enabled numeric grading of the set of questions.
Dori, Y.J., & Herscovitz, O. (1999). Question-posing capability as an alternative evaluation method:
analysis of an environmental case study. Journal of Research in Science Teaching, 36(4), 411-430.
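A hedged Python sketch of how ratings on the three dimensions might be combined into a numeric score for a student's question set; the point values and category names here are invented for illustration, since the article defines its own metric.

# Illustrative point scheme: more points for higher-order, solution-oriented questions
# that go beyond the information given in the case study.
COMPLEXITY_POINTS = {"recall": 1, "comprehension": 2, "higher_order": 3}

def score_question(solution_oriented, answerable_from_case, complexity):
    points = COMPLEXITY_POINTS[complexity]
    points += 1 if solution_oriented else 0         # Orientation dimension
    points += 1 if not answerable_from_case else 0  # Relation-to-case-study dimension
    return points

def score_question_set(questions):
    return sum(score_question(*q) for q in questions)

# Example: two questions posed by one student.
print(score_question_set([(True, False, "higher_order"), (False, True, "recall")]))  # -> 6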
Rubrics for Discussions
Scoring Rubric for Evaluating a Discussion Conducted via a Listserv. This rubric evaluates a listserv
discussion in four areas: mechanics (grammar), participation (five postings required per week), content, and
critical thinking. One can evaluate all the posts or a random sample of posts. The selected postings are rated
on the four constructs and then summed for a total score. An instructor can also create two scores from the
rubric: the sum of mechanics and participation describes the students' use of technology and the sum of the
content and critical-thinking ratings describes the students' content understanding.
Morrison, G.R., & Ross, S.M. (1998) Evaluating Technology-Based Processes and Products. New
Directions for Teaching and Learning, 74, 69-77.
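A brief Python sketch of the two composite scores the rubric allows, assuming each selected posting has been rated on the four constructs (the rating scale itself is illustrative).

def composite_scores(post_ratings):
    """post_ratings: list of dicts with keys mechanics, participation, content, critical_thinking."""
    totals = {key: sum(post[key] for post in post_ratings)
              for key in ("mechanics", "participation", "content", "critical_thinking")}
    technology_use = totals["mechanics"] + totals["participation"]           # first composite
    content_understanding = totals["content"] + totals["critical_thinking"]  # second composite
    return technology_use, content_understanding

sample = [{"mechanics": 3, "participation": 4, "content": 4, "critical_thinking": 3},
          {"mechanics": 4, "participation": 4, "content": 3, "critical_thinking": 4}]
print(composite_scores(sample))  # -> (15, 14)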
Scoring Rubric for Whole-Class Participation. This holistic scoring rubric is intended for classes
that combine whole-class discussion with occasional small-group work. A holistic scoring rubric
rates student skills, attitudes, knowledge, and behavior in one score, rather than as a sum of different scores
measuring specific competencies.
Bean, J.C. & Peterson, D. (1998). Grading Classroom Participation. New Directions for Teaching and
Learning, 74, 33-40.
Rubrics for Evaluating a Portfolio
Scoring Rubric for Evaluating a Portfolio. These two rubrics were developed in a course as a negotiation
between the students (pre-service elementary education students) and the faculty, and were based upon the
recent literature on how to grade portfolios. The rubrics emphasize in particular students' growth over time,
and attempt to take into consideration contextual factors that students may have faced in the class (which
involved work in their field). In addition, having the students grade themselves promoted their ability to
reflect on their own learning, a lifelong skill.
Scanlan, P.A. & Ford, M.P. (1998). Grading Student Performance in Real-World Settings. New
Directions for Teaching and Learning, 74, 97-105.
Recommended Reading List
(Submitted by Conference Speakers)
Alters & Nelson. 2002. Teaching Evolution in Higher Education. Evolution 56:1891-1901
Black, P. & William, D. (1998). Assessment and classroom learning. Assessment in Education: Principles,
policy, and practice, 5(1), 7-74.
Bransford, J.D., Brown, A.L., Cocking, R.R. Eds. (1999). How People Learn: Brain, Mind, Experience, and
School. Washington, D.C.: National Academy Press. Available on line at: http://www.nap.edu
Bruer, J. T. (1993). Schools for Thought. Cambridge, MA: MIT Press.
Cameron, W. & Chudler, E. (2003) A role for neuroscientists in engaging young minds. Nature Reviews
Neuroscience, 4: 1-6.
DeHaan, R.L. (2005) The Impending Revolution in Undergraduate Science Education. Journal of Science
Education and Technology, 14 (2): 253-269.
Donovan, M.S., Bransford, J.D., & Pellegrino, J.W. (Eds.). (2001). How People Learn: Bridging Research and Practice,
Washington, D.C.: National Academy Press.
Available on line at: http://www.nap.edu
Gabel, D.L., Ed. (1994). Handbook of Research on Science Teaching and Learning. New York: MacMillan.
Klymkowsky, Garvin-Doxas & Zeilik. 2003. Bioliteracy and Teaching Efficacy: What biologists can learn from
physicists. Cell Biology Education, 2:151-163
Lottridge, S.M. & Seymour, E. (1999). The Student Assessment of Learning Gains Website.
[http://www.wcer.wisc.edu/salgains/instructor/default.asp] [An on-line course evaluation website]
McClymer & Knoles. 1992. Ersatz Learning, Inauthentic Testing. Journal of Excellence in College Teaching
3:33-50.
National Institute for Science Education, College Level 1 Team. (2002). The Field-tested Learning
Assessment Guide Website. [http://www.flaguide.net] [This site provides an introduction to assessment,
self-instructional modules on different assessment techniques, and links to assessment instruments].
Seymour, E. (2001). Tracking the processes of change in US undergraduate education in science,
mathematics, engineering, and technology. Science Education, 86, 79-105. [This article outlines the
changing views on assessment and teaching and learning in science, math and technology.]
Skrabanek. 1986. Demarcation of the Absurd. The Lancet No. 8487, pp. 960-61.
Springer, L., Stanne, M.E., & Donovan, S. (1997). Effects of Small-Group Learning on Undergraduates in
Science, Mathematics, Engineering, and Technology: A Meta-Analysis (NISE Research Monograph
No. 11). University of Wisconsin, Madison: National Institute for Science Education [This is a
meta-analysis on the impact of small groups on learning]
White, R. and Gunstone, R. (1992). Probing Understanding. Falmer Press. Contains a chapter on the POE
(Predict, Observe, Explain) Framework.
Approaches to Biology Teaching (with key words and web sites)
Allen, D. & Tanner, K. (2004). Approaches to cell biology teaching: Mapping the journey—Concept maps
as signposts of developing knowledge structures. Cell Biology Education, 2, 133-136.
Available at: http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=192451
Baines, A. T., McVey, M., Rybarczyk, B., Thompson, J.R., & Wilkins, H. R. Mystery of the toxic flea dip:
An interactive approach to teaching aerobic cellular respiration. Cell Biology Education, 3, 62-65.
... Diagnosing secondary students’ misconceptions of photosynthesis and respiration ... Teaching
Available at:
http://scholar.google.com/scholar?hl=en&lr=&q=cache:Y1u5ROf4xDwJ:cellbioed.org/articles/vol3no1/pdfs/
03-06-0022.pdf+biology+%26+misconceptions+%26+teaching+OR+learning+%22research%22
Dancy, M. H. & Beichner, R. J. (2002). But Are They Learning? Getting Started in Classroom Evaluation.
Cell Biology Education, 1, 87-94.
Available at: http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=128540&tools=bot
Engle, D. (2004). The biological justification for transforming our teaching. Cell Biology Education,
3(4),233-234. Review of Zull, James E. The art of changing the brain: Enriching the practice of
teaching by exploring the biology of learning. (Stylus Publishing. ISBN: 1-5792-2054-1.)
http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=533124&tools=bot
Feig, A. (2004). Challenge your teaching. Nature Structural & Molecular Biology. 11(1), 16-19.
... ingrained learning patterns or misconceptions distilled from ... the allied departments biology, chemistry,
mathematics ... learning resulting from teaching innovation ...
Available at: www.anahuac.mx/medicina/archivos/dra_anafuentes.pdf
Hall, A. C. & Harrington, M.E. (2003). Experimental Methods in Neuroscience: An Undergraduate
Neuroscience Laboratory Course for Teaching Ethical Issues, Laboratory Techniques, Experimental
Design, and Analysis. Journal of Undergraduate Neuroscience Education, 2(1), A1-A7.
Available at: http://www.funjournal.org/results.asp?juneid=73
Kitchen, E., Bell, J.D., Reeve, S., Sudweeks, R.R., & Bradshaw, W.S. (2003). Teaching cell biology in the
large-enrollment classroom: Methods to promote analytical thinking and assessment of their
effectiveness. Cell Biology Education, 2, 180-194.
Available at:
http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=14506506&dopt=Ab
stract
Khodor, J., Halme, D. G., & Walker, G. C. (2004). A Hierarchical Biology Concept Framework: A Tool for
Course Design, Cell Biology Education, 3, 111-121.
... erroneous answers diagnose the misconceptions held by ... effectiveness of different
teaching methods ( Hake ... cellular, and developmental biology” ( Klymkowsky et ...
Klymkowsky, M.W., Garvin-Doxas, K., & Zeilik, M. (2003). Bioliteracy and Teaching Efficacy: What
Biologists Can Learn from Physicists, Cell Biology Education, 2, 155-161.
Available at:
http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=14506504&dopt=Ab
stract
Markwell, J. (2004). The human side of science education. Biochemistry and Molecular Biology Education,
32, 323-325.
... has permitted an evolution of teaching style ... material, attempt to uncover misconceptions, and
orchestrate ...
Morse, M. P. (2003). Preparing biologists for the 21st century. BioScience, 53 (1), 10.
... students enter our classrooms with misconceptions about the ... is deeper when students learn biology by
acting ... agrees that the most effective teaching strategy is …
Staub, N.L. (2002). Teaching evolutionary mechanisms: Genetic drift and M&Ms. Bioscience, 52(4), 373-377.
... Teaching Evolutionary Mechanisms: Genetic Drift and M&M's. ... to correct such
misconceptions about evolution ...
Sundberg, M.D. (2002). Assessing student learning. Cell Biology Education, 1, 11-15.
... once had two large sections of nonmajor biology students (more ... categories ranging
from common misconceptions about the ... learning to our classroom teaching as we ...
Available at: http://www.pubmedcentral.gov/articlerender.fcgi?tool=pubmed&pubmedid=12587030
Tanner, K., Chatman, L.S. & Allen, D. (2003). Approaches to cell biology teaching: Cooperative learning in
the science classroom – Beyond students working in groups. Cell Biology Education, 2, 1-5.
Available at: http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=152788&tools=bot
Tanner, K. & Allen, D. (2004). Approaches to biology teaching and learning: From assays to assessments—On
collecting evidence in science teaching. Cell Biology Education, 3, 69-74.
Tanner, K. & Allen, D. (2005). Moving from knowing facts toward deep understanding through conceptual
change. Cell Biology Education, 4(2), 112-117.
... a longitudinal study of conceptual change in biology. ... tests to evaluate students'
misconceptions in science ... Handbook of Research on Science Teaching and Learning ...
Available at: http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1103711#N0x88e6dc0.0x840c45c
Tidon, R. & Lewontin, R. C. (2004) Teaching evolutionary biology. Geneics and Molecular Biology, 27 (1),
124-131.
… Issues related to biological misconceptions, curriculum and didactic material are
discussed, and some proposals are presented with the objective of aiding .... (2004).
Cooperative/Collaborative/Group Learning:
Fink, L. D. (2002). Beyond small groups: Harnessing the extraordinary power of learning teams. In L. K.
Michaelsen, A. B. Knight, & L. D. Fink (Eds.), Team-Based Learning: A Transformative Use of Small
Groups. Westport, CT: Praeger Publishers.
Johnson, D. W., Johnson, R. T., and Smith, K. A. (1998). Active learning: Cooperation in the college classroom,
2nd ed. Edina, MN: Interaction Book Company.
Millis, Barbara J. and Cottell, Philip G., Jr. (1998). Cooperative Learning for Higher Education Faculty.
American Council on Education. Oryx Press; Phoenix, AZ.
(For higher education faculty in all areas. The authors have used cooperative learning in diverse settings.
Good balance of research results and practical application.)
Springer, L., Stanne, M.E., & Donovan, S. S. (1999). Effects of Small-Group Learning on Undergraduates in
Science, Mathematics, Engineering, and Technology: A Meta-Analysis. Review of Educational Research, 69
(1), 21–51.
(Examines 39 high-quality studies from 1980 and later in STEM areas. Finds overall, robust gains for
achievement, persistence, and attitudes).
Misconceptions:
Bodner, G.M. (1991). I have found you an argument, Journal of Chemical Education, 68 (5), 385-388.
Gabel, D. (1998). The complexity of chemistry and implications for teaching. In B.J. Fraser and K.G. Tobin
(Eds.), International Handbook of Science Education, Kluwer Academic Publishers, Great Britain, 233-248.
Hestenes, D. (1995). What do graduate oral exams tell us? American Journal of Physics, 63, 1069.
Lewis, E. L., & Linn, M. C. Heat energy and temperature concepts of adolescents, naïve adults, and experts:
Implications for curricular improvements. Journal of Research in Science Teaching, 31(6), 657-677 (1994).
Mulford, D.R., & Robinson, W.R. (in press). An inventory for misconceptions in first-semester general chemistry.
Journal of Chemical Education. Submitted 2001. Materials on:
http://faculty.pepperdine.edu/dmulford/thesis/title.html
Osborne, R. & Cosgrove, M. (1983). Common misconceptions among college science students. Journal of
Research in College Science Teaching, 20, 825-830.
Smith, K.J. and Metz, P.A. (1996). Evaluating student understanding in solution chemistry through microscopic
representations, Journal of Chemical Education, 73(3), 233-235.
Students’ Alternative Frameworks and Science Education, 4th Edition, H. Pfundt and R. Duit, (IPN Reports-in-Brief,
Kiel, Germany, 1994). Available on-line at: http://www.ipn.uni-kiel.de/aktuell/stcse/stcse.html
Misconceptions Web Sites:
http://www.oise.utoronto.ca/~science/miscon.html
http://www.physics.montana.edu/physed/misconceptions
http://www.ipn.uni-kiel.de/aktuell/stcse/stcse.html
http://faculty.pepperdine.edu/dmulford/thesis/title.html
Other Useful Web Sites:
FLAG (Field-tested Learning Assessment Guide): http://www.flaguide.org Web site contains an assessment
primer, CATs (Classroom Assessment Techniques), Matching Goals to Assessments, resources, etc.
SALG (Student Assessment of Learning Gains): http://www.flaguide.org/cat/salg/salg1.php or
http://www.wcer.wisc.edu/salgains/instructor/ Web site that contains a powerful tool for assessing the
impact of any or all components of classroom/laboratory practice on learning. Instructors create their own
versions of the SALG from a template and can save and reuse them for multiple courses over multiple
semesters. Students take the SALG on-line, and data are returned to faculty in raw or statistically analyzed form.
Bibliography STCSE (Students' and Teachers' Conceptions and Science Education). Documents research on
teaching and learning science; the role of students' and teachers' conceptions in the teaching and learning
process is given particular attention. http://www.ipn.uni-kiel.de/aktuell/stcse/stcse.html Searchable and
downloadable in html or EndNote format.
http://www.first2.org/resources/wrkshopmaterials/workshop_table.html
Richard Hake’s web site: http://www.physics.indiana.edu/~sdi/
Rich Felder’s web site: http://www2.ncsu.edu/unity/lockers/users/f/felder/public/
1999-2000 SUCCEED Faculty Survey of Teaching Practices and Perceptions of Institutional Attitudes
toward Teaching. SUCCEED Coalition Report, December 2001. View full report (95 pages) or executive
summary (9 pages). Reports frequencies of use of active and cooperative learning, learning objectives,
writing assignments, and faculty development services, and perceptions of the value ascribed to teaching
quality by faculty, administrators, and the faculty reward system. Survey respondents were 586 engineering
faculty members at 8 institutions.
http://www2.ncsu.edu:8080/unity/lockers/users/f/felder/public/#WhatsNew
Project Galileo has ConcepTests that have been field-tested. http://galileo.harvard.edu/
Joe Redish’s web site: http://www.physics.umd.edu/perg/papers/redish/cogsci.html
"ABC's of teaching with excellence: A compendium of suggestions for teaching with excellence," B.G. Davis,
L. Wood, R.C. Wilson; online at: http://teaching.berkeley.edu/compendium, including the Minute Paper
description.
PKAL (Project Kaleidoscope: http://www.pkal.org - pkal@pkal.org) has an on-line publication on "What
Works, What Matters and What Lasts." The goal of this collection of stories and essays is to capture, analyze
and disseminate lessons learned from the experience of leading agents of change (institutional, organizational,
and individual) so as to inform the work of the larger community.
Teaching Large Lecture Classes, TA Professional Development, Cognitive Apprenticeship Model, etc. with
downloadable files: http://groups.physics.umn.edu/physed/
More on Bloom’s Taxonomy: http://www.kcmetro.cc.mo.us/longview/ctac/blooms.html
Web Sites With Scientific Teaching Examples:
Group problem-solving in lecture
www.ibscore.org/courses.htm
http://yucca.uoregon.edu/wb/index.html
http://mazur-www.harvard.edu/education/educationmenu.php
Problem-based learning
www.udel.edu/pbl/
www.microbelibrary.org
www.ncsu.edu/per/scaleup.html
http://webphysics.iupui.edu/jitt/jitt.html
Case studies
www.bioquest.org/lifelines/
http://ublib.buffalo.edu/libraries/projects/cases/case.html
http://brighamrad.harvard.edu/education/online/tcd/tcd.html
Inquiry-based labs
http://chemistry.Beloit.edu
http://mc2.cchem.berkeley.edu
www.plantpath.wisc.edu/fac/joh/bbtl.htm
www.bioquest.org/
http://biology.dbs.umt.edu/biol101/default.htm
http://campus.murraystate.edu/academic/faculty/terry.derting/ccli/cclihomepage.html
Interactive computer learning
www.bioquest.org/
www.dnai.org
http://evangelion.mit.edu/802TEAL3D/
http://ctools.msu.edu/
General Papers:
D. Ebert-May et al., Bioscience 47, 601 (1997).
P. Laws, Phys. Today 44, 24 (1991).
D. Udovic et al., Bioscience 52, 272 (2002).
J. C. Wright et al., J. Chem. Educ. 75, 986 (1998).
J. Trempy et al. Microbiol. Educ. 3, 26 (2002).
L. Springer et al., Rev. Educ. Res. 69, 21 (1999).
Bonwell, Charles C. & Eison, James A. (1991). Active learning: Creating excitement in the classroom. ASHE-ERIC Higher Education Reports, Report No. 1. Washington, D.C.: The George Washington University,
School of Education and Human Development.
Hake, R.R. (1998a). "Interactive-engagement vs traditional methods: A six-thousand-student survey of
mechanics test data for introductory physics courses," American Journal of Physics 66, 64-74 (1998); online
at <http://www.physics.indiana.edu/~sdi/>.
Handelsman, J. et al., Biology Brought to Life: A Guide to Teaching Students How to Think Like Scientists.
McGraw-Hill, New York, 1997.
Hestenes, D. (1979). Wherefore a science of teaching? The Physics Teacher, 17, 235-242.
Karplus, R. (1977). Science Teaching and the Development of Reasoning, Journal of Research in Science
Teaching, 14(2), 169-175. (Currently online at <http://www.kluweronline.com/issn/1059-0145/contents> as
an appendix to Fuller (2003); also in Fuller (2002), pages 70-76.)
Lawson, A. E., Abraham, M. R,. & Renner, J.W. (1989). A theory of instruction: Using the learning cycle to
teach science concepts and thinking skills. National Association for Research in Science Teaching (NARST)
Monograph, Number one. ED324204.
Lemke, J. (1990). Talking Science - Language, Learning, and Values. Norwood NJ: Ablex publishers.
Learning Cycle: Schneider, L.S. & Renner, J.W. (1980). Journal of Research in Science Teaching, 17(6), 503-517.
McDermott, L.C. (1991). Millikan Lecture 1990: What we teach and what is learned–Closing the gap, American
Journal of Physics, 59, 301-315.
Musheno, B., & Lawson, A. B. (1999). Effects of learning cycle and traditional text on comprehension of science
concepts by students at differing reasoning levels. Journal of Research in Science Teaching, 36(1), 23-27.
Robinson, W. R. (1999). A view of the science education research literature: Student understanding of chemical
change. Journal of Chemical Education, 76(3), 297-298.
Smith, E., Blakeslee, T., & Anderson, C. (1993). Teaching strategies associated with conceptual change learning
in science. Journal of Research in Science Teaching, 30, 111-126.
Strike, K. A. & Posner, G. J. (1992). A revisionist theory of conceptual change. In R. A. Duschl & R. J.
Hamilton (Eds.), Philosophy of science, cognitive psychology, and educational theory and practice (pp.
147-176). Albany, NY: State University of New York Press.
Wyckoff, S. (2001). Changing the culture of undergraduate science teaching, Journal of College Science
Teaching, February: 306-312.
Scholarship of Teaching and Learning
Boyer, E.L. (1990). Scholarship Reconsidered: Priorities of the Professoriate. Princeton, NJ: Carnegie
Foundation for the Advancement of Teaching.
Cambridge, B. (December 1999). The Scholarship of teaching and Learning: Questions and Answers From the
Field. AAHE Bulletin 52(4): 7-10.
Carnegie Academy for the Scholarship of Teaching and Learning. (1999). Informational Program. Booklet.
Menlo Park, CA: Carnegie Foundation for the Advancement of Teaching.
Glassick, C.E., M.T. Huber, and G.I. Maeroff. (1997). Scholarship Assessed: Evaluation of the Professoriate.
Special Report of The Carnegie Foundation for the Advancement of Teaching. San Francisco: Jossey-Bass.
Huber, M.T. (July/August 2001). Balancing Acts: Designing Careers Around the Scholarship of Teaching.
Change 33(4): 21-29.
Hutchings, P. & Shulman, L.S. (September/October 1999). The Scholarship of Teaching: New Elaborations,
New Developments. Change, 31(5): 10-15.
Shulman, L.S. (2000). From Minsk to Pinsk: Why a Scholarship of Teaching and Learning. The Journal of the
Scholarship of Teaching and Learning (JoSoTL), 1(1); online at: http://www.iusb.edu/~josotl/search_archives.ht
Quotable Quotes
“Science not as a noun…but as a process, a set of activities, a way of proceeding and thinking.”
(Tinker & Thornton, 1992, p. 155).
"There is no neutral pedagogical practice. Every single one is based on a given conception of the learning
process and of the object of such a process. Most probably, those practices much more than the methods
themselves are exerting the greatest lasting effects in the domain of literacy, as in any field of knowledge."
(Ferreiro, Emilia, 1991).
“Elementary and high school students perceive science largely as a passive process of observing and recording
events. They view good scientists as ones who make careful observations and keep complete and accurate
records of all they observe.”
(Carey, Evans, Honda, Jay & Unger, 1989; Songer & Linn, 1991).
“The structure of our institutions conveys a more powerful message than their contents.”
(Jay Lemke)