[Photo: Alumni Teaching Scholars watching a fellow member present her research project]

Example Midcourse Evaluation Tools

Below are some example midcourse evaluation tools you can use. Each tool includes a description and a delivery method, and most also include detailed instructions for how to administer them.

Title: University Online Midcourse Evaluation (available Spring 2018)

Who has access to the data: Instructor, Chair, University

Delivery Method: Online

Description:

The midcourse evaluation will be delivered via the same delivery tool (i.e., What Do You Think) as end-of-course evaluations. It will be available to all full-semester courses during the seventh week of classes. The evaluation will include the following six questions, which are derived from the end-of-course evaluation. Instructors are encouraged to add questions of their own prior to the evaluation period; a question bank of potential questions is available for use.

This tool gives instructors the opportunity to add questions to the midcourse evaluation up through the day before the evaluation period, which lasts a week and a half. Instructors then review the data and use it to improve the class.

  • Week four -- reminder sent to instructors that the tool is available
  • Week five & six -- evaluation system is open for instructors to add questions
  • Week seven & eight -- midcourse evaluations available for instructors to deploy in class
  • End of week eight -- instructors have access to midcourse evaluation data to improve classroom instruction

The scale for ALL questions is 0-4, with higher numbers representing more positive responses (0 = Rarely, 1, 2 = Sometimes, 3, 4 = Always).

Classroom Climate
  • My instructor welcomes students' questions.
  • My instructor offers opportunities for active participation to understand course content.
  • My instructor demonstrates concern for student learning.
Student Learning
  • In this course, I learn to analyze complex problems or think about complex issues.
  • My appreciation for this topic has increased as a result of this course.
  • I have gained an understanding of this course material.

Title: Small Group Instructional Diagnosis (SGID)

Who has access to the data: Instructor

Delivery method: small group; in class

Description:

Implementing the SGID involves about 30 minutes near midcourse. The instructor leaves the classroom, and after the facilitator introduces and explains the process, the class members form small groups and reach consensus on the following questions:

  • What do you feel are the strengths of the course?
  • What suggestions for improvement can you make?

After several minutes of discussion, the groups report to the entire class. The facilitator, following clarification with students, summarizes the suggestions. Students are then polled to measure their agreement with each summarized statement: the facilitator reads a suggestion aloud and counts how many students agree or disagree with it, and both numbers are shared with the instructor. This prevents a suggestion supported by only one student from carrying the same weight as one supported by half the class or more.
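As a rough illustration, the facilitator's agree/disagree tally for each suggestion could be compiled like this. This is a minimal sketch, not part of the SGID protocol itself; the suggestion texts and counts are invented for the example.

```python
# Hypothetical SGID polling tally: for each summarized suggestion,
# record how many students agree and how many disagree, and report
# both counts (plus the agreeing share of the class) to the instructor.

def summarize_poll(tallies, class_size):
    """Return one report line per suggestion with agree/disagree counts."""
    lines = []
    for suggestion, (agree, disagree) in tallies.items():
        share = 100 * agree / class_size
        lines.append(f"{suggestion}: {agree} agree / {disagree} disagree "
                     f"({share:.0f}% of class agrees)")
    return lines

# Invented example: two suggestions polled in a class of 30 students.
tallies = {
    "Post lecture slides before class": (24, 3),
    "Add a weekly review session": (5, 18),
}
for line in summarize_poll(tallies, class_size=30):
    print(line)
```

Reporting both numbers side by side is what keeps a one-student suggestion from reading like a class-wide concern.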

The facilitator then organizes the data into a report for the instructor, and the two colleagues review the results of the SGID and consider strategies for improvement. The instructor discusses the results with students and indicates any changes. It is recommended that the instructor explain both why they are making certain changes and why they are not making others.

Sign up for an SGID »

Title: Peer Review of Teaching (aka Colleague Evaluation)

Who has access to the data: Instructor

Delivery method: Classroom observation or critique of classroom artifacts

Description:

Peer review is often identified with peer observation, but it is more broadly a method of assessing any aspect of the class for the instructor under review. This typically includes peer observations of teaching along with other evidence such as syllabi, assignments, student work, and exams. Your peer may draw on their own knowledge of teaching to evaluate these items, or it may be beneficial to use benchmarks provided by a professional organization in your field. If you are interested in learning more or would like help locating these professional benchmarks, please contact your department chair.

It is also worth noting a common distinction between two very different forms of peer review: formative and summative. Formative evaluation is typically oriented solely toward the improvement of teaching and is part of instructional mentorship and development. Summative evaluation, in contrast, is done to inform personnel decisions. To protect the freedom and exploration of individual instructors, formative reviews may be shielded from scrutiny for a period of years, until accountability to standards of excellence is needed for personnel decisions. At present, summative evaluations are more common, since they are tied to decisions related to reappointment, promotion, or tenure (Bernstein et al. 2000). Because the more consequential nature of summative evaluations tends to diminish the formative value of the peer review process, it is important to maintain a clear distinction between these types of evaluation and to be transparent with those under review.

Title: Student Assessment of Their Learning Gains (SALG)

Who has access to the data: Instructor

Delivery method: Online

Description:

The SALG is a free online survey for assessing curricula and pedagogy, developed through a National Science Foundation grant and currently hosted by the Wisconsin Center for Education Research. It is designed for instructors in all disciplines to get feedback from their students on various elements of a course. Once you register, you can modify the survey to fit your course; the tool generates learning goals based on the statements that you choose. Students take the survey online through a link you provide once setup of the evaluation is complete, and the Wisconsin Center then provides a statistical report.

Register at SALG Website for Instructors »

The Center for Teaching Excellence has archived a seminar on how to use the SALG: CTE SALG Video

Title: Bare Bones Questioning Technique (Snooks et al., 2004)

Who has access to the data: Instructor

Delivery method: In class (e.g., note cards), online, or verbally

Description:

This is a five-minute evaluation in which students answer the questions below on note cards, online, or aloud. Instructors may write the questions on the board or screen, read them aloud, or distribute them on paper. At the end of the activity, the instructor has access to the raw data, but not a summary report (unlike the SGID). The words in parentheses next to the questions below help the instructor analyze what should no longer happen, begin to happen, or continue to happen in the classroom.

Use the following three questions:

  1. What (if anything) is interfering with your learning? (STOP);
  2. What suggestions do you have to improve your learning? (START);
  3. What is your instructor doing that helps you to learn? (CONTINUE).

Title: Students' Evaluation of Educational Quality (SEEQ)

Who has access to the data: Instructor

Delivery method: Online

Description:

The SEEQ is a comprehensive student rating form providing useful information about teaching effectiveness. This tool focuses on learning environment, enthusiasm, organization, group interaction, individual rapport, breadth, examination, assignments, and overall (general) evaluation. This tool can be administered online or printed and distributed during a class session.

Here is a sample SEEQ: Schreyer Institute SEEQ Example

LEARNING ENVIRONMENT

1. The following statements are rated on the scale: Very Poor, Poor, Moderate, Good, Very Good, or Not Applicable

  • You find the course intellectually challenging and stimulating.
  • You have learned something which you consider valuable.
  • Your interest in the subject has increased as a consequence of the course.
  • You have learned and understood the subject materials in this course.

2. Do you have any comments to add about the LEARNING ENVIRONMENT of the course?

Title: Quick Course Diagnosis (QCD)

Who has access to the data: Instructor

Delivery method: In class

Description:

For a QCD, the instructor meets with the faculty developer in the Center for Teaching Excellence to discuss objectives and any changes to the basic protocol. The instructor prepares the class for the 15-minute experience and leaves the room during the QCD. Later, the instructor meets with the faculty developer to review the data and to plan improvements.

For the processing, the faculty development team (one to ten people, depending on class size) greets the students and explains the procedures. Students are asked to write on an index card a number from one to five indicating their satisfaction level with the course, plus a word or phrase to clarify their experience ("awesome," "confusing," etc.). For the report, these data are compiled into a histogram that displays, for each rating, the number of students and the associated words or phrases.
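The grouping of index-card data into that histogram-style report can be sketched in a few lines. This is only an illustration of the data handling, not part of the QCD protocol; the ratings and words below are invented.

```python
from collections import defaultdict

def qcd_histogram(cards):
    """Group (rating, word) index cards by rating for the report."""
    by_rating = defaultdict(list)
    for rating, word in cards:
        by_rating[rating].append(word)
    return dict(sorted(by_rating.items()))

# Invented example: six index cards from a small class.
cards = [(4, "awesome"), (2, "confusing"), (4, "engaging"),
         (5, "awesome"), (2, "rushed"), (3, "fine")]

for rating, words in qcd_histogram(cards).items():
    # One row per rating: a text bar, the count, and the students' words.
    print(f"{rating}: {'#' * len(words)} ({len(words)} students) {', '.join(words)}")
```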

The team can then display for students, via a projector or printed copies, a numbered list of the student learning outcomes (SLOs) for the course. On the reverse side of the index card, the students record the numbers of the two SLOs they felt were best met and the two that were least fulfilled.

During the final stage of the QCD, students form groups of five to seven and, using a highly structured form, identify the course (or program) strengths and weaknesses with a cooperative brainstorming technique called "roundtable," in which students rapidly pass around a sheet of paper, adding ideas as they say them aloud. The groups then rank the top three strengths and the top three weaknesses. These data are recorded onto a single template, group by group, and then analyzed by a person skilled in trend analysis, usually the faculty developer. Common themes are coded with the same color across teams, emphasizing shared strengths or issues. For example, if four teams mention "poor textbook" or anything similar (e.g., "textbook sucks"), a reader will see red in all four team ratings, if red is the color selected for "poor textbook."

Title: Instructional Skills Questionnaire (ISQ)

Who has access to the data: Instructor

Delivery method: Online

Description:

The ISQ can be used to provide instructors with immediate and specific feedback concerning their teaching. The ISQ conceptualizes teaching in terms of seven dimensions based on Feldman's (2007) categories of teaching behavior. Each dimension is measured by two indicative items and two contra-indicative items on a 7-point Likert scale (response options ranging from strongly disagree to strongly agree). The contra-indicative items are reverse-coded prior to analysis.

The seven ISQ dimensions are defined as follows:

  1. Structure: the extent to which the subject matter is handled systematically and in an orderly way
  2. Explication: the extent to which the instructor explains the subject matter, especially the more complex topics
  3. Stimulation: the extent to which the instructor interests students in the subject matter
  4. Validation: the extent to which the instructor stresses the benefits and the relevance of the subject matter for educational goals or future occupation
  5. Instruction: the extent to which the instructor provides instructions about how to study the subject matter
  6. Comprehension: the extent to which the instructor creates opportunities for questions and remarks regarding the subject matter
  7. Activation: the extent to which the instructor encourages students to actively think about the subject matter

These statements are rated on the scale: Strongly Disagree, Disagree, Somewhat Disagree, Neither Agree Nor Disagree, Somewhat Agree, Agree, and Strongly Agree.

For more clarity on this evaluation process, please visit this PLOS Article Abstract

Knol MH, Dolan CV, Mellenbergh GJ, van der Maas HLJ (2016) Measuring the Quality of University Lectures: Development and Validation of the Instructional Skills Questionnaire (ISQ). PLoS ONE 11(2): e0149163. doi:10.1371/journal.pone.0149163

Title: CWSEI Teaching Practices Inventory

Who has access to the data: Instructor

Delivery method: Instructor reflection

Description:

The instructor completes the Teaching Practices Inventory (TPI) and writes a reflection; students are not directly involved in this process. The questions are geared specifically toward lecture-based science and mathematics courses, and the inventory takes about 10-15 minutes to complete. The acronym CWSEI stands for the Carl Wieman Science Education Initiative at the University of British Columbia.

Title: Classroom Observation Protocol for Undergraduate Students (COPUS)

Who has access to the data: Instructor

Delivery method: In Person

Description:

Instructors observe their classroom in multiple 2-minute increments throughout the session. These observations focus largely on student behavior rather than teacher quality. There are 25 different codes that can be used to describe each observation interval, and the coded observations can then be turned into quantitative data for further and easier assessment. The codes represent themes such as listening, individual thinking, clicker questions, discussion, worksheet group work, other group work, answering, whole-class discussion, predicting, student presentations, tests/quizzes, waiting, lecturing, writing, follow-up, posing questions, moving/guiding, one-on-one help, demonstrations, and administration. A 1.5-hour training session is recommended to learn how to implement the protocol properly.
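The conversion from interval codes to quantitative data amounts to a frequency count over the 2-minute intervals. Here is a minimal sketch of that step; the code abbreviations are loosely COPUS-flavored rather than the official list, and the observation data are invented.

```python
from collections import Counter

# Each 2-minute interval records the set of codes observed in it.
# Invented example: five intervals from one class session.
intervals = [
    {"Lec", "L"},            # lecturing; students listening
    {"Lec", "CQ", "Ind"},    # clicker question; individual thinking
    {"SGW", "MG"},           # small-group work; instructor moving/guiding
    {"SGW", "1o1"},          # small-group work; one-on-one help
    {"FUp", "L"},            # follow-up; students listening
]

# Count how many intervals each code appeared in.
counts = Counter(code for interval in intervals for code in interval)
total = len(intervals)
for code, n in counts.most_common():
    # Report each code as a fraction of all observed intervals.
    print(f"{code}: {n}/{total} intervals ({100 * n / total:.0f}%)")
```

Summaries like "lecturing occurred in 40% of intervals" are the kind of quantitative picture the coded observations make possible.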

Title: Reformed Teaching Observation Protocol (RTOP)

Who has access to the data: Instructor

Delivery method: Paper

Description:

This observation method is advocated for use in classrooms considered 'reformed,' meaning student-centered and activity-based rather than teacher-centered. The RTOP looks at five main items in a reformed classroom: lesson design and implementation, propositional knowledge, procedural knowledge, student-student interaction, and student-teacher interaction. Each of these items, or subscales, is graded on a scale of 0-4. The procedure was originally created for mathematics- and science-based courses.

Example Section:

  1. Instructional strategies and activities respected students' prior knowledge and the preconceptions inherent therein.
  2. The lesson was designed to engage students as members of a learning community.
  3. In this lesson, student exploration preceded formal presentation.
  4. This lesson encouraged students to seek and value alternative modes of investigation or of problem solving.
  5. The focus and direction of the lesson was often determined by ideas originating with students.

Observers rate each item using the following scale: 0 - Never Occurred, 1, 2, 3, 4 - Very Descriptive.

Title: Teaching Dimensions Observation Protocol (TDOP)

Who has access to the data: Instructor and Observers

Delivery method: Paper

Description:

This method is used to examine the dynamics that occur between students, instructors, and technologies within the classroom. It provides descriptions of teaching rather than judgments of teaching quality. The TDOP focuses on six main areas: instructional practices, student-teacher dialogue, instructional technology, potential student cognitive engagement*, pedagogical strategies*, and students' time on task* (the three marked with an asterisk are optional). To use this method: select who will observe the class (usually more than one person), select which of the six areas to focus on, use the codes on the template (a sample is given below), participate in a training on reading and using results, conduct the observations, then analyze and interpret the data.