Incorporating Student Feedback into Course (Re)Design

Student ratings of teaching are a contested ritual, not least because they appear vulnerable to gender bias and low response rates, as a recent column in the Chronicle of Higher Education reported. Even so, the narrative feedback from students who do complete course evaluations can offer useful points of departure for reflection and revision.

Recently, in my capacity as an instructional design consultant at my previous institution, I met with an instructor who had moved his flipped classroom into an entirely online course. He had already recorded many of his lectures to post as viewing material in advance of in-class discussions (and reported success with that flipped instruction model), so he was thrilled to reuse them in the online version of the course. At the end of the academic term, however, he shared this feedback, which he had received from several students on his course evaluations:

More activities, fewer lectures.

After the success of these lectures in his flipped classroom, he was disappointed by this critique. Fortunately, we were able to work from this feedback to take actionable steps toward improving delivery of his online content.

Here are three steps to incorporate feedback from student ratings of teaching into the online components of a course design:

  1. Do not take negative criticism personally. This isn’t so much a step as it is a suggested frame of mind for getting started. It is also how faculty recognized for excellence in teaching commonly interpret and respond to their own student ratings of teaching.
  2. Unpack the feedback. This involves reading the narrative responses and taking stock of what students are saying. According to them, what worked? What didn’t work? Run unconstructive criticism through a mental filter that looks at critical statements from a new angle. For example, a common point of feedback for courses with online components includes some variation on this statement: “The class seemed disorganized.” Instead of rejecting this feedback as unhelpful, think about some of the ways that organization happens online. Organization has to do with the way that information is labeled and located. It also has to do with the way students are meant to navigate a course. If modules are in use, they should follow a consistent pathway with a clear beginning, middle, and end. Whatever the logic of organization, it should be communicated to students, even if it seems self-evident to the instructor.

Another part of unpacking feedback is looking for patterns. Are there multiple references to aspects of course organization, for example? If so, consider revisiting the course with an eye toward updating the organization of important information, learning modules, and resources. Our learning management systems offer robust options for housing pages, modules, grades, and other resources, but it is easy to get lost in both their vocabulary and the many navigation options in an open menu bar. If multiple student responses refer to difficulties locating materials or instructions, this is a clue that the course could benefit from better organization.

By unpacking the feedback, responses like “The class seemed disorganized” can be reframed as something more helpful: information needs to be clearly labeled, logically located, consistently organized, and explained to students.

  3. Make a plan for revision. As reflective practitioners, we are always looking for ways to enhance the elements of a course that are going well and to make improvements when we identify challenges. At CSUCI, there are great resources for instructors seeking to do either or both of those things.
  • TLI’s Tool Buffet is organized by rhetorical objectives, like humanizing the online learning experience, communicating and interacting, and creating well-designed content.
  • Course Review is a great way to get detailed feedback from a supportive faculty colleague who will visit your course and evaluate it using the QLT Core 24 rubric, offering a checklist with instructions for implementing suggested updates to the course.
  • Schedule a consultation with TLI support staff, who will happily work with you directly to help you meet your instructional design goals.

These are just a few options for getting started, and it was this process that helped my faculty colleague address his instructional design goals.

Here is how the process worked for my faculty colleague. First, recall the student ratings of teaching (SRT) feedback: More activities, fewer lectures.

  1. Do not take the criticism personally. After all, the evaluation form asked what students would like to see improved in the course.
  2. Unpack the feedback. Students might engage more readily with the same content if it were presented through multiple modes of representation, one of the pillars of Universal Design for Learning (UDL). This was something the instructor had been able to facilitate during the physical meeting time of his face-to-face course but that seemed to be missing from the fully online one.
  3. Make a plan for revision. We worked together to locate alternative content-delivery options (there are many possibilities with OERs!) and to leverage tools and resources in the learning management system that give students more opportunities to engage with the content, the instructor, and each other.

Teaching online for the first time can feel like being a first-year teacher again, because the experience, for students and instructors alike, is so different from the face-to-face classroom. While student ratings of teaching are only one limited access point for reflecting on our students’ needs, reading them constructively can yield actionable ways to continuously improve our online courses.
