
“The Case of the Unevaluated Online Courses”*


The story you are about to hear is true. Only the names have been changed to protect the innocent.

This is the city. I work here. I’m a faculty developer. My name is Thursday, Joe Thursday.

It was a Friday. It was raining. I was writing up reports when the provost, Julie Wednesday, came into my office. She looked agitated. She started asking questions.

“What’s this about five of our faculty members not having evaluations for their promotion and tenure reviews? Don’t we require that these reviews take place?”

I responded, “All we know are the facts, ma’am,” and I told her what I knew.

The five faculty members in question were the most innovative on our campus. They had flipped their classrooms, adopted universal design for learning, and now were teaching fully online to reach our adult students with family and work commitments. Their students loved being able to work on courses when they could fit them into their hectic schedules. That’s life in the big city.

The five faculty members were putting together their tenure portfolios, with the usual lineup of artifacts: publications, committee work, and service letters from colleagues.

So, what was wrong? A lack of credible witnesses to their online teaching.

I talked to some students of the five faculty members, but they couldn’t really evaluate online teaching. Sure, the students rated how much interaction took place, and they said whether they were satisfied with the instructors’ communication skills, but they weren’t yet experts in the field. I took their statements: the five faculty members had plenty of student-ratings data. But data only from students wouldn’t hold up under cross-examination.

I had to get to the people who were really responsible and find out why these online courses hadn’t been evaluated. Could it be coincidence, or something more sinister? The process for evaluating teaching was tried and, maybe, true. A department chairperson would sit in the back of a classroom for an hour and then evaluate what he or she had witnessed. In this case, though, the trail had gone cold.

I went to see the department chair, Mickey Tuesday, to get some answers. We go way back, to our service together at a community college.

I walked into Mickey’s office and closed the door. “I’ll lay it right on the line, Mickey,” I said. “There are five faculty members whose online courses haven’t been evaluated. What’s the story?”

Mickey leaned back in his chair, smiled quietly, and said, “You know, Thursday, it’s a simple case. I’ve never taught an online course myself. I know I’m supposed to observe everybody teaching, but I wouldn’t even know what I was looking at in an online course. So I observed those faculty members’ face-to-face courses instead. Open and shut, right?”

Maybe Mickey was right. Maybe it was that simple. I returned to my office. On Monday, I told Wednesday what Tuesday had told me on Friday. After she had heard the story, the provost said, “Thursday, can’t we just show people what to look for in good online teaching?”

She was right. Many campus leaders have never taught online. I could foresee the day when this would be different, but for now, the provost had a point. Department chairs and deans might not have taught online, but they grew up on instant orange juice. Flip a dial—instant entertainment. Press seven digits—instant communication. Turn a key and push a pedal—instant transportation. Flash a card—instant money. Shove in a problem and push a few buttons—instant answers. By establishing a few facts, we can make expert witnesses out of any colleague who observes online teaching.

Fact 1. Know what is admissible as evidence

Many face-to-face teaching practices may not be “teaching behaviors” online. In face-to-face courses, lecturing is a teaching practice. Lecture notes would not be considered in an observation of online teaching—especially if the person who developed the materials is not the person teaching the course. Videos, podcasts, and the like are also course materials and do not “count” as observable teaching behaviors.

However, if an instructor responds to student questions by posting a mini lecture or video to explain a concept, that “counts” as an observed teaching behavior—the content is created or shared as a result of interaction between learners and the instructor. The criterion to apply is one of information presentation versus interaction.

Identify elements of online courses

  • that are always counted as teaching practices (e.g., discussion forums, group-work areas, and feedback on student assignments);
  • that may be counted as teaching practices, depending on structure and interactivity (e.g., supplemental materials, spontaneous “mini lectures,” news/announcement items); and
  • that are never counted as teaching practices (e.g., pre-constructed lecture content, graded tests/quizzes, major course assignments, links to websites, and content created by third parties such as textbook publishers).

Fact 2. Determine the communication between observer and observed

For online courses, an observer must notify the instructor that observation will take place. The instructor may communicate ahead of time about where the observer may wish to focus attention or about anything unique regarding the context of the instruction, especially if there are interactive elements in the online course environment that go beyond the usual places where interaction occurs.

Communication, in the form of clarifying and directional questions, is often beneficial during the online observation period. For example, an observer may want to see supplemental content that is released to students only after they accomplish various course tasks (and that the observer is unable to unlock).

Fact 3. Define who can help an observer

Observers of online courses may not be skilled at navigating the environment or may need technical help in observing online-course elements. Determine where technical assistants should come from (e.g., teaching and learning center staff). Assistants must draw a “bright line”: they answer only process-related questions, leaving decisions about what to observe squarely in the hands of the administrative observers. Define the role of assistants, too. The continuum ranges from

  • fully embedded (the assistant is at the keyboard all the time) to 
  • consultative (the observer is at the computer, and the assistant offers verbal help) to 
  • on call (the assistant is not initially involved and is brought in only by request).

After all these facts came to light, I visited Provost Wednesday on Monday. It can be awkward having a visit from a faculty developer. When I stopped by last week unannounced, the temperature dropped 20 degrees. This was a much warmer conversation. With a little help, Mickey observed and evaluated those five faculty members’ online courses, just in time for their promotion packets to be submitted. We had the evidence we needed. Case closed.

*With apologies to the producers and writers of the Dragnet television series.

Tom Tobin is a researcher, author, and speaker on issues related to quality in higher education. He has been designing and teaching online courses for 20 years, and he consults and publishes on academic integrity, accessibility, copyright, and administrative evaluation of online teaching. His latest book, Evaluating Online Teaching, was published by Jossey-Bass in June 2015.

Tom Tobin will deliver a Magna Online Seminar, “The Academic Leader’s Toolkit for Evaluating Online Teaching,” on December 2, 2015. Register at www.magnapubs.com.
