When the LMS goes down, who ends up holding the bag? Faculty

Student assignments are due at 8 AM Friday, but there’s an outage at 2 AM and the system is down. Students show up for class and tell the professor they couldn’t turn in their assignments. Who ends up holding the bag?

I once taught a one-credit class, and I’ve experienced this situation first-hand. Learning Management Systems, A/V systems, university-provided computers, and more are all intended to support faculty teaching. (See also the TPACK model for how technology, pedagogy, and content knowledge combine to create a learning environment.) But when these technology tools fail, the faculty are the ones standing in front of the students. The faculty have to jury-rig another solution or show students what to do.

Let’s connect that to ITIL’s concepts of “utility” and “warranty.” Typically, customers focus on utility: what will this service do? Can the LMS integrate with Turnitin? Can it randomize quiz questions?

IT departments, meanwhile, are normally the ones who worry about warranty: how reliable a service is. I’ve taken to calling it “service quality,” because that term is more accessible to people outside ITIL. Customers barely mention service quality; they assume services will be entirely reliable. Customers don’t tend to understand risk the way IT does, so it can be challenging for IT to convey the value of a high-quality service.

So here’s an opportunity unique to higher education to highlight service quality:

  1. If a service is not reliable enough, faculty will be standing in front of their classrooms, forced to provide IT support.
  2. No one at any institution wants faculty to be doing IT support.

When IT can provide academic IT services at a high enough quality that professors don’t have to run interference, professors can focus on what they were hired to do: teach, connect with students, and (depending on the role) do research and outreach.

This approach can go a long way toward informing IT service design. When designing a service, ask what questions students are likely to bring to faculty about it. Then look for ways to make the system more reliable, more intuitive, or better explained (in that order). For example:

  • A new LMS upgrade causes quizzing to break whenever Flash is updated. In turn, students ask faculty what to do when they can’t take their quiz. How can this be mitigated so that quizzing no longer breaks when Flash updates? (Improved reliability)
  • A new LMS upgrade hides the “quizzing” link somewhere most students won’t look, so faculty have to add instructions to their syllabus on where to click to reach the quiz. Can we make the link easier to find instead? (More intuitive)
  • A new LMS upgrade hides the “submit your answers” button until every answer is filled in. Students who don’t have an answer to a question may not know what to do, and will ask faculty. If we can’t change the functionality, can we at least add a note that says “you must at least guess at every question before submitting your answers”? (Better explanation)

The great thing about this mode of thinking is that you can show exactly the cost of NOT doing these things. It’s often hard to demonstrate the value of service quality, but faculty time is considered a precious commodity. I can’t think of any non-academic administrator who could reasonably trump a faculty concern. It may justifiably be worth $10,000 to mitigate a service quality issue if it means your Nobel Prize-winning faculty member doesn’t have to answer student questions about how the LMS works.

The faculty voice is often missing from university IT governance; there may be a faculty IT committee, but in practice it isn’t asking for new things the way administration does. If you can show faculty the value of quality, though, and show how their advocacy can help IT get the time it needs to design services well, faculty can be fierce supporters.