Online Proctoring: An Insider's View

An interview with David Smetters, CEO of Respondus

An excerpt of this interview was originally published by University Business, Feb. 2017.

Online proctoring has taken off over the last 5 years. What are the main issues it attempts to solve?
It's really three things. First, and foremost, it's trying to prevent students from cheating during an assessment. Second, it's about protecting the exam content, so students can't copy and share exam questions with others. And third, it helps educators get a true snapshot of where students are in their learning.

By snapshot you mean it provides a way to grade students on their work?
Sure, but it's more than that. For example, with adaptive learning, it's important that a student master a concept or skill before moving to the next level… The other day I was talking with an administrator of a large distance learning program. As part of their onboarding process, students take an online assessment to determine placement for their math requirement. The problem is that a large number of students cheat during this assessment, which currently uses no proctoring component. These students end up getting placed in a math course that's too advanced for them. And, then, as you might expect, they perform poorly in those classes, sometimes dropping them entirely. So, in this case, online proctoring can help ensure that a student gets into the right level of math. It allows the student to be successful.

What percentage of students cheat during online exams?
The research varies on this point because different factors contribute to cheating. For example, a student is more likely to cheat as the importance of the exam increases, and less likely to cheat on lower-stakes exams. Social factors also contribute, such as whether a student is aware that other students in the class are cheating and getting away with it… But broadly speaking, it's estimated that about half of students will attempt to cheat during online assessments if there's no proctoring or similar safeguard in place.

And what percentage of students attempt to cheat when the online exam is being proctored?
Again, various things affect this, but most estimates put it in the 3 to 5 percent range.

That's a significant difference, dropping from 50% to 5%...
It really is. And it speaks to the deterrent effect that proctoring provides. It doesn't matter if the student is proctored in a classroom or online, the result is about the same. We consistently hear that when an online exam is administered without any type of proctoring, the average class score is about 10-15 percentage points higher than when a proctoring component is added. More importantly, the scores for online exams that have proctoring are similar to the scores for exams taken in a classroom. That's what accrediting agencies want to see, and administrators are starting to understand that.

There are still a lot of online exams taken on campus, right? It's not all occurring off campus.
Oh, sure.  A significant number of [online] exams are taken on campus -- in classrooms, in testing centers. And when you move down to the K-12 level, nearly all online exams are delivered within the school building.

Are online proctoring services needed when exams are taken on campus?
At Respondus, we divide online testing into two segments: online exams that are delivered in proctored environments such as testing centers and classrooms, and, secondly, online exams taken in non-proctored environments, such as from a student's home. For proctored environments we offer LockDown Browser, which locks down the computer or device that a student uses to take an exam. Students cannot go to a different URL, access other applications, print, copy text, open a new tab in the browser to search for answers, and so on.

If [LockDown Browser] is used in proctored environments, the proctor doesn't have to worry about the student cheating on the computer they're using. Is that it?
Right. The proctor mainly has to ensure that students aren't accessing a second device, or looking at another student's screen, using a cheat sheet, things like that…

And for exams delivered in non-proctored environments, we offer Respondus Monitor. Respondus Monitor uses the LockDown Browser technology as a starting point, but additionally has students record themselves with a webcam during the exam. The recordings are then available to the instructor, along with the automated flagging of events and other data.

So Respondus Monitor is entirely automated?
Right. It integrates seamlessly with the LMS's assessment engine. If the exam settings require students to use Respondus Monitor, it guides them through the process of using the webcam. And for the instructors, once the exam session is complete, everything is available to them from within the LMS --  the videos, the flagging, information about the exam session, all of that.

This differs from other online proctoring services where an employee is watching the student with a webcam during the exam. Right?
Yes, that's a different business model. The live proctoring services use humans to do the work, whereas we automate everything with technology. It's like travel agents versus Expedia. They each have their place.

Are there situations where live proctoring is more appropriate than an automated proctoring system, and vice versa?
Sure. If you have a high-stakes certification exam and a student pulls out a camera to steal the exam questions, you'd want the ability to shut that exam down immediately. In that scenario a live proctoring solution would be the better choice because it can be extremely costly to replace questions for a high-stakes certification exam.

Our solution, Respondus Monitor, is fully automated and is intended for the university environment. The majority of exams in higher ed don't require the immediacy of live proctoring. If a student leaves the computer in the middle of the exam, an instructor is usually fine with learning about that after the exam is complete.

I assume there's a price difference between live proctoring and automated proctoring.
Yes. Live proctoring generally runs $20-35 per exam. It doesn't scale from a cost standpoint, which is why you don't see wide adoption of this model across a campus. It's usually a handful of instructors or courses at an institution that use it.

How is your automated proctoring system priced?
We have transparent pricing, so it's actually listed on our website [www.respondus.com]. Respondus Monitor is roughly $4 per user for the first 1000 seats, then $2 per seat thereafter. A seat is defined as one student per course. There are no per exam fees, which means that the cost is the same if the instructor uses it three times during the course or 15 times… The average instructor uses Respondus Monitor about 6.5 times per course, so if you calculate it on a per exam basis, it works out to about 30 cents an exam.
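For readers who want to check that arithmetic, here is a minimal sketch in Python. The tiered seat prices ($4, then $2 after 1,000 seats) and the 6.5-exams-per-course average come from the interview; the 10,000-seat enrollment is a hypothetical example, not a Respondus figure.

```python
def monitor_cost_per_exam(seats, exams_per_course=6.5):
    """Rough per-exam cost of Respondus Monitor under the pricing quoted in
    the interview: ~$4/seat for the first 1,000 seats, ~$2/seat thereafter,
    with no per-exam fees. A seat is one student in one course."""
    tier1 = min(seats, 1000) * 4.00        # first 1,000 seats at ~$4
    tier2 = max(seats - 1000, 0) * 2.00    # remaining seats at ~$2
    per_seat = (tier1 + tier2) / seats
    return per_seat / exams_per_course     # spread the seat cost across exams

# Hypothetical example: an institution with 10,000 seats
print(round(monitor_cost_per_exam(10_000), 2))   # ~0.34, i.e. roughly 30 cents
```

At large volumes the $2 tier dominates, so the per-exam cost approaches $2 / 6.5, or about 31 cents, which is where the "about 30 cents an exam" figure comes from.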

That's more of a price difference than I would have guessed. So 30 cents [per exam] for a fully automated proctoring solution versus $25 for live proctoring?
Yeah, it's kind of crazy when you do the math. Automated proctoring can be 50 times less expensive than live proctoring. And as I mentioned earlier, any good proctoring system reduces attempted cheating to about the 3 to 5% range. So the institution needs to decide what it's willing to pay to reduce that rate to, say, 2% or 1%. For some situations, it may be worth 50 times the cost. But for the large majority of higher education exams, it's generally not worth the cost.

Does it bother you that even 3 to 5% of students might be cheating their way through college?
I don't think that's the right way to look at it. First of all, there are many disciplines where one or more high-stakes exams are required before you get the certification. Nursing, accounting, law, and so on. Entrance exams for graduate studies also prevent students from progressing if they don't know their stuff. High-stakes exams are generally conducted in person, or at a commercial testing center. It's very unlikely that a student can cheat in those settings…

There are two reasons that Respondus Monitor is such a strong deterrent to cheating. First, a student knows they might get kicked out of the university if they are caught cheating. It's a pretty stiff penalty.  Secondly, students know they can get caught long after the exam was actually administered. If a student is suspected of cheating on the third exam of a course, the instructor can go back to the first two exams to see if similar behavior occurred. And if the institution gets involved, they can examine testing sessions from other courses, even those that a student took a year earlier… It's similar to how athletes can get caught using performance enhancing drugs many years after the fact.

How long does Respondus store the videos from the exam sessions?
Up to five years. That's the default. We store the videos themselves, the flagging information, and other data about the exam session. The retention policy is set by the institution, so it might be shorter at some universities.

Speaking of 5 years, where do you think online exam proctoring will be in that timeframe?
It's hard to make five year predictions with technology. But things move slowly in higher education, so I'll bite. [Laughs.]  I don't know if it will be 5 years or 10 years, but there's little doubt in my mind that automated proctoring will be ubiquitous on campuses at some point, much like anti-plagiarism software is today.

Like Turnitin?
Right. The price point for automated online proctoring is already low enough to enable widespread adoption. I mentioned earlier that we have the cost down to about 30 cents an exam session [for Respondus Monitor], but we hope to push that even lower over time. One of the things holding back widespread adoption [of automated proctoring] at universities is budget politics. Proctoring technology is generally seen as a distance learning or online learning expense. Sometimes individual departments will pick up the cost, such as a nursing or business school. But at 30 cents an exam, an institution actually saves money compared to a paper-based exam delivered on campus… The Scantron sheet might be 15 cents, the printed exam might be another 30 cents, there are costs for scanning the answer sheets, storing the printed materials after the exam, costs for the classroom where the exam is delivered, proctoring costs, and so on. The total cost [for paper-based exams] is much more than 30 cents… Testing centers aren't the answer either because they are quite expensive to set up and run.
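To see how the paper-based total climbs well past 30 cents, here is a back-of-the-envelope sketch. Only the Scantron (~15 cents) and printing (~30 cents) figures come from the interview; every other line item is an assumed placeholder for a cost category he mentions without a dollar amount.

```python
# Hypothetical per-student cost of one paper-based, in-class exam.
paper_exam_costs = {
    "scantron_sheet": 0.15,  # quoted in the interview
    "printed_exam":   0.30,  # quoted in the interview
    "scanning":       0.10,  # assumed
    "storage":        0.05,  # assumed
    "classroom":      0.25,  # assumed share of room cost
    "proctoring":     0.50,  # assumed share of proctor time
}

paper_total = sum(paper_exam_costs.values())
automated = 0.30  # the ~30-cents-per-exam figure quoted for Respondus Monitor

print(f"paper-based exam: ${paper_total:.2f}")              # ~$1.35 with these assumptions
print(f"automated exam:   ${automated:.2f}")
print(f"per-exam saving:  ${paper_total - automated:.2f}")  # ~$1.05
```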

How much of a cost savings are we talking about?
The savings [for automated proctoring] is very small for a class of 25 students. It might save only tens of dollars. But the savings can be hundreds of dollars for larger courses, and tens of thousands of dollars across departments. And when you look at the cost savings across the entire university, it can be hundreds of thousands of dollars annually -- even more if certain overhead costs are included. But as long as the expense for online proctoring is borne by the distance learning or online learning groups, they will ration the service. And without an online proctoring solution widely available to instructors, they won't use online testing. They'll be too concerned about cheating.
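The jump from tens of dollars to hundreds of thousands is the same per-exam saving multiplied across more seats. A minimal sketch, where the class, department, and university sizes are all hypothetical, and the net saving per exam is a deliberately conservative assumption (using the ~$1.05 from the previous sketch would simply scale everything up):

```python
# Hypothetical scaling of savings; none of these sizes appear in the
# interview, and the net saving per exam is an assumption.
net_saving_per_exam = 0.30   # assumed net saving vs. a paper-based exam
exams_per_course = 6.5       # average usage quoted in the interview

for label, seats in [("class of 25 students", 25),
                     ("large course", 400),
                     ("department", 10_000),
                     ("whole university", 150_000)]:
    annual_saving = seats * exams_per_course * net_saving_per_exam
    print(f"{label:22s} ~${annual_saving:,.0f} per year")
```

With these assumptions the outputs land at roughly $49, $780, $19,500, and $292,500, in line with the tens, hundreds, tens of thousands, and hundreds of thousands of dollars he cites.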

You mention rationing. Do you mean that universities will limit instructor use of online proctoring?
Yes, that's exactly what happens today. They ration it because the cost is coming from their budget. And on the flip side, the cost savings from paper, Scantron sheets, and so on, often doesn't get credited to them… Once administrators see that fully-automated proctoring solutions like Respondus Monitor actually reduce testing costs across campus, I think the adoption rate will take off.

Are you seeing signs of that yet?
Only hints of it. I was talking recently with a professor at a large university, and she said her department strongly encourages the use of online testing if a class has more than 80 students. It's openly stated as a cost-savings measure. So if this type of directive occurs more broadly across campuses, there will be greater adoption of online proctoring.

What is the market potential for online proctoring in higher education? I've seen numbers in the hundreds of millions [of dollars].
It's difficult to offer market projections because live proctoring is currently cherry-picking courses where proctoring is an absolute requirement, such as health sciences and business schools. There's less price sensitivity within this group. But you can't extrapolate from those numbers because other departments don't have the same proctoring requirements, and they won't pay $25 per exam… Live proctoring is simply too expensive for widespread adoption across a campus. If a university has 10,000 courses, the cost to provide live proctoring across those courses would be millions of dollars annually… We believe universities will opt for a fully automated solution, and only use live proctoring for unique situations. And we think the average cost to a university for automated proctoring will be below $20,000 annually. That puts the market potential in the $50 million range. And even that will take many years to achieve.
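The $50 million estimate is straightforward multiplication. A rough sketch, where the institution count and the per-course figures for the live-proctoring comparison are hypothetical; only the ~$20,000 average annual cost, the 10,000-course example, and the $25 live-proctoring fee come from the interview.

```python
# Implied size of the automated-proctoring market in higher ed.
avg_annual_cost_per_university = 20_000   # upper bound quoted in the interview
institutions = 2_500                      # assumed count needed to reach ~$50M
print(f"implied market: ${avg_annual_cost_per_university * institutions / 1e6:.0f}M")

# Why campus-wide live proctoring doesn't pencil out: 10,000 courses (quoted),
# with assumed averages of 30 students and 2 proctored exams per course.
courses, students_per_course, exams_per_course, live_fee = 10_000, 30, 2, 25
annual = courses * students_per_course * exams_per_course * live_fee
print(f"campus-wide live proctoring: ${annual / 1e6:.0f}M per year")   # ~$15M
```

Even with fairly conservative per-course assumptions, a campus-wide live-proctoring bill runs well into the millions per year, which is the scaling problem he describes.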

Can you provide a sense of how widely your system is being used at educational institutions? I'm not after revenue numbers, more about how many institutions are using it, or how many students use it for exams…
We don't share too much detail on that for competitive reasons, but over 1,000 academic institutions have an enterprise-wide license for LockDown Browser. About one-third of those use Respondus Monitor [the automated proctoring system] with it…  In connection with the LMS market, about 35 million LockDown Browser sessions will occur this year. If you include the publishing companies that license [LockDown Browser] for their homework and assessment systems, that number is in the 50 million range… The percentage of LockDown Browser sessions that additionally use Respondus Monitor is still relatively small, but it's the fastest growing segment for us… These might sound like decent usage numbers, but when you realize that a large university conducts over one million assessments a year, you get a sense of how young this market is.
