Parents Allege Bias in Proctoring Technology

By Vaughn Cockayne

When Rafiq Kalam Id-Din II realized how widely universities were using proctoring technology that he considered racially biased and invasive, he knew — as an educator, parent and person of color — he could no longer stay silent.

“This abrupt move to online caught everyone off guard,” Kalam Id-Din, who lives in Brooklyn, N.Y., told Digital Privacy News. “Because in the pursuit of efficiency, people gave up true efficacy and empathy.”

Kalam Id-Din is the founder and managing partner of the Ember Charter Schools for Mindful Education, Innovation and Transformation in Brooklyn.

He is among 2,000 parents who recently joined Fight for the Future, a Boston nonprofit digital-rights activist group, in calling on McGraw Hill to end its relationship with the software maker Proctorio.

In an open letter last month, the group also urged McGraw Hill, an educational publisher that provides a wide range of customized content, software and services, to end its dealings with “all other invasive and racially-biased online proctoring tools permanently.”

Proctorio uses artificial intelligence to conduct gaze detection, which tracks whether students are looking away from their screens during tests. It operates through a browser extension on Google Chrome.

“If the algorithm said the student was cheating — then the student was cheating.”

Lia Holland, Fight for the Future.

The program’s goal is to reduce cheating by recording students’ movements and computer screens during exams. If Proctorio “flags” students, they are kicked off exams and must gain permission to reenter.

McGraw Hill did not respond to requests for comment from Digital Privacy News. Proctorio officials also did not return requests for comment.

Proctorio monitors millions of tests per year and added more than 1,000 schools to its client list last year, according to news reports quoting CEO Mike Olsen.

COVID Increases Use

Proctoring technology, or e-proctoring, came into widespread use last year during the COVID-19 pandemic.

But news reports last year disclosed that students worldwide objected to commercial e-proctoring services. In September, Proctorio sued an employee at the University of British Columbia, alleging breach of copyright regarding faculty training videos.

The worker hit back by suing Proctorio in British Columbia Supreme Court, attacking the company on social media and linking to the materials that were the subject of the litigation.

In addition, students in the U.K. complained that they were forced to wear adult diapers or to urinate in bottles during tests to prevent Proctorio software from flagging them or terminating their assessments.

Disillusioned test-takers have taken to Twitter to attack the company under its own name, “@Procteario.”

“This has been a very hasty rollout of a very experimental and half-baked technology to a population that is not in a position to speak out without risking their education,” Lia Holland, an activist for Fight for the Future, told Digital Privacy News.

Widespread Criticism

The use of video surveillance technology, especially facial-recognition software, remains under fire — with critics also citing alleged bias in proctoring technology. Proctorio often has been criticized by people of color, who argue the software does not recognize them as they take tests.

“There’s this perception that algorithms are correct, that they are superhuman,” Holland said. “When, in reality, they are built by humans — with all our shortcomings and biases embedded in the software.

“So, if the algorithm said the student was cheating — then the student was cheating.”

In this way, the fight against Proctorio is not only a fight against the normalization of surveillance, but a fight against educational inequality as well, activists told Digital Privacy News. 

“Even if there was a world where all the racism and glitches are fixed, Proctorio is not going to stop at test-taking,” Holland said.

“Their plan is to normalize surveillance for all coursework — to film yourself doing your homework or anytime you are engaged in an educational experience.”

“The danger in moving to AI is that it will obscure racism and racially biased perspectives under this guise of something neutral like technology.”

Rafiq Kalam Id-Din II, educator, Brooklyn, N.Y.

Parents and educators also are worried that the move to online education and the normalization of e-proctoring could spur further problems with automation.

“What we are really losing here is accountability,” Kalam Id-Din said. “What are you gonna do? Accuse the computer of being racist?

“The danger in moving to AI is that it will obscure racism and racially biased perspectives under this guise of something neutral like technology.

“People can hold people accountable,” he said.

Growing Divide

Educators and school administrators also have grown more disconnected amid the battle over using e-proctoring technology.

“How many teachers did they talk to that said this was a good idea?” Kalam Id-Din asked. “I suspect nobody who serves the students that I serve — low-income students, Black and brown students in places like Brooklyn — nobody asked us.

“To say that there is a misunderstanding between us is generous,” he observed. “That would assume that they sought to understand.”

Vaughn Cockayne is a Washington writer.