British Grading Debacle Shows Pitfalls of Automating Government
The outcome, experts say, was entirely predictable. In fact, the Royal Statistical Society had for months warned the test administration agency, Ofqual, that the model was flawed.

“It’s government trying to emulate Silicon Valley,” said Christiaan van Veen, director of the digital welfare state and human rights project at New York University. “But the public sector is completely different from private companies.”

As an investigator for the United Nations, Mr. van Veen studies how Britain and other countries use computers to automate social services. He said the techniques were being applied to policing and court sentencing, health care, immigration, social welfare and more. “There are no areas of government that are exempt from this trend,” he said.

Britain has been particularly aggressive in adopting new technology in government, often with mixed results. Earlier this month, the government said it would stop using an algorithm for weighing visa applications after facing a legal complaint that this was discriminatory. A few days later, a British court ruled against the use of some facial-recognition software by the police.

The country’s automated welfare system, Universal Credit, has faced years of criticism, including from the United Nations, for making it harder for some citizens to obtain unemployment benefits. Britain’s contact-tracing app, which the government had said would be key to containing the coronavirus, has been delayed by technical problems.

“There is an idea that if it has an algorithm attached to it, it’s novel and interesting and different and innovative, without understanding what those things could be doing,” said Rachel Coldicutt, a technology policy expert in London who is working on a book about responsible innovation.