From the article: “There’s no AI [written decisions] that are going out without having human interaction and that human review,” Sewell said. “We can get decisions out quicker so that it actually helps the claimant.”
You absolutely know that this will turn into "Computer says no." We've seen time and time again that people just accept the system's outputs. See Horizon, Robodebt, and other scandals for details. This will just be the next one.

@CatherineFlick OK there is a human review but ...

"Any lack of accuracy concerns the lawyers with Nevada Legal Services. If the AI appeals system generates a hallucination that influences a referee’s decision, it not only means the decision could be wrong it could also undermine the claimant’s ability to appeal that wrong decision in a civil court case.

“In cases that involve questions of fact, the district court cannot substitute its own judgment for the judgment of the appeal referee,” said Elizabeth Carmona, a senior attorney with Nevada Legal Services, so if a referee makes a decision based on a hallucinated fact, a court may not be able to overturn it.

In a system where a generative AI model issues recommendations that are then reviewed and edited by a human, it could be difficult for state officials or a court to pinpoint where and why an error originated, said Matthew Dahl, a Yale University doctoral student who co-authored the study on accuracy in legal research AI systems. “These models are so complex that it’s not easy to take a snapshot of their decision making at a particular point in time so you can interrogate it later.”"

[from the article]