Once when I was in college we had an assignment to do in small groups. A series of documents had to be produced in steps, each based on the output of the previous one.
Around the middle of the process our group realised we had made an error. But redoing and recomposing all the previous outputs would have taken more time than we had left, so we just corrected the error in the remaining steps, and figured that a lower grade on the earlier outputs was the best we could achieve by the deadline.
The teacher wanted to give us a failing grade because, as she or he said (funny that I no longer remember which), step N+1 couldn't follow from step N, so they were sure we had copied our assignment from some other group. They said they had compared our work with everyone else's and hadn't found where we copied from, and that the assignment was new, so no group in the past had ever seen it, but they were still sure we had copied it from someone.
We managed to convince them that we hadn't copied anything, and that even if the first steps were wrong, we couldn't be charged with copying without being told what on earth we had copied from. So we finally got a low but passing grade.
In hindsight, maybe it would have been better to add a note saying we didn't have time to correct all the previous steps. It always depends on the particular teacher's mood.
But now it's automated. A dumbass teacher in Texas gave failing grades to half his class because ChatGPT falsely told him their assignments could be AI-generated. Then others showed that ChatGPT would say parts of the teacher's own dissertation could also be AI-generated (avant la lettre), and even that the teacher's letter condemning the AI cheating could itself be flagged as AI-generated.
It'd be funny if it weren't being applied to emergency rooms and healthcare. It's still absurd.