...if I were a student, I just fundamentally don't think I'd want to be tested by an AI. I understand the author's reasoning, but it just doesn't feel respectful for something that is so high-stakes for the student.
Wouldn't a written exam--or even a digital one, taken in class on school-provided machines--be almost as good?
As long as it's not a hundred-person class or something, you can also have an oral component taken in small groups.
Too bad. The premise should be that the instructor, by nature of having the position, already has understanding of the subject. As a student, you do not, and your goal is to gain it. Prompting an LLM to write a response for you does not build understanding. Therefore you should write unhindered by sophistry machines.
But the instructor is not applying their understanding in any way. By delegating the evaluation to AI, there is zero added value over just asking ChatGPT to evaluate your knowledge and not paying thousands or tens of thousands of dollars in tuition.
And universities wonder why enrollment is dropping.
I'm not intending to say it's acceptable for professors to use AI entirely in their grading. They obviously ought to contribute. I realize I actually misread your original comment, reading "instructor can have AI do his job" as "instructor can have AI help do his job." Sorry about that. Point being, I think the expectation of real human thought ought to hold for both teacher and student.
A written exam is problematic if you want the students to demonstrate mastery of the content of their own project. It's also problematic if the course is essentially about using tools well. Bringing those tools into the exam without letting in LLMs is very hard.
I don't entirely disagree, but all exams are problematic. We don't have the technology to look into a person's mind and see what they know. An exam is an imperfect data point.
Ask the student to come to the exam and write something new, which is similar to what they've been working on at home but not the same. You can even let them bring what they've done at home for reference, which will help if they actually understand what they've produced to date.
Why is it disrespectful? It is just a task. And it is almost an arms race between students and profs. It always has been (smuggling written notes into the exam, etc.).
The student has a lot riding on the outcome of their exam. The teacher is making a black box of nondeterministic matrix multiplication at least partially responsible for that outcome. Sure, the AI isn't the one grading, but it is deciding which questions and follow-up questions to ask.
Let me ask, how do you generally feel when you contact customer service about something and you get an AI chatbot? Now imagine the chatbot is responsible for whether you pass the course.
Talking to a disembodied, inhuman voice can be disconcerting and produce anxiety in a way that wouldn't be true when communicating with a live human instructor.
Adding this as an additional optional tool, though, is an excellent idea.
Unless class sizes are astronomical, it's absurd to pay US tuition all to have a lazy professor who automates even the most human components of the education you're getting for that price.
If the class cost me $50? Then sure, use Dr. Slop to examine my knowledge. But this professor's school charges students $90,000 a year and over $200k for an MBA? Hell no!
The certificate is the value as long as everyone trusts that it actually certifies what it claims to certify. If a diploma can be had for prompting ChatGPT or Gemini a couple dozen times a year, trust in what it certifies should be rapidly eroding, and universities should be scared, because what you suggest is actually rational.
If I were a professor, I don't think I'd want students submitting AI-generated work. Yet, here we are.
Students had and still have the option to collectively choose not to use AI to cheat. We can go back to written work at any time. And yet they continue to use it. Curious.
Students could absolutely organize a consensus decision to not use AI. People do this all the time. How do you think human organizations continue to exist?
Ah yes, collective punishment. Exactly what we should be endeavouring for our professors to do: see the student as an enemy to be disciplined, not a mind to be nurtured.
I know we have 2000 years and counting of historical record of people saying this, but I suspect the future is well and truly bleak. Not because of the next generation of students, but because of the current generation of educators, who are unable to adapt to new challenges in a way that actually benefits the students it is supposed to be their duty to teach.
The subject is "AI exams", not "exams". GGP expressed that they believe that AI exams would be an extremely unpleasant experience to have your future determined by, something I find myself in agreement with. GP implied that students deserve this even though it's unpleasant because of their actions, in other words they agree that this is unpleasant but are okay with it because this is punishment for AI cheating. (And which is being applied to all students regardless of whether they cheated, hence the "collective" aspect of the punishment.)