I think that's actually a good idea? Sucks for e-learning as a whole, but I always found online exams (and also online interviews) to be very easy to game.
Really sucks for people with disabilities and handwriting issues.
It's always sucked for them, and it always will. That's why we make accommodations for them, like extra time or a smaller/more private exam hall.
And readers/scribes! I’ve read and scribed for a friend who had dyslexia in one of her exams and it worked really well. She finished the exam with time to spare and got a distinction in the subject!
Yep, my girlfriend acted as a scribe for disabled students at a university. She loved it, and the students were able to complete their written work and courses just fine as a result.
My handwriting has always been terrible. It was a big issue in school until I was able to turn in printed assignments.
Like with a lot of school stuff, they do something shitty without thinking about the negative effects. They always want a simple solution to a complex problem.
My uni just had people with handwriting issues do the exam in a separate room with a scribe they could narrate their answers to.
People were going to universities for centuries before the advent of computers; we have lots of ways to help people with disabilities that don't require computers.
Prof here - take a look at it from our side.
Our job is to evaluate YOUR ability, and AI is a great way to mask poor ability. We have no way to determine whether you did the work or an AI did, and if called into court to certify your expertise, we could not do so beyond a reasonable doubt.
I'm not arguing exams are perfect, mind, but I'd rather doubt a few students' apparent inability (maybe it was just a bad exam for them) than always doubt their ability (is any of this their own work?).
Case in point: ALL students on my course with low (<60%) attendance this year scored 70s and 80s on the coursework and 10s and 20s in the OPEN BOOK exam. I doubt those 70s and 80s are real reflections of those students' ability, but they do suggest the students can obfuscate AI work well.
They're about to find out that gen Z has horrible penmanship.
Millennial here, haven't had to seriously write out anything consistently in decades at this point. There's no way their handwriting can be worse than mine and still be legible lol.
As a millennial with gen Z teens, theirs is worse, though somehow not illegible, lol. They just write like literal 6-year-olds.
"…has led some college professors to reconsider their lesson plans for the upcoming fall semester."
I'm sure they'll write exams that actually require understanding the material rather than regurgitating the seminar PowerPoint presentations as accurately as possible...
No? I'm shocked!
We get in trouble if we fail everyone because we made them do a novel synthesis, instead of just repeating what we told them.
Particularly for an intro course, remembering what you were told is good enough.
The first step to understanding the material is simply remembering what the teacher told them.
There are places where analog exams went away? I'd say Sweden has always been at the forefront of technology, but our exams were always pen-and-paper.
Am I wrong in thinking students can still generate an essay and then copy it out by hand?
Not during class. Most likely a proctored exam. No laptops, no phones, teacher or proctor watching.
This isn't exactly novel. Some professors allow a cheat sheet. But that just means that the exam will be harder.
A physics exam that allows a cheat sheet asks you to derive the law of gravity. Well, OK, you write the answer at the bottom, pulled from your cheat sheet. Now what? If you recall how it was originally arrived at, you probably write Newton's three laws at the top of your paper... and then start doing some math.
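Not the only route, but one classic sketch of that derivation (my example, not from the thread): take Kepler's third law as given on the cheat sheet and combine it with the centripetal force formula, and the inverse-square form falls out.

```latex
% Sketch, assuming a circular orbit of radius r and period T,
% with Kepler's third law T^2 = k r^3 taken as given:
F = \frac{m v^2}{r}, \qquad v = \frac{2\pi r}{T}
\quad\Rightarrow\quad
F = \frac{4\pi^2 m r}{T^2}
  = \frac{4\pi^2 m r}{k r^3}
  = \frac{4\pi^2}{k}\,\frac{m}{r^2}
\;\propto\; \frac{m}{r^2}.
```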
A calculus exam that lets you use Wolfram Alpha? Just a really hard exam where you must show all of your work.
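A toy illustration of that difference (again my example): Wolfram Alpha hands you the right-hand side instantly, but "show all of your work" means writing the integration-by-parts step that produces it.

```latex
% Choosing u = x and dv = e^x dx in integration by parts:
\int x e^x \, dx
  = x e^x - \int e^x \, dx
  = (x - 1)\, e^x + C.
```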
Now, with ChatGPT, a take-home essay is no longer enough to force students to engage with the material, so you find new ways to do so. Written, in-person essays are certainly one way to do that.
When I was in college for computer programming (about 6 years ago), I had to write all my exams on paper, including code. This isn't exactly a new development.
So what you’re telling me is that written tests have, in fact, existed before?
What are you some kind of education historian?
You can still have AI write the paper and then copy it out by hand. If anything, this will make AI harder to detect, because now it's AI plus the human error introduced during transcription rather than a straight copy-and-paste.
This thinking just feels like moving in the wrong direction. As an elementary teacher, I know that by next year all my assessments need to be practical or interview-based. LLMs are here to stay, and the quicker we learn to work with them, the better off students will be.
And forget about having any sort of integrity or explaining to kids why it's important for them to know how to do shit themselves instead of being wholly dependent on corporate proprietary software whose accessibility can and will be manipulated to serve the ruling class on a whim 🤦
It's insane talking to people that don't do math.
You ask them any mundane question and they just shrug, and if you press them they pull out their phone to check.
It's important that we do math so that we develop a sense of numeracy. By the same token it's important that we write because it teaches us to organize our thoughts and communicate.
These tools will destroy the quality of education for the students who need it the most if we don't figure out how to rein in their use.
If you want to plug your quarterly data into GPT to generate a projection report I couldn't care less. But for your 8th grade paper on black holes, write it your damn self.
Can we just go back to calling this shit Algorithms and stop pretending it's actually Artificial Intelligence?
It actually is artificial intelligence. What are you even arguing against, man?
Machine learning is a subset of AI, and neural networks are a subset of machine learning. Saying an LLM (based on neural networks for prediction) isn't AI because you don't like it is like saying rock and roll isn't music.
But then the investors won't throw wads of money at these fancy tech companies.
Well, if I go back to school now I'm fucked. I can't read my own handwriting.
As someone with wrist and hand problems that make writing a lot by hand difficult, I'm so lucky I finished college in 2019.
Wouldn't it make more sense to find ways to utilize AI as a tool and set up criteria that incorporate its use?
There could still be classes / lectures that cover the more classical methods, but I remember being told "you won't have a calculator in your pocket".
My point is, they should be prepping students with the skills to succeed with the tools they'll have available, and then give them the education to cover the gaps that AI can't solve. For example, you basically need to review what the AI outputs for accuracy. So maybe a focus on reviewing output and better prompting techniques? Training on how to spot inaccuracies? Spotting possible bias in a system skewed by its training data?
That's just what we tell kids so they'll learn to do basic math on their own. Otherwise you'll end up with people who can't even do 13+24 without having to use a calculator.
Training how to use "AI" (LLMs demonstrably possess zero actual reasoning ability) feels like it should be a separate pursuit from (or subset of) general education to me. In order to effectively use "AI", you need to be able to evaluate its output and reason for yourself whether it makes any sense or simply bears a statistical resemblance to human language. Doing that requires solid critical reasoning skills, which you can only develop by engaging personally with countless unique problems over the course of years and working them out for yourself. Even prior to the rise of ChatGPT and its ilk, there was emerging research showing diminishing reasoning skills in children.
Without some means of forcing students to engage cognitively, there's little point in education. Pen and paper seems like a pretty cheap way to get that done.
I'm all for tech and using the tools available, but without a solid educational foundation (formal or not), I fear we'll end up a society of snake-oil users in search of blinker fluid.
ChatGPT: answer this question, add 4 consistent typos. Then hand-transcribe it.
Might as well go back to oral exams and ask the student questions on the spot.