this post was submitted on 13 Aug 2023
Technology

College professors are going back to paper exams and handwritten essays to fight students using ChatGPT

The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

[–] [email protected] 3 points 1 year ago (1 children)

OK Mr Socrates, how else would you assess whether a student has learned something?

[–] [email protected] 2 points 1 year ago

Ha ... well, if I had answers I probably wouldn't be here! But seriously, I do think this is a tough topic with lots of tangled threads linked to how our society functions. I'm not sure there are any easy "fixes", and I don't think anyone who claims there are can really be trusted. It may very well turn out that I'm completely wrong and there is no "better way": something flawed and problematic may just be what humanity needs.

A pretty minor example, based on the whole thing of returning to paper exams: what happens when you start forcing students to be judged on their ability to do something alone, when they know very well that they could do better with an AI assistant? What happens at a psychological and cultural level? I don't know. I'm not sure my generation (Xennial) or any earlier one ever faced that. Even with calculators and arithmetic, it was always about avoiding tedium, handling big numbers that were impossible for normal humans, or ensuring accuracy. AI may not be at that level yet for many exams and students (I really don't know), but it might be, or might be soon. However valuable it is to force students to learn to do the task without the AI, there's gotta be some broad cultural effect in just ignoring the super useful machine.

Otherwise, my general idea would be to emphasise longer-form work (which AI is not terribly useful for): work that requires creativity, thinking, planning, coherent understanding, and human-to-human communication and collaboration. So research projects, actual practical work, debates, teaching as a form of assessment, etc. In many ways, "having learned something" becomes just a baseline expectation. Exams, for instance, may still hold lots of value, not as objective assessments but as a way of calibrating where you're up to on the basic requirements before the real "assessment" starts, and what you still need to work on.

Also ... "OK Mr Socrates" ... is maybe not the most polite way of engaging here. It comes off as somewhat aggressive, TBH.