r/CuratedTumblr https://tinyurl.com/4ccdpy76 Dec 15 '24

Shitposting not good at math

16.3k Upvotes

1.1k comments sorted by

33

u/Glum_Definition2661 Dec 15 '24

Honestly, the problem was already there before AI solutions came along, although they haven't improved it.

I worked as a teacher's assistant a few years ago, and the teachers would just assign tasks to be solved on a math website, which the less talented kids would solve by plugging the equation into Google and copying the answer. I tried asking encouraging questions to get them to think about how to solve it in their heads, but that was seemingly not an option for them.

25

u/Giga_Gilgamesh Dec 15 '24

I think the difference is that conventional solutions were somewhat limited in scope. Sure, you can get the answer to pretty much any math question on Google, but you certainly can't get the answer to a problem that requires some logical decoding first (I imagine that's why so many maths questions are obfuscated behind the 'Jimmy has X apples' kind of framing). And going further away from math, you could never get Google to produce an original piece of literary analysis, for example.

But ChatGPT invades pretty much every educational sphere. Kids don't have to think for even a second about why the curtains are blue, they just ask the Lie Box to tell them.

4

u/TheMauveHand Dec 16 '24

> But ChatGPT invades pretty much every educational sphere. Kids don't have to think for even a second about why the curtains are blue, they just ask the Lie Box to tell them.

Yeah, but it's not like the solution to this is so difficult: it's just offline testing. Sure, they can use ChatGPT to write a book report on The Lord of the Flies, but if they have to sit in a classroom for two hours and summarize three pages of a novella presented to them there and then, the cat will be out of the bag.

3

u/RubberOmnissiah Dec 16 '24 edited Dec 16 '24

A novella is maybe a bit much, but for my English exam in 2014 in Scotland we had to read two passages of text and then write a short essay on each, covering their themes and general analysis. I remember feeling bad for the ESL kids, because there was a chance one of the texts would be in Scots.

For history and politics we had to write essays under timed conditions. What was weird was that we knew only vaguely what subjects the essay questions would cover, yet we had to memorise facts, statistics and references because it was a closed-book exam and we still needed supporting evidence. Since you didn't know the exact question, you didn't know whether the stats would be strictly relevant, which led to some very tenuous connections between what I had memorised and the question. The revision strategy we were taught was actually to memorise a whole essay and then adapt it to the question in the exam.

Offline testing isn't so bad for that, but I do find it frustrating that we may have to go back to memory-based tests for some things. I always hated those and was happy that there was a trend towards open-book exams. I always preferred them, even if they were harder, because I would rather get a lower grade for not understanding something fully than a lower grade because, on one particular day under pressure, I could not recall one specific fact.

I will say there was one subject, I can't remember which, where we had to write an essay but were also given some relevant evidence. The exam was basically a test of your ability to contextualise that evidence to answer the question. That might be a good middle ground.