Does AI actually help students learn? A recent experiment in a high school provides a cautionary tale.

Researchers at the University of Pennsylvania found that Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a math test compared with students who didn’t have access to ChatGPT. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic that the students were learning.

A third group of students had access to a revised version of ChatGPT that functioned more like a tutor. This chatbot was programmed to provide hints without directly divulging the answer. The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on a test afterwards, these AI-tutored students did no better. Students who just did their practice problems the old-fashioned way, on their own, matched their test scores.

18 points

I’m not entirely sold on the argument I lay out here, but this is where I would start were I to defend using ChatGPT in school as laid out in their experiment.

It’s a tool. Just like a calculator. If a kid learns and does all their homework with a calculator, then suddenly it’s taken away for a test, of course they will do poorly. Contrary to what we were warned about as kids though, each of us does carry a calculator around in our pocket at nearly all times.

We’re not far off from a world where having an AI assistant with us 24/7 is feasible. Why not teach kids to use the tools they will have in their pocket for the rest of their lives?

16 points

As adults we are dubious of the results that AI gives us. We take the answers with a handful of salt, and I feel like over the years we have built up a skillset for using search engines for answers and sifting through the results. Kids haven’t got years of that experience, so they may take what is said as true and not question the results.

As you say, kids should be taught to use the tool properly and to verify the answers. AI is going to be forced onto us whether we like it or not, so people should be empowered to use it and not accept what it puts out as gospel.

9 points

This is true for the whole internet, not only AI chatbots. Kids need to be taught that there is BS around. In fact, kids had to learn that even pre-internet. Every human has to learn that you cannot blindly trust anything and has to think critically. This is nothing new. AI chatbots just show how flawed human education is these days.

5 points

It’s a tool. Just like a calculator.

lol my calculator never “hallucinated”.

0 points

Ask your calculator what 1-(1-1e-99) is and see if it still never hallucinates (confidently gives an incorrect answer).
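
For anyone without a calculator handy, here is a rough Python sketch of the same effect (not from the article, just an illustration): anything computing with standard 64-bit floats rounds 1 - 1e-99 to exactly 1, so the whole expression comes out as 0 instead of 1e-99. Actual calculator behavior varies by model.

```python
# 1e-99 is far smaller than the gap between adjacent doubles near 1.0 (about 2.2e-16),
# so (1 - 1e-99) rounds to exactly 1.0 and the outer subtraction yields 0.0.
print(1 - (1 - 1e-99))  # prints 0.0, not 1e-99

# Arbitrary-precision decimal arithmetic recovers the true value.
from decimal import Decimal, getcontext

getcontext().prec = 120  # enough significant digits to hold 1 - 1e-99 exactly
print(Decimal(1) - (Decimal(1) - Decimal("1e-99")))  # prints 1E-99
```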

3 points

Yeah, it’s like if you had a calculator and 10% of the time it gave you the wrong answer. Would that be a good tool for learning? We should be careful when using these tools and understand their limitations. Gen AI may give you an answer that happens to be correct some of the time (maybe even most of the time!), but it does not have the ability to actually reason. This is why it gives back answers that we understand intuitively are incorrect (like putting glue on pizza), but sometimes the mistakes can be less intuitive or subtle, which is worse in my opinion.

19 points

I think here you also need to teach your kid not to trust this tool unconditionally and to question the quality of its output, as well as teaching them how to write better prompts. It’s the same as with Google: if you put in shitty queries, you will get subpar results.

And believe me I have seen plenty of tech people asking the most lame prompts.

-6 points

I remember teachers telling us not to trust the calculators. What if we hit the wrong key? Lol

Some things never change.

7 points

I remember the teachers telling us not to trust Wikipedia, but they had utmost faith in the shitty old books that were probably never verified by another human before being published.

3 points

Human error =/= machine unreliability

49 points

no shit

-4 points

“Tests designed for people who don’t use ChatGPT are taken by people who do.”

This is the same fn calculator argument we had 20 years ago.

A tool is a tool. It will come in handy, but if it will be there in real life, then it’s a dumb test.

44 points

The point of learning isn’t just access to that information later. That basic understanding gets built on all the way up through the end of your education, and is the base to all sorts of real world application.

There’s no overlap at all between people who can’t pass a test without an LLM and people who understand the material.

0 points
Deleted by creator
17 points

This is ridiculous. The world doesn’t have to bend the knee to LLMs, they’re supposed to be useful tools to solve problems.

And I don’t see why asking them to help with math problems would be unreasonable.

And even if the formulation of the test was not done the right way, your argument is still invalid. LLMs were being used as an aid. The test wasn’t given to the LLM directly. But students failed to use the tool to their advantage.

This is yet another hint that the grift doesn’t actually serve people.

Another thing these bullshit machines can’t do! The list is getting pretty long.

About the calculator argument… Well, the calculator is still used in class, because it makes sense in certain contexts. But nobody ever sold calculators saying they would teach you math and would be a do-everything machine.

1 point

Also actual mathematicians are pretty much universally capable of doing many calculations to reasonable precision in their head, because internalizing the relationships between numbers and various mathematical constructs is necessary to be able to reason about them and use them in more than trivial ways.

Tests for recall aren’t there because the specific piece of information is the point. They’re there because being able to retrieve the information is essential to integrating it into scenarios where you can utilize it, just like being able to do math without a calculator is needed to actually apply math in ways that aren’t prescribed for you.

0 points

I didn’t mean it that way, I really just meant the discussion is idiotic

9 points

The main goal of learning is learning how to learn, or learning how to figure new things out. If “a tool can do it better, so there is no point in not allowing it” were the metric, we would be doing kids a disservice, because no one would understand why things work the way they do and would thus be less equipped to further our knowledge.

This is why I think common core, at least for math, is such a good thing because it teaches you methods that help you intuitively figure out how to get to the answer, rather than some mindless set of steps that gets you to the answer.

12 points

As someone who has taught math to students in a classroom, unless you have at least a basic understanding of HOW the numbers are supposed to work, the tool - a calculator - is useless. While getting the correct answer is important, I was more concerned with HOW you got that answer. Because if you know how you got that answer, then your ability to get the correct answer skyrockets.

Because doing it your way leads to blindly relying on AI and believing those answers are always right. Because it’s just a tool right?

0 points

Nowhere did I say a kid shouldn’t learn how to do it. I said it’s a tool; I’m saying it’s a dumb argument/discussion.

If I said students who only ever used a calculator didn’t do as well on a test where calculators weren’t allowed, you would say “yeah, no shit.”

This is just an anti-technology, anti-new-generation piece that divides people and will ultimately create rifts that help us ignore real problems.

28 points
Deleted by creator
12 points

This! Don’t blame the tech, blame the grown-ups who aren’t able to teach the young how to use it!

3 points

The study is still valuable; this is a math class, not a technology class, so understanding its impact is important.

-2 points

Yeah, I did not read that the prompt-engineered ChatGPT was better than the non-ChatGPT class 😄 but I guess that proves my point as well, because if the students in the group with normal ChatGPT were taught how to prompt it so that it answers in a more teacher-like style, I bet they would have similar results to the students with the prompt-engineered ChatGPT.

1 point

Can I blame the tech for using massive amounts of electricity, making e.g. Ireland use more fossil fuels again?

1 point

Well, I guess, I mean the evidence is clear, isn’t it?

5 points

Obviously no one’s going to learn anything if all they do is blatantly ask for the answers and the written work.

You should try reading the article instead of just the headline.

2 points
Deleted by creator
0 points

If you’d read the article, you would have learned that there were three groups: one with no GPT, one where they just had GPT access, and another with a GPT that would only give hints and clues toward the answer but wouldn’t directly give it.

That third group tied the first group in test scores. The issue was that ChatGPT is dumb and often gave incorrect instructions on how to solve a problem, or came up with the wrong answer. I’m sure if GPT were capable of not giving the answer away and actually giving correct instructions on how to solve each problem, that group would have beaten the no-GPT group easily.

9 points

If you actually read the article you will see that they tested both allowing the students to ask for answers from the LLM, and then limiting the students to just ask for guidance from the LLM. In the first case the students did significantly worse than their peers that didn’t use the LLM. In the second one they performed the same as students who didn’t use it. So, if the results of this study can be replicated, this shows that LLMs are at best useless for learning and most likely harmful. Most students are not going to limit their use of LLMs for guidance.

You AI shills are just ridiculous, you defend this technology without even bothering to read the points under discussion. Or maybe you read an LLM generated summary? Hahahaha. In any case, do better man.

24 points

Of all the students in the world, they pick ones from a “Turkish high school”. Any clear indication why there of all places when conducted by a US university?

3 points

If I had access to ChatGPT during my college years and it helped me parse things I didn’t fully understand from the texts or provided much-needed context for what I was studying, I would’ve done much better by integrating my learning earlier. That’s one of the areas where ChatGPT shines. I only got there on my way out. But math problems? Ugh.

23 points

When you automate these processes you lose the experience. I wouldn’t be surprised if you couldn’t parse information as well as you can now, had you had access to ChatGPT back then.

It’s hard to get better at solving your problems if something else does it for you.

Also, the reliability of these systems is poor, and they’re specifically trained to produce output that appears correct, not output that actually is correct.

4 points

I quickly learned how ChatGPT works, so I’m aware of its limitations. And since I’m talking about university students, I’m fairly sure those smart cookies can figure it out themselves. The thing is, studying the biological sciences requires you to understand other subjects you haven’t learned yet, and having someone explain how everything fits into the overall picture puts you way ahead of the curve because you start integrating knowledge earlier. You only get that from retrospection once you’ve passed all your classes and have a panoramic view of the field, which, in my opinion, is too late for excellent grades. This is why I think having parents with degrees in a related field, or personal tutors, gives an incredibly unfair advantage to anyone in college. That’s what ChatGPT gives you for free. Your parents and the tutors will also make mistakes, but that doesn’t take away the value, and the same is true for the AIs.

And regarding the output that appears correct, some tools help mitigate that. I’ve used the Consensus plugin to some degree and think it’s fairly accurate for resolving some questions based on research. What’s more useful is that it’ll cite the paper directly so you can learn more instead of relying on ChatGPT alone. It’s a great tool I wish I had that would’ve saved me so much time to focus on other more important things instead of going down the list of fruitless search results with a million tabs open.

One thing I will agree with you on is learning how to use Google Scholar and Google Books, and pirating books through the library, to find the exact information as it appears in the textbooks when answering homework questions, which I did meticulously, down to the paragraph. But only I did that. Everybody else copied their homework, so at least in my university it was a personal choice how far you wanted to take those skills. So now, instead of your peers giving you the answers, it’s ChatGPT. So my question is, are we really losing anything?

Overall I think other skills need honing today, particularly verifying information, together with critical thinking which is always relevant. And the former is only hard because it’s tedious work, honestly.

5 points

I read that comment, and I use it similarly: more as a super-dictionary/encyclopedia, the same way I’d watch supplementary YouTube videos to enhance my understanding, rather than automating the understanding process.

More like having a tutor who you ask all the too-stupid and too-hard questions to, who never gets tired or fed up with you.

0 points

The study was done in Turkey, probably because students are for sale and have no rights.

It doesn’t matter though. They could pick any weird, tiny sample and do another meaningless study. It would still get hyped and they would still get funding.

5 points

The names of the authors suggest there could be a cultural link somewhere.

1 point

Ah thanks, that does appear to be the case.

8 points

The paper only says it’s a collaboration. It’s pretty large scale, so the opportunity might be rare. There’s a chance that (the same or other) researchers will follow up and experiment in more schools.

19 points

I’m guessing there was a previous connection with some of the study authors.

I skimmed the paper, and I didn’t see it mention language. I’d be more interested to know if they were using ChatGPT in English or Turkish, and how that would affect performance, since I assume the model is trained on significantly more English language data than Turkish.

3 points

GPTs are designed with translation in mind, so I could see them being extremely useful for providing instruction on a topic in a non-English native language.

But they haven’t been around long enough for the novelty factor to wear off.

It’s like computers in the 1980s… people played Oregon Trail on them, but they didn’t really help much with general education.

Fast forward to today, and computers are the core of many facets of education, allowing students to learn knowledge and skills that they’d otherwise have no access to.

GPTs will eventually go the same way.

0 points

Yeah, because it’s just like having their dumb parents do their homework for them.
