A Massachusetts couple claims that their son’s high school attempted to derail his future by giving him detention and a bad grade on an assignment he wrote using generative AI.

An old and powerful force has entered the fraught debate over generative AI in schools: litigious parents angry that their child may not be accepted into a prestigious university.

In what appears to be the first case of its kind, at least in Massachusetts, a couple has sued their local school district after it disciplined their son for using generative AI tools on a history project. Dale and Jennifer Harris allege that the Hingham High School student handbook did not explicitly prohibit the use of AI to complete assignments and that the punishment visited upon their son for using an AI tool—he received Saturday detention and a grade of 65 out of 100 on the assignment—has harmed his chances of getting into Stanford University and other elite schools.

Yeah, I’m 100% with the school on this one.

0 points

I hope he and his parents get bullied.

9 points

When I was a kid, we had a period of some repetitive math work I got sick of, so I wrote a TI-84 program to automate it, one that even showed the work I would otherwise have written down by hand.

I wasn’t really supposed to do that, but my teacher had no problem with it: I clearly understood the work, and it’s not just punching the equation into WolframAlpha.
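(For the curious, here is a rough re-imagining of that idea in Python rather than TI-84 BASIC; the linear-equation form is just an assumed stand-in for whatever the repetitive work actually was.)

```python
from fractions import Fraction

def solve_linear(a, b, c):
    """Solve a*x + b = c, printing the same steps a student would write down."""
    a, b, c = Fraction(a), Fraction(b), Fraction(c)
    print(f"{a}x + {b} = {c}")
    print(f"{a}x = {c} - {b} = {c - b}")   # subtract b from both sides
    x = (c - b) / a
    print(f"x = {c - b} / {a} = {x}")      # divide both sides by a
    return x

solve_linear(3, 4, 19)  # prints the worked steps and returns 5
```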

It would be awesome if there were an AI “equivalent” to that: some really primitive offline LLM you were allowed to use in school for basic automation and assistance, but one that requires a lot of work to set up and is totally useless without putting that work in. I can already envision ways to set this up with BERT or Llama 3B.
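(A minimal sketch of what that might look like in practice, assuming llama-cpp-python and a small quantized model downloaded by hand beforehand; the model path and prompt are placeholders, not recommendations.)

```python
from llama_cpp import Llama

# Fully offline: the model file has to be found, downloaded, and quantized by hand,
# which is exactly the kind of setup work that makes this a learning exercise.
llm = Llama(
    model_path="models/llama-3.2-3b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,       # small context window keeps it deliberately modest
    verbose=False,
)

# Constrained on purpose: short answers, low creativity, no internet access.
out = llm(
    "Rephrase this sentence so it reads more clearly: "
    "'The experiment we did it showed that plants need light for to grow.'",
    max_tokens=48,
    temperature=0.2,
)
print(out["choices"][0]["text"].strip())
```

The point is that getting even this far takes real setup work, which is part of what would make it an exercise rather than a shortcut.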

8 points

It would be awesome if there was an AI “equivalent” to that

It’s called your brain / learning. That’s why you’re there. If the specifics of the curriculum are too tedious, that’s on the school to address.

Learning how to parse and comprehend information to find an answer is just as important as the answer.

3 points

To be fair, understanding something well enough to automate it probably requires learning it in the first place. Obviously an AI that just tells you the answer isn’t going to get you anywhere, but it sounds like the user you were replying to was suggesting an AI limited enough that it couldn’t really tell you the answer to something unless you yourself went through the effort of teaching it that concept first.

I’m not sure how doable that is in practice. My suspicion is that, to actually be useful in that regard, the AI would have to be fairly advanced and merely pretend not to understand a concept until adequately “taught” by the student, if only so it could tell whether it was taught accurately and let the student know they got it wrong and need to try again, rather than reinforce an incomplete or wrong understanding. There’s also a risk that current AI used this way could be “tricked” by clever wording into revealing answers it’s supposed to act like it doesn’t know yet (on top of the existing issues with AI spitting out false information by making associations it shouldn’t actually make). Still, if someone actually built such a thing successfully, I could see it helping with some subjects.

I’m reminded of my college physics professors, who would both let my class bring a full page of notes and the class textbook to refer to during tests, on the reasoning that a person who didn’t understand how to use the formulas in the text wouldn’t be able to apply them anyway, while someone who did understand but misremembered a formula would be able to look it up again in the real world. Those were by far some of the toughest tests I ever had. Half of the credit also came from being given a copy of the test to redo over the following week as homework, where we as a class were encouraged to collaborate and teach each other how to solve the problems, again on the logic that explaining something to someone else helps the explainer learn it too.

1 point

You worded this much better than I could.

Yes, I was thinking of two directions:

  • A “smarter” AI, though I think a better term would be “customized”: one specifically tailored to only help with knowledge the student has already “learned” and loaded into its context.

  • A “dumb” AI that’s too unreliable to use for lazy ChatGPT-style answers, but that can be a primitive assistant to bounce ideas off of, or to help with phrasing, wording, formatting, and other basic tasks too onerous or trivial to ask another human/student for help with.

Not many people are familiar with the latter because, well, they only use uncached ChatGPT, but I already find small LLMs useful as a kind of autocomplete or sanity check when my brain gets stuck (much as it did before I wrote my TI-84 BASIC program), and the experience is totally different because the response is nearly instant, since the context is cached on your own machine.
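(A minimal sketch of that workflow, again assuming llama-cpp-python and a placeholder model path; because the model stays loaded in memory between calls, follow-up completions come back almost immediately.)

```python
from llama_cpp import Llama

# Load the model once; keeping it resident is what makes repeated calls feel instant.
llm = Llama(model_path="models/llama-3.2-3b-instruct.Q4_K_M.gguf",
            n_ctx=2048, verbose=False)

def sanity_check(fragment: str) -> str:
    """Ask the local model to continue a half-finished thought, autocomplete-style."""
    out = llm(fragment, max_tokens=32, temperature=0.7, stop=["\n\n"])
    return out["choices"][0]["text"]

# Simple REPL: type a stuck sentence, get a quick continuation to react to.
while True:
    fragment = input("stuck on> ")
    if not fragment:
        break
    print(sanity_check(fragment))
```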


If the specifics of the curriculum are too tedious, that’s on the school to address.

This! This right here. So many school curricula are designed by people who seem to despise children and want to make them suffer that I wonder why we bother with schools at all sometimes.

(Of course I also refer to Chinese high schools as institutionalized child abuse, so what do I know?)

4 points

I wasn’t really supposed to do that, but my teacher had no problem with it: I clearly understood the work, and it’s not just punching the equation into WolframAlpha.

This is the way it should be. If you created the program on your own, as opposed to copying it from elsewhere, you had to know how to do the work correctly in the first place. You’ve already demonstrated that you understand the process beyond just being able to solve a single equation. You then aren’t wasting time “learning” something you’ve already learned just to finish an otherwise arbitrary number of problems.

-11 points

Some of you are entering this conversation for the very first time, and boy does it show.


What would the parents’ stance be if he’d asked someone else to write his assignment for him?

Same thing.

Dale and Jennifer Harris allege that the Hingham High School student handbook did not explicitly prohibit the use of AI to complete assignments

I’ll bet you the student handbook doesn’t explicitly prohibit taking a shit on his desk, but he’d sure as Hell be disciplined for doing it. This whole YOU DIDN’T EXPLICITLY PROHIBIT THIS SO IT’S FINE!!!111oneoneeleventy! thing that a certain class of people have is, to my mind, a clear sign of sociopathy.

1 point

Also known as the Air Bud defense.

18 points

Basically their stance is that the school policy didn’t explicitly say he couldn’t use AI, so perhaps the policy specifically mentions another person doing the assignment?


You know, now that I think about it, if I were in an admissions office I’d be keeping a quiet database of news stories like this so I’d know which applicants to automatically reject no matter what their scores were.

7 points

I probably wouldn’t go to the trouble of building a database of students who might never apply to my school, but now I’m wondering about the legality of background checks, or even cursory Google searches, as part of the admissions process, because a story like this would surely show up there.

13 points

Yep, make that part of their so-called permanent record.

If you work in a job for a year or more (sometimes less), it will become very clear which of your co-workers cheated their way through school. They’re the absolute worst to deal with professionally, and I hate them for constantly producing slop.

25 points

their stance is that the school policy didn’t explicitly say he couldn’t use AI,

According to the school’s lawyers, the policy against AI was stated in a presentation the student attended, and it was also handed out at a parents’ night and posted on an online portal; see pp. 4-6 of the following: https://storage.courtlistener.com/recap/gov.uscourts.mad.275605/gov.uscourts.mad.275605.13.0.pdf


Hah! So it’s even worse! It actually was explicitly prohibited and the parents are still suing!

Definite cluster of sociopathy there.

6 points

Reminds me of some bass-ackwards story I read about board games. A couple was saying “the rules don’t forbid this,” so they were putting pieces in the wrong places. What a nightmare that would have been.


People who do that at my games table get uninvited from game nights. I might also point out that the rules don’t forbid me from tossing my glass of baijiu into their faces, but they’re probably thankful I don’t.

4 points

The way I see AI as a tool in a classroom or learning setting is that you should be punished if you willingly used it out of laziness, because you didn’t understand the coursework, or, as I assume is most likely, both. On its own it’s not terrible (environmental impact aside), but it’s certainly not something I’d accept if I were a teacher grading homework.
