this post was submitted on 22 Oct 2024
119 points (88.4% liked)

Not The Onion


The lawsuit says the Hingham High School student handbook did not include a restriction on the use of AI.

"They told us our son cheated on a paper, which is not what happened," Jennifer Harris told WCVB. "They basically punished him for a rule that doesn't exist."


cross-posted from: https://lemmy.zip/post/24633700

Case file: https://storage.courtlistener.com/recap/gov.uscourts.mad.275605/gov.uscourts.mad.275605.8.0.pdf
Case file: https://storage.courtlistener.com/recap/gov.uscourts.mad.275605/gov.uscourts.mad.275605.13.0.pdf

[–] [email protected] 62 points 1 week ago* (last edited 1 week ago) (3 children)

I'm guessing they probably have rules against plagiarism, or against passing off other people's work as your own.
So I suppose it comes down to whether using AI (without disclosure?) counts as plagiarism.

[–] [email protected] 55 points 1 week ago* (last edited 1 week ago) (4 children)

Most of the larger LLM providers state in their terms that output generated from a user's prompt belongs, intellectually, to that user.

It's a massive grey area, and the sum of these kinds of cases is what will define ownership of LLM output for the next ~50 years.

Don’t get me wrong, kid absolutely did not comply with the spirit of the assignment.

E: @[email protected] makes an excellent point:

If the student hired someone to write their essay and the author assigned all copyrights to the student, it's still plagiarism.

Who legally owns the work isn't the issue with plagiarism.

[–] [email protected] 24 points 1 week ago (2 children)

The LLMs can claim whatever they like, it holds no weight or value. They are basically advanced plagiarism engines and the law has already made it clear you cannot copyright the output of an LLM.

This particular case will go nowhere, but there are plenty of legal cases between content creators and AI makers that are slowly moving through the legal system that will go somewhere.

[–] [email protected] 3 points 1 week ago

the law has already made it clear you cannot copyright the output of an LLM.

That’s true in this context and often true generally, but it’s not completely true. The Copyright Office has made it clear that the use of AI tools has to be evaluated on a case-by-case basis, to determine if a work is the result of human creativity. Refer to https://www.copyright.gov/ai/ai_policy_guidance.pdf for more details.

For example, they state that the selection and arrangement of AI outputs may be sufficient for a work to be copyrightable. And that’s without doing any post-processing of the AI’s outputs.

They don’t talk about situations like this, but I suspect that, if given a prompt like “Rewrite this paragraph from third person to first person,” where the paragraph in question is copyrighted, the output would maintain the same copyright as the input (particularly if performed faithfully and without hallucinations). Such a revision could be made with non-LLM technology, after all.

[–] [email protected] 2 points 1 week ago

So who owns the copyright then? Is the output just public domain?

[–] [email protected] 20 points 1 week ago* (last edited 1 week ago)

It doesn't matter what the LLM license states. Replace the LLM with a person doing exactly what the LLM does and ask yourself if it is plagiarism.

If I do your homework for you and say, "Because you prompted me with the questions, the answers belong to you," that isn't a free "get out of plagiarism" card. What I tell you isn't relevant.

It's not gray at all.

Edit: that's weird. I got a personal message but the reply showed up here.

[–] [email protected] 10 points 1 week ago* (last edited 1 week ago)

If the student hired someone to write their essay and the author assigned all copyrights to the student, it's still plagiarism.

Who legally owns the work isn't the issue with plagiarism.

[–] [email protected] 9 points 1 week ago* (last edited 1 week ago)

Most of the larger LLM providers state in their terms that output generated from a user's prompt belongs, intellectually, to that user.

Who cares what they say to avoid being sued for copyright infringement?

[–] [email protected] 18 points 1 week ago (2 children)

I sometimes use an LLM to "tidy up" my work and paste a bunch of writing in to see if it comes up with anything better. Some parts it will, others it won't, and I'll use or tweak some of it. I wonder if that counts? It's all my work going in, but it's using other people's work to make adjustments.

[–] [email protected] 17 points 1 week ago (1 children)

Replace the LLM with a person. If a person edited your work, would that make it plagiarism?

A common proofreading technique is to give your work to another person to read and make comments. That's not plagiarism.

[–] [email protected] 6 points 1 week ago (1 children)

People who proofread generally only make recommendations for edits. LLMs often "rewrite" the vast majority of the document.

If I give a person who's my editor the concept of my paper and only 20-30% of the actual content that ends up in the final paper... sounds like someone else wrote the paper to me.

It's all down to how you're using the tool. Lots of kids out there will simply tell ChatGPT to write something for them. Others will simply ask for basic proofreading. It's a bitch to tell the difference on the grading side.

[–] [email protected] 2 points 1 week ago (1 children)

Yes, that's exactly my opinion on the subject. (I realize this is a contentless reply, but I didn't want you to think I downvoted you.)

[–] [email protected] 1 points 1 week ago

I didn’t want you to think I downvoted you.

I'm admin on my small instance. I can see the votes. No worries. In this case the downvote is from [email protected].

Anyway, the most I ever use LLMs for professionally is to help rearrange content for better flow, or to convert rambly bits into something concise. I tend to be more verbose than I need to be (my documentation is wildly verbose because I tend to forget things, which is great for documentation... not always great for talking something through with a client).

[–] [email protected] 3 points 1 week ago

I write my own papers, but I'll put paragraphs through an LLM (normally Grammarly's "AI") and ask how they can be improved. Sometimes I take its advice, but half the time I dislike what it's done. Sometimes I give it a bunch of information on what I need to write, it'll spit something out, and I'll use that as a skeleton for my paper. But honestly, the output is kind of shit, regardless of which one I've tried. And it lies. So much.

[–] [email protected] -1 points 1 week ago

But those rules don't apply here.