2 points

This is the best summary I could come up with:


According to the complaint, OpenAI “copied plaintiffs’ works wholesale, without permission or consideration” and fed the copyrighted materials into large language models.

The authors added that OpenAI’s LLMs could result in derivative work “that is based on, mimics, summarizes, or paraphrases” their books, which could harm their market.

OpenAI, the complaint said, could have trained GPT on works in the public domain instead of pulling in copyrighted material without paying a licensing fee.

This is the latest lawsuit against OpenAI from popular authors — Martin wrote Game of Thrones, Grisham’s many books have been turned into films, and so on — alleging copyright infringement.

The Amazing Adventures of Kavalier & Clay writer Michael Chabon and others sued the company earlier in September for using their books to train GPT.

Comedian Sarah Silverman and authors Christopher Golden and Richard Kadrey also sought legal action against OpenAI and Meta, while Paul Tremblay and Mona Awad filed their complaint in June.


The original article contains 323 words, the summary contains 157 words. Saved 51%. I’m a bot and I’m open source!

21 points

When the movie about the nerds behind these apps comes out, this will be the part of the movie trailer where Jesse Eisenberg looks nervous and says he’s being sued for over a billion dollars.

4 points

And if AI writes it, Walter White will appear and announce the need to cook. Then they’ll all melt for no reason.

2 points

It was at this moment that Walter White became a Transformer.

3 points

It’s hard to cook as an AI character when you have eight fingers and they are all fused together.

2 points

And then Walter cooks a delicious spaghetti.

2 points

I’m still not convinced that Jesse Eisenberg and Sam Altman aren’t actually the same person.

-2 points

I hope OpenAI wins and secures even more access to this stuff.

At least OpenAI can give us a good ending to the series.

1 point

Not at the moment at least, if ever.

1 point

Certainly! Here’s a remix of the Game of Thrones ending with Top Gun themes and imagery:

Title: “Game of Thrones: Dragon Wing”

In the final episode of “Game of Thrones: Dragon Wing,” the power struggle for the Iron Throne takes a thrilling and high-flying turn. Daenerys Targaryen, now known as the “Dragon Queen,” leads her dragons into an epic aerial battle against the remaining claimants to the throne.

As the dragons soar through the skies of King’s Landing, the music swells with the iconic Top Gun soundtrack. Daenerys, riding her mighty dragon, “Firestorm,” takes on a persona reminiscent of Maverick from Top Gun, exuding confidence and charisma.

Jon Snow, on his own dragon, “Snowhawk,” plays the role of Iceman, a skilled and competitive rival to Daenerys. Their aerial duels are breathtaking and intense, echoing the dogfights of Top Gun.

Tyrion Lannister, known for his wit and strategic thinking, takes on the role of Viper, the wise and experienced mentor who guides the dragon riders through their challenges.

Arya Stark, with her newfound dragon-riding skills, becomes the “Wildcard,” executing daring and unconventional tactics during the battles, just like Maverick’s unpredictable maneuvers.

The final showdown takes place over the Red Keep, where the Iron Throne sits. In an explosive climax, Daenerys and Jon Snow face off in an aerial duel to determine who will rule the Seven Kingdoms. The dragon battle is a breathtaking mix of fire-breathing power and aerial acrobatics, set to the electrifying Top Gun soundtrack.

Ultimately, Daenerys emerges victorious, and she takes her place on the Iron Throne, symbolizing the triumph of courage and unity over political intrigue. The realm is finally united under her rule, and her fellow dragon riders, including Jon Snow, Tyrion, and Arya, stand by her side as loyal allies.

The series ends with a soaring shot of Daenerys and her dragons flying into the sunset, capturing the spirit of adventure and camaraderie found in Top Gun, while also providing a thrilling and satisfying conclusion to the Game of Thrones saga.

-2 points

Here’s the new GOT theme for that final episode.

🎶 "Heading to the throne, Westeros unknown, Riding through the realm, claim it for your own. Fire and ice collide, in a dangerous zone, In the Game of Thrones, you’re on your own. 🎶

🎶 Welcome to the danger throne, Gather your allies, it’s time to dethrone. In the Game of Thrones, it’s power you seek, But in this deadly game, it’s the strong and the meek. 🎶

🎶 Ice and fire, dragons in the sky, Schemers and dreamers, oh, how they’ll vie. In the Game of Thrones, where the heroes fall, You’ll need to be fearless, or not at all. 🎶

🎶 Soaring on your dragon, through the stormy night, In the Game of Thrones, you’ll need all your might. Danger zone, it’s where you’ll reside, For the Iron Throne, you must decide. 🎶

🎶 Valyrian steel, and secrets untold, In this treacherous world, you’ll be brave and bold. In the Game of Thrones, there’s no turning back, In the danger zone, it’s a power-packed track. 🎶

🎶 Winter’s coming, and battles are near, In the Game of Thrones, who’ll you hold dear? In the danger zone, where legends are made, You’ll rise or you’ll fall in this epic crusade. 🎶

249 points

GRRM is worried AI will finish writing his books before him

76 points

We could teach ducks to write and they’d finish before him.

49 points

Would you rather it was finished by 100 duck-sized George RR Martins or a George RR Martin-sized duck?

15 points

That’s a big duck.

7 points

Get those 100 duck-sized GRRMs a bunch of mini keyboards and they’ll get it done 100 times faster.

5 points

And then everyone would bitch because it wasn’t good, like what happened with the last seasons of GoT.

8 points

Idk, my neighbor had a duck. It’d probably have put more effort into the ending than the show writers did.

7 points

Since he will never finish the next book, that’s very likely given infinite time :)

8 points

Another moment in A Dream of Spring involved Bran receiving a vision that The Wall was not just a physical barrier, but a mystical shield holding back the Night King’s power. “This twist fits well within the universe and raises tension for the remainder of the story,” Swayne remarks.

That’s just a popular fan theory that has been discussed countless times on various forums.

I guess we can conclude that ChatGPT has been reading a lot of Reddit.

2 points

Well, assuming it wasn’t all nuked over the past few months, the fan theory stuff on Reddit was probably a pretty good training dataset.

4 points

Absolutely it has been. That’s what sparked the whole Reddit API debacle. Reddit wants that sweet cash stream from machine learning trawling its data.

14 points

He still hasn’t released TWOW yet? Srsly, I feel sad for his fanbase.

8 points

Most of us have reached the Acceptance stage now.

10 points

Actually getting a good ending to ASOIAF after GRRM dies is gonna be one of the big turning points that transforms everyone’s opinion on AI.

It’s gonna be like fan edits for movies. People will debate which is the better version of the story. The only person hurt by this is George, who will be dead and was never going to finish the books anyway.

1 point

I mean… AGOT might be in the public domain before TWOW comes out.

54 points

The authors added that OpenAI’s LLMs could result in derivative work “that is based on, mimics, summarizes, or paraphrases” their books, which could harm their market.

Ok, so why not wait until those hypothetical violations occur and then sue?

30 points

People can do that too; are they gonna sue everyone?

30 points

I have nipples, Greg. Could you sue me?

0 points

Because the point of suing first is to address the potential harm of what could happen based on what OpenAI is doing right now. Kind of like how safety regulations are intended to prevent future problems based on what has happened previously, but expanded to cover similar potential dangers instead of waiting for each exact scenario to happen.


But if OpenAI cannot legally be inspired by your work, the implication is humans can’t either.

That’s not how copyright works. Transformative work is transformative.

12 points

The way I’ve heard it described: If I check out a home repair book and use that knowledge to do some handy-man work on the side, do I owe the publisher a cut of my profits?

5 points

How is that the implication?

Inspiration is something we do through conscious experience. Just because some statistical analysis of a word cloud can produce sentences that trick a casual observer into thinking a person wrote them doesn’t make it a creative process.

In fact, I can prove to you that (so-called) AI can never be creative.

To get an AI to do anything, we have to establish a goal to measure against. You have to quantify it.

If you tell a human being “this is what it means to be creative; we have an objective measure of it”, do you know what they tend to do? They say “fuck your definition” and try to make something that breaks the rules in an interesting way. That’s the entire history of art.

You can even see that playing out with respect to AI. Artists going “You say AI art can’t be art, so I’m gonna enter AI pieces and see if you can even tell.”

That’s a creative act. But it’s not creative because of what the AI is doing. Much like Duchamp’s urinal wasn’t a creative object, but the act of signing it R Mutt and submitting it to a show was.

The kinds of AIs we design right now will never have a transformative R Mutt moment, because they are fundamentally bounded by their training. They would have to be trained to use novel input to dismantle and question their training (and have that change stick around), but even that training would then become another method of imitation that they could not escape. They can’t question quantification itself, because they are just quantitative processes — nothing more than word calculators.

-4 points

Generative AI training is not the same thing as human inspiration. And transformative work has thus far only been performed by a human, not by a machine used by a human.

Clearly using a machine that simply duplicates a work to resell runs afoul of copyright.

What about using a machine that slightly rewords that work? Is that still allowed? Or a machine that does a fairly significant rewording? What if it sort of does a mashup of two works? Or a mashup of a dozen? Or of a thousand?

Under what conditions does it become ok to feed a machine with someone’s art (without their permission) and sell the resulting output?

7 points

Safety regulations are created by regulatory agencies empowered by Congress, not private parties suing each other over hypotheticals.

-1 points

It was a comparison about preventing future issues, not a literally equivalent legal situation.

23 points

The difference is that you’re trying to sue someone based on what could happen. That’s like suing some random author because they read your book and could potentially make a story that would be a copy of it.

LLMs are trained on writings in the language and understand how to structure sentences based on their training data. Do AI models plagiarize any more than someone using their understanding of the English language is plagiarizing when they construct a brand-new sentence? After all, we learn how to write by reading the language and learning the rules; is the training data we read when we were kids being infringed whenever we write about similar topics?

When someone uses AI to plagiarize, sue them into eternity for all I care, but no one seems concerned with the implications of trying to sue someone/something because they trained an intelligence by letting it read publicly available written works. Reading and learning isn’t allowed because you could maybe one day possibly use that information to break copyright law.

-4 points

Suing someone for copyright infringement that is happening right now always includes justification based on both current and potential future losses. You don’t get paid for the potential losses, but they are still justification for making the infringer stop right now.

-2 points

I see this more like suing a musician for using a sample of your recording, or a certain number of notes or lyrics from your song, without your consent. The musician created a new work, but it was based on your previous songs. I’m sure if a publisher asked ChatGPT to produce a GRRM-like novel, it would create a plagiarism-lite mash-up of his works that were used as writing samples, using pieces of his plots and characters, maybe even quoting directly. Sampling GRRM’s writing, in other words.

-8 points

Do AI models plagiarize any more than someone using their understanding of the English language is plagiarizing when they construct a brand-new sentence?

Yes

-3 points

Because that is far harder to prove than showing OpenAI used his IP without permission.

In my opinion, training a generative model on data without the permission of the rights holder should not be allowed. So at the very least, OpenAI should publish (references to) the training data they have used so far, and probably restrict the dataset to public-domain and opt-in works for future models.

-1 points

Okay, the problem is that there are only about three companies with either enough data or enough money to buy it. Any open-source or small-time AI model is completely dead in the water. Since our economy is quickly moving towards being AI-driven, it would basically guarantee that our economy is completely owned by a handful of companies like Getty Images.

Any artist with less weight than GRRM or Taylor Swift is still screwed; they might get a peanut or two at most.

I’d rather get an explosion of culture, even if it means GRRM doesn’t get a last fat paycheck and Hollywood loses control of its monopoly.

3 points

I get it. I download movies without paying for them too. It’s super convenient, and much cheaper than doing the right thing.

But I don’t pretend it’s ethical. And I certainly don’t charge other people money to benefit from it.

Either there are plenty of people who are fine with their work being used for AI purposes (especially in an open-source model), or they don’t agree to it, in which case it would be unethical to do so.

Just because something is practical, doesn’t mean it’s right.

-4 points

We could get Elon Musk to develop a corpus and train all AI on that instead of training AI on a corpus scraped from websites.

1 point

Elon can’t develop shit.

6 points

I don’t see why they (authors/copyright holders) have any right to prevent use of their product beyond purchasing. If I legally own a copy of Game of Thrones, I should be able to do whatever the crap I want with it.

And basically, I can. I can quote parts of it, I can give it to a friend to read, I can rip out a page and tape it to the wall, I can teach my kid how to read with it.

Why should I not be allowed to train my AI with it? Why do you think it’s unethical?

3 points

Next, if you come up with some ideas for your own fantasy setting after watching Game of Thrones, they’ll want to chase you down, considering they didn’t give you express permission to be “inspired” by their work 🙄

1 point

And basically, I can. I can quote parts of it, I can give it to a friend to read, I can rip out a page and tape it to the wall, I can teach my kid how to read with it.

These are things you’re allowed to do with your copy of the book. But you are not allowed to, for example, make a copy of it and give that to a friend, or turn it into a play or a movie. You don’t own the story; you own a copy of it on a specific medium.

As to why it’s unethical, see my comment here.

0 points

Ownership is never absolute. Just like with music: you are not allowed to use it commercially, i.e. in your restaurant, club, beauty salon, etc., without paying extra. The same goes for books: for example, you shouldn’t share scans online, even though it’s “your” book.

However, it is not clear how AI infringes on the rights of authors in this case, because a human may legally read a book and produce a similar book in the same style.

2 points

It’s amazing how many people are against overly restrictive copyright rules that hamper creativity… until it involves AI.

2 points

Assuming the books used for GPT training were indeed purchased, not pirated, and since “AI training” was not prohibited at the time of purchase, the engineers had every right to use them. Maybe authors in the future could prohibit “AI training,” but for books purchased before they do, “AI training” is fair use.

2 points

I think whether or not that is true will be decided in a trial like this.

