Summary of the 2023 WGA MBA (wgacontract2023.org)
36 points by mattcollins on Sept 27, 2023 | 51 comments


Per the WGA's summary:

1) AI can’t write or rewrite literary material, and AI-generated material will not be considered source material under the MBA, meaning that AI-generated material can’t be used to undermine a writer’s credit or separated rights.

2) A writer can choose to use AI when performing writing services, if the company consents and provided that the writer follows applicable company policies, but the company can’t require the writer to use AI software (e.g., ChatGPT) when performing writing services.

3) The Company must disclose to the writer if any materials given to the writer have been generated by AI or incorporate AI-generated material.

4) The WGA reserves the right to assert that exploitation of writers’ material to train AI is prohibited by MBA or other law.


It reads like the writers are perfectly fine with AI as long as they are the ones who get to use it.


I think the issue was that under the old rules, writers got paid very differently depending on whether they created an original script or worked on an existing one.

The fear was that studios would use AI to generate a garbage script, then pay a writer far less to effectively rewrite it completely to make it usable.


Or take an existing script, somehow generate a sequel/prequel with AI, and pay some writers way less to polish it.


Yes.

Empowerment of worker (or in this case, creative) = good thing

Replacement of worker = bad thing

This has been the fight over automation since :checks notes: at least 1811 - https://en.wikipedia.org/wiki/Luddite


Writers' use of AI is self-regulating.

If the writer is completely opposed to AI, they can omit its use, or, if they want, they can use it the way they see fit, incl. turning it up to 11.

If the writer's quality decreases because of excessive AI use, it's the writer's problem. They need to regulate their use. If the writer can use it to hone their skills, they can profit from it.

From my personal perspective, as someone who doesn't use xGPT or other models because I consider their training unethical, this makes sense.


It reads more like writers don't want their work being used as a training corpus for AI without getting paid.

Which is fair, in a society where people need money for food and shelter.


Which is how Hollywood has always worked? You can’t so much as move a light or push “Record” on a film set without being a union member.

The VFX industry has been an exception. But frankly, the deteriorating working conditions, rampant outsourcing to semi-shady companies, and the overall downward spiral in the quality of VFX in Hollywood movies suggest that maybe it’s not a model to emulate.


That’s entirely the way it should be! AI used to help labour, not subjugate it.


In the same way I'm OK with knives so long as one isn't being held to my throat.


I think this will have 0 effect. Writers that use AI will push some of the writers that don't use AI out of the market.

What exact scenario have they prevented?

At the extreme end, which won't happen but which would be possible under these rules, there could be a single writer who is basically just prompt engineering and reviewing what the AI spits out, for hundreds of shows.

That a studio would use AI to generate a script without the involvement of a single writer? That wasn't going to happen anyways.

So what was the point of this? Is there something I am missing?


Enter: writing room minimum staffing


Well yeah, it always is about protectionism and barriers to entry.

I find it interesting, though, that they are not worried about competition between writers within the association; they will have members who decide to use assisted writing and end up a lot more productive than others.


The point is that they can decide for themselves if using AI would benefit them and choose to use it or not

Personally, I wonder how useful AI is going to be in terms of output over the long term. AI will endlessly regurgitate a mash-up of what it was trained on in various flavors, but the output will all seem pretty samey after a while since it lacks actual originality. "This reads like something AI wrote" is something I see a lot of already. I'm sure there'll be writers who find it useful, but I don't see it being used for the bulk of their output. At least, I hope they don't just churn out scripts with AI, spend 5 minutes tweaking them, then call it a day. I can't imagine that making great material.


How is that any different than the way stories have always been told?

https://en.m.wikipedia.org/wiki/The_Seven_Basic_Plots


This is a writing tool.

You're making the same mistake AI people do. You can create stuff that's like what came before all day long, but that won't create anything new. Literary analysis, like these plots and the more common monomyth, is about what already exists, and it lags far behind. It's the same deal with music theory. People will spend years in music school learning all kinds of stuff about music, but then they have no idea how to make anything anyone wants to listen to. Music theory as taught in schools is still just catching up to jazz, rock, and rap, and there's a lot of resistance.

An AI could probably do some solid analysis, like producing a beat sheet from a novel. That might be helpful. I could pants a draft, then have an AI make the outline for the second draft.


Look at the top 10 movies released this year. Do any of them have a plot that you would consider anything new?


I'm not a film analyst, so I can't say since I haven't done the work to analyze them.


Not as a film analyst. Just as someone who has seen any of the popular movies that have been released recently. Which ones had a plot that left you walking out in amazement, rather than just employing the standard tropes?

But that’s Sturgeon’s Law in a nutshell, I guess.


My or your subjective perception of quality isn't really the topic here, is it? You swapped out the subject while you thought no one was looking.

Bringing it back to the point: the movies are popular. You can't make a popular movie from a list of plots in one book built on one guy's subjective analysis. Anyone who has tried to hew too close to any plot formula finds this out. No successful plot out in the real world actually looks like any other. They're unique, even if you can boil them down to some list of common plot beats by tossing out what makes them unique.

Doing that is fine for teaching a writing class to people who know nothing about writing yet and just need to get started. It won't produce anything good. Real plots branch and loop and evolve.


Have you compared the plots of every single blockbuster superhero origin story?

The standard “heroes chase after the MacGuffin device” story?

They are called “tropes” for a reason. If you know the tropes you can pick them out very easily from almost any movie.

Every Transformers movie plot is basically the same, as is every James Bond movie, and every Mission: Impossible movie plays out the same story.


How does this work out with something as old as Grammarly, or the advanced spell/grammar/usage/style checking now rolled into Word?


I think this is eminently sensible as a solution to this problem. AI is a tool that skilled professionals can use for productivity. This idea that AI was just going to replace writers never really made much sense - the generation of the actual text is only one part of the job and not the important part of the creative process. This way the writers have the freedom to use this new tooling, the Studios don't get to dictate which tools writers use (just like they're not going to insist on you using a specific brand of computer), and if in aggregate this increases productivity... well that's just great, that's the free market. And importantly, the writer is the one using and experiencing the productivity benefit. In aggregate that might drive the total demand for writers down, but again, that's just the free market.


I have used AI to produce an audio version and translations of my short story. I am not impressed. The best tools generate audio that is of high quality but misses a lot of the nuance of the written text and sometimes distorts simple words. When it comes to translation, 80-90% of the sentences had to be fixed for grammar, spelling, or other reasons (modifications to the meaning of the sentence, bad handling of gender in gendered languages). I came to the conclusion that writers, editors, proofreaders, and translators have nothing to fear from AI. I also noticed that AI is programmed to avoid getting its creators into legal/PR trouble. For example, if you ask AI to perform certain editing jobs on a piece of content that may be somehow connected to a political or a religious issue, it will refuse to do so. Not much help if you want to produce a piece of writing that comments on such issues, be it an article or a screenplay.


> For example, if you ask AI to perform certain editing jobs on a piece of content that may be somehow connected to a political or a religious issue it will refuse to do so.

Some AI services (especially those run by megacorps with lively legal departments) are gimped this way (usually, as I understand it, via a sort of "thought police" model running on top of the core model), but once you get to self-hostable models, not all have such limitations.
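
For what it's worth, here is a minimal sketch of what "self-hostable" means in practice, using the Hugging Face transformers library; the model name and generation settings are just illustrative assumptions, not a recommendation. Since the model runs on your own machine, there is no service-side moderation layer unless you add one yourself.

    # Minimal sketch: run a self-hostable instruction-tuned model locally.
    # The model name is an illustrative assumption; any locally downloadable
    # instruct model that fits your hardware works the same way.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="mistralai/Mistral-7B-Instruct-v0.2",  # assumed example model
    )

    prompt = "Edit this paragraph for clarity without softening its argument: ..."
    out = generator(prompt, max_new_tokens=200, do_sample=True, temperature=0.7)
    print(out[0]["generated_text"])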


How good are the self-hostable models, though, compared to those run by megacorps?


Well, "good" has a few dimensions to it:

1. Speed of output (not very fun to wait multiple seconds for each letter to be output)

2. Coherence of output (how far back does the model remember the context of the conversation?)

3. Variety of output (how's the diversity of the model's vocabulary? How about topics it can plausibly discuss?)

You can easily get comparable speed, so nothing of interest to really compare there.

I haven't done particularly strenuous coherence comparisons, but for my uses, at least, megacorp and self-hosted models are pretty comparable. Though you do need the better models to get the best coherence simply because they retain more tokens in memory.

Variety is, in my opinion, where the megacorp models still rule. Most of my dabbling has been with models designed to be writing assistants and they can certainly generate plausible strings of words and follow a general theme, but they barely "know" anything (generally when using them to write fiction, you would provide them a "factbook" that they can work from). ChatGPT by comparison can generate plausible responses to a surprising breadth of technical questions, although it definitely has a feeling of being generated from scraping certain online sources since it's decent at answering devops questions but bad at obscure grammar and physics questions, at least in my experience.
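
To make the "factbook" pattern above concrete, a rough sketch (the model name and the facts are made up for the example): you prepend the facts the model should stay consistent with to every prompt, since a small local model won't "know" them on its own.

    # Sketch of the "factbook" pattern: prepend story facts to each prompt
    # so a small local writing model stays consistent with them.
    from transformers import pipeline

    writer = pipeline(
        "text-generation",
        model="mistralai/Mistral-7B-Instruct-v0.2",  # assumed example model
    )

    factbook = (
        "Setting: a mining outpost on Europa, 2140.\n"
        "Protagonist: Ira Voss, a geologist who distrusts the outpost AI.\n"
    )
    prompt = factbook + "\nWrite the opening paragraph of the next chapter."
    out = writer(prompt, max_new_tokens=150, do_sample=True)
    print(out[0]["generated_text"])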


That doesn't say much if we don't know which AI you're talking about. The best LLM is far ahead of everything else. If it's GPT-4, then translators, editors, and proofreaders definitely do have something to fear.


It's one of the big three. I am deliberately not naming it, because I do not want to steer the conversation into "You clearly should have used LLM X". That's not the point here.


>That's not the point here.

Of course it is. If you say "don't worry about X technology" but you're not using the state of the art, then it's meaningless. The "Big 3" is meaningless. The best model is far ahead of the rest.


Well, don't be coy then, what's "the best model" and how "far ahead of the rest" is it?


I already said it's GPT-4 and that it's much better than the rest.


The fundamental problem is an oversupply of content: information overload leading to attention-theft games.

It's like the Chinese real estate market, where more houses have been built than there are people to fill them. Same story with content and the eyeballs available.

It's an extremely unstable supply-demand equation, and it will keep breaking down until new ideas about the supply side emerge.


If using AI suddenly means that writers can instantly produce billions of acceptable scripts, the problem will just shift to selecting the good ones from the garbage. I'm not that confident in AI's ability to produce quality writing.


I think this is true, and it has nothing to do with AI. Netflix brought Big Tech money to Hollywood and made it rain. Then all the traditional media companies followed, creating a bidding war for content. It's probably not popular to say, but the last 5 years have probably been a golden age for Hollywood, and some people like Shonda Rhimes have made out like bandits, while some people like Meghan Markle have literally robbed Netflix blind. But it's all going the other way now, and that's going to put massive downward pressure on the industry; at best, AI is a good way of protecting Disney's stock price.


> This idea that AI was just going to replace writers never really made much sense

I disagree. Even though this first taste of the technology cannot completely replace humans, it will get there in, IMHO, a relatively short time from now.


When is AI going to replace the rentier class? That should be straightforward, no?


The replacement being discussed concerns the production side. A replacement scenario for humans as pure consumers could arise when engaging with a human customer becomes a loss or a pointless nuisance compared to dealing with an AI customer.


What about SAG-AFTRA? Have they reached any agreement regarding AI?

When AI became plausible and produced full-body deepfakes, I concluded that it was only a matter of time before AI would be used to extend and emulate real actors.

The sensible agreement would have been that an actor, via an agency or a software stack, controls their full-body AI avatar and licenses it to studios for a movie or a commercial, charging maybe 20%-90% of the fees they would otherwise receive.

BUT it blew my mind that studios were audacious enough to float the idea that actors would be paid a small hourly wage (or minimum wage, for minor actors) to have their bodies scanned, and that after that the studios would get to keep and use the digital avatar in perpetuity for free.

Whoa!

I was shocked to learn that. There was, as expected, a dispute.

What happened to that part?


That's next; they were only dealing with one union at a time.


Can't wait to see how Hollywood accounting and business structures run roughshod over this.


"Because 50% of the writing was AI the residual has been halved"


# of episodes | Minimum # of writers in writers’ room
--------------|--------------------------------------
6 or fewer    | 3
7-12          | 5
13+           | 6

Minimum number of writers? Ok, go ahead and tell me that union labor isn't about padding payrolls.


Protections against being under-staffed and overworked seem like exactly the kind of thing you'd want in a contract.

That doesn't seem unreasonable to me. I mean, one person could probably come up with six scripts given enough time, but having input from other writers is likely to be very helpful. Even if you thought collaboration had no value whatsoever, having 3 people each working on a script at once means your entire 6 episode season gets done that much faster. Plus you figure life happens and people get sick, need days off, etc. Having more than one body in the writer's room still seems like a good idea.


Why should the contract have to dictate that, when it's the production that would have to pay/suffer if they shortsightedly understaffed something? The writers have sick days and time off protections already. Seems to me this is about guaranteeing employment. If having more than one body in the writer's room is needed, let the production decide that or live with the consequences.


Because they often find ways around the "suffering", e.g., just marketing the hell out of bad stuff and making money on it anyway by getting eyeballs on it. See the Netflix, Amazon, etc. content farms, for example.


> when it's the production that would have to pay/suffer if they shortsightedly understaffed something

It isn't just the production that would suffer if the writer is overworked. The writer is the person under pressure to deliver the script on time. Worse, the output would suffer because one stressed out person, who doesn't have the time needed to do the job properly, isn't going to be doing their best work. It's the writer's name that ends up in the credits of whatever gets produced and the quality of the work determines their future employment.


Shows are on very tight schedules. It takes a certain amount of work to produce a quality script for an episode. When they are working, they are probably working extremely long hours for days on end. Having a minimum number of writers ensures the quality remains acceptable without overworking individuals.


"Geez why can't have my facebook clone done in 1 month by just 1 guy and chatgpt right?!"

Writing a script is more involved than it seems, and when push comes to shove, I think enforcing hours per day on scriptwriters is not so great. And you can't come up with all the ideas by yourself.


I don't understand in the slightest how modern "writers" can demand anything at all. Recent scripts everywhere have been disaster after disaster after disaster, with only a handful of mostly indie exceptions.


You have cause and effect confused. The things they struck to stop are why things were getting worse.



