Reagan the Actor had very different politics than Reagan the Candidate.
Check out his stump speech for Truman[1]. Quick highlight quote:
“Remember that promise: a real increase in income for everybody. But what actually happened? The profits of corporations have doubled, while workers' wages have increased by only one-quarter. In other words, profits have gone up four times as much as wages, and the small increase workers did receive was more than eaten up by rising prices, which have also bored into their savings” - Somehow Ronald Reagan
It's almost as if people will advocate for their own interests, and if their interests change, so too does their advocacy. As the SAG leader, he would obviously advocate for more money for himself and his fellow actors, saying whatever rhetoric he needed to, and when he was president, he would similarly advocate for whatever kept him in office.
So he got smarter on economics as he aged, like most people? His economic advisors were at the top of their fields, including Nobel laureates. It is not terribly shocking that he changed his views.
> On August 5, following the PATCO workers' refusal to return to work, the Reagan administration fired the 11,345 striking air traffic controllers who had ignored the order, and banned them from federal service for life.
There’s an enormous difference between Hollywood actors striking and air traffic control.
Air traffic controllers weren’t even allowed to strike, given how critical their jobs are; a walkout could have paralyzed the entire country. Comparing that to actors is nonsensical.
The uneven comparison is sort of the point: If air traffic controllers are so essential to the country's operation, why didn't the Reagan administration just avoid the strike by giving them what they wanted, like he was able to get personally in 1960? Is the idea that we can only give non-essential workers what they want (because they have the luxury of striking and not screwing over the country) and everyone else can get bent? That's the idea that's nonsensical to me and discordant between Reagan the actor and Reagan the president.
> why didn't the Reagan administration just avoid the strike by giving them what they wanted, like he was able to get personally in 1960?
His administration actually made some pretty generous offers in the negotiations. Part of the reason the negotiations broke down is because PATCO's radical wing, which believed that getting the best deal required striking first, ended up in control by the time the Reagan administration came to power. That soured Reagan's position since he felt, despite the generous offers, that the negotiations were being handled in bad faith.
Yep, I consider Collision Course (https://a.co/d/18t0JmK) to be the definitive history. It traces air traffic controllers in the US from long before PATCO, up to the fateful Reagan termination, and the fallout afterwards. Fascinating, even-handed between labor and management; couldn't put it down.
They asked for a $10k raise and a 32 hour workweek due to burnout, hard to argue that this was an unreasonable ask.
This was the exact same paradox that essential healthcare workers faced during the pandemic. Too important to give time off, but not important enough to properly compensate.
That's what I don't get about these demands. If Hollywood wants to scan background actors and reuse them instead of paying them a day rate, then surely almost anyone can be scanned and used in such a process. At that point they aren't even actors, they're just sources of pretty faces for the VFX team. And even that isn't necessary, is it, since GANs have been able to imagine pretty faces for years now.
Presumably if actors pick this as a hill to die on, well, suddenly there'll be an even bigger surplus of actors relative to demand, because the easy roles go to VFX, and at that point Hollywood can just break the unions with scabs.
Look, in the not-too-distant future, creatives are just screwed. I don't think there's any possibility of halting the advance of this technology. The only reasonable play is for government to intervene to capture the prosperity that it generates and distribute it to the population instead of letting the shareholders keep all of it. It's that or we eat the rich.
This is playing out in Hollywood right now, but it's coming soon to a theater near you.
If your timeline for “not-too-distant” is measured in decades at the earliest and centuries otherwise, maybe. It doesn’t look like authentically enjoyable creative output is making nearly as many strides as summarization and code completion. GPT-MidJourney autopoiesis similarly degenerates into novel-but-unremarkable prompts & images.
Maybe that's a limit of training these systems on internet text and images, but it seems to me that we’d need something trained on the human experience itself to get expression that doesn’t read as hopelessly derivative. Now, and likely for a long time, all these tools need a puppet master.
And, of course, celebrity culture requires a celebrity in the flesh. At which point an AI analogue would be sufficiently similar to us that they might be afforded a SAG membership themselves.
Perhaps, though the rate of advance, as mentioned, seems headed in a direction entirely orthogonal to the actual creative output that might threaten art-making. I’m in awe at the grammatical or visual fidelity of LLMs/GANs, but have yet to hear a funny joke or see an original image.
My impression of the appetite for derivativeness is that it’s waning, no? The authority on the craft of content milling seems to think so. [0]
e.g. the constant re-hashes of DC and Marvel, mediocre Star Wars spinoffs and sequels, live action remakes of things that didn't need remakes, etc etc etc
the single largest-selling category of books is romance novels, and if you've ever spent time in that space -- I used to work in a book store, and would read anything to kill time, including romance -- you know it's often painfully derivative and repetitive. Amazon was flooded with e-book romance novels long before ChatGPT, and it's only going to get worse.
Before photography, if you wanted any likeness, for a book or a desk family picture, you needed a painter to paint it by hand. Then you needed a photographer. Then you needed an expensive camera and a photo shop. Now you just need a smartphone.
Photographers still exist, but there is no longer a market for "basic" photography. Similar with the horse industry: at one point it was the only way to move things overland. Now it still exists, mostly as a luxury, and supports a tiny industry.
I very much sympathise with the people affected by this. But also, I guess a lot of what Hollywood produces isn't high-class art. It's the equivalent of the horse plodding from A to B, or the painter drawing a boring suburban family. Reading a synopsis of the "fast and furious" film series, for example, an AI could probably happily do all of it, better, and with less brain damage to the poor humans involved in the creation (the first 3 films I enjoyed).
Either way, if AI chips away at jobs, we need to figure out how to take care of the people left behind.
I think we are entering new territory with owning people's appearances and using them as puppets on a string to do and say whatever you desire. Your clone can quite easily become more famous than the original you.
We might think we can be mature about it, but one will be one shot away from losing the other job as well. We can't have a bus driver who..... woah!
Some time relatively soon I expect the studios won't even need to scan extras but will instead be able to generate 3D models. This is probably why the studios don't really care about maintaining the ecosystem of hairdressers and makeup people - they don't see it lasting that long anyway.
Game engines with generated art seem like a plausible future for entertainment. AI writers generate concepts, do drafts, revisions, and maybe in a few years just do the whole thing.
This seems inevitable to me. Bad for actors, good for consumers as it will mean much more content. I also think this system will eat Hollywood in the end too. The technology will reduce the cost of developing high-quality films and will commoditize the film industry, deleting any moat Hollywood has.
I think the same way. The demos from Unreal Engine 5 are incredible. Combine it with LLMs and we really are not too far from being able to ask the computer to generate a story, give it some details, and have it rendered to look pretty damn lifelike. That's 5, maybe 10 years away at most.
I don't think the movie/tv industries will disappear, but they will be lessened when people can generate at least reasonable looking entertainment from home.
I think movies and tv will basically become like books already are. People might have read some best sellers, but there won't really be anything super widespread that you can count on most people having seen, like Seinfeld or The X-Files.
Easy to imagine in ten or twenty years people will be several "seasons" into shows that no one else has ever seen. Combine auto generated content with TikTok style customization and infinite scroll...
> good for consumers as it will mean much more content
It is my personal belief that consumers could benefit from reducing their media consumption, and many of us, myself included, could touch a little more grass
Yeah I came to say this. The striking actors maybe have another 5 years max before studios can just generate AI actors and won't need to scan humans anyway.
There's still room for face-capture mocap for actors to deliver credible human performances. I see the tech getting more accessible, so you don't need studio-level resources to produce content. More amateur talent can work together on sustainable niche productions to cater to specific tastes. An actual golden era of non-market-research content, but also a lot of shit fanfics.
Why is this the fault of AI researchers? Hollywood executives are trying to own everyone's likeness for free. That has nothing to do with AI; it's run of the mill greed.
They tried to do the same thing to Crispin Glover back in the 80s. There are many ways to reproduce someone's likeness, and diffusion models are just the latest. Hanging a body double upside down accomplishes the same thing.
It's the fault of AI researchers in the same way that deaths in conflict zones are the fault of weapons manufacturers. It doesn't matter who pulls the trigger, since whether the weapons exist or not is the primary factor determining whether they can be used by malicious actors.
AI can do good and bad, just like fire or nuclear fission. A world with AI is probably a better world, but it's a technology that comes with many caveats. Blaming the researchers for misuse is unfair.
It's a yes-and situation of course. Pulling the trigger comes with its own blame, but let's not pretend that no one working on such technologies could see this kind of (mis)use coming.
True, maybe some of these researchers were encouraged by the misuse for profit. At the same time, the breakthroughs didn't depend on a single genius insight by a couple of researchers, a Manhattan-type project, or a tablet from aliens.
The results can be considered inevitable in the sense that they reflect very capable people incrementally advancing the status quo as part of their research jobs. We need to build the necessary frameworks for dealing with this, and other advancements, because they will come irrespective of our wishes.
I think it's interesting that we don't have a well-known philosophical framework for discussing "could vs. should" problems in technology development. The argument comes up time and again; the response to my balrog comment was exactly what I was expecting - gun makers vs generals. I'm starting to think it's one of humanity's major blind spots.
It's a branch of ethics, surely worthy of a bit of square footage in an ivory tower.
Humanity, broadly, discusses ethics quite a bit and I think the philosophers among us have set us up with the tools to have good conversations about it.
But tech-adjacent people seem bored by and dismissive of questions of ethics as a general rule. If you were in STEM in college, you may recall the groans and avoidance of the (likely mandatory) ethics courses among your cohort.
Bioethics is the most prominent example of such a study because of the possibility of immediate and direct harm to organisms. I agree it would be worthwhile to formalize an ethics framework for tech in general; we don't seem to have a proper title for such a field of study, so you're right that we are blind to it as a society.
Your analogy makes zero sense. You can’t go to war without weapons, but there are a million ways to copy someone’s likeness that don’t involve AI. To name a few: Masks, makeup, 3D scans and head tracking, cartoons (live action character depicted in cartoon form). Owning the rights to your likeness has nothing to do with AI and never has.
The premise that nuclear energy enables nuclear weapons is ridiculous. Nuclear energy does not require the level of refinement that nuclear weapons do. Creators of fuel for nuclear energy are not creating materials suitable for use in nuclear weapons.
I suppose, if we ever get to the point where humans don't have to work, this is what the start of it looks like. Also the start of terrible dystopias...
But first we'll see AI chip away at jobs. More and more revenue will flow to AI and its owners and less into the hands of everyone else. Eventually AI would be the only game in town, plus an army of unemployed people. The only thing to do then would be to tax or socialise the profits of the AI and feed the masses with the proceeds.
Or see enormous inequality and a repeat of the French revolution. How apt to write this on Bastille day!
In fact, I think there is little difference between the AI hegemony and the landowner one of the 18th century. Back then you had a tiny sliver of the population owning all the land, while most everyone else was doing menial, underpaid jobs.
Back then, though, there was less surveillance, less of a feeling of alienation among your countrymen. Nowadays, you'd be hard pressed to find anyone willing to join some revolution.
Yeah, I don't understand either, hardware advances have made it such that, unlike when I was a child washing each and every piece of clothing by hand, I can now just chuck em into machines and it comes out perfectly each time.
Artists are on alert now, but after spending a few weeks rewriting a project using a new-to-me framework and language with our current "Model T" generative AI as my pair programmer, I'm reasonably confident that software and hardware engineers are going to be in the middle of their own existential crisis soon.
Perhaps the problem is people needing to be employed to justify their existence or as a proxy measure to their ongoing contribution to society and its degree?
I’m a firm believer in a post-toil-to-survive reality. The end of “employment is godliness” would be a great thing. I’d rather see artists create art than work in artist hamster wheels for the health insurance.
“I am, somehow, less interested in the weight and convolutions of Einstein’s brain than in the near certainty that people of equal talent have lived and died in cotton fields and sweatshops.”
― Stephen Jay Gould, The Panda's Thumb: More Reflections in Natural History
I generally agree, but the specific premise here is vile. Some megacorp gets rights to your digital puppet that they can exploit to their pleasure, while the actors get almost nothing? Yikes.
Even if we didn't need to justify our existence, a lot of actors would probably still like to & want to act.
Oh, I agree. The concept of being able to be coerced into signing away your likeness in perpetuity is absolutely vile. In a real sense it's the only thing you own that can't be separated from you. These contracts should be illegal, just in the way signing a contract to indenture yourself is illegal.
I don't understand the likeness thing. We can reliably generate faces of all kinds. Why pay anyone for it when it can just be generated? Are they worried the generated one will look like a real person and open them to liability?
Well, they're not buying the 'likeness' in most cases as much as they're buying the actor's 'star power' - i.e. their ability to trade on their own name and performance independently of any one media product. That is, simply having their name attached to something makes that thing more likely to do well financially. This is what they're buying - and why casting "unknowns" in a "big budget" production is rather unusual. It's more risky than simply paying, say, Chris Pratt or Scarlett Johansson to perform. As you might imagine, investors don't like risk except the kind that guarantees them returns.
It could be quite useful/profitable for a Hollywood studio to acquire the rights-in-perpetuity to a young actor's image in 2023 for $200, if that same actor then goes on to become a superstar.
Harbinger of things to come. AI will come for us all eventually.
It’s unnerving that everyone is so enthusiastic about advancing AI but puts zero effort into figuring out how society is going to function with significant unemployment.
> It’s unnerving that everyone is so enthusiastic about advancing AI but puts zero effort into figuring out how society is going to function with significant unemployment.
We know exactly what happens next. It was laid out in Soylent Green and The Day After Tomorrow.
We are almost at the stage (are we already?) where a fully synthetic actor can be created with no known actor's face at all, who will perform a fully synthetic script, with some director fiddling to make it move, fit, sync lips, etc.
Will it be watchable? Barely, but it will get better fast.
This is the analog of a carpenter with a hand drill versus a high-speed punch press - what did the furniture makers do? They grew old, and the customers went to Ikea.
Skip to the end game - what will actors/writers do? Will the stage return? Or will correctly shaped MIT robot dogs be used? https://www.youtube.com/watch?v=NgoYGL9m_xE
I am not sure how this will unfold or how quickly it will happen or how we will adapt, but it will proceed apace.
It is an extension of the machine evolution of the last 100 years, from Henry Ford to all manner of personal function, actor, dentist, surgeon, you name it.
I am not sure how soon dentists will be replaced, it will be easy to automate certain fixed stages of function - much like robotic prostatectomy is now being done. https://www.wikidoc.org/index.php/Prostatectomy
On the other hand, making a movie is terrifyingly expensive. Substantially reducing the cost will mean more movies get made, which means more demand for front line actors.
After all, what happened when the printing press replaced the monks copying books by hand, letter by letter, in ink?
A lot of the cost is top-billed actors' salaries. They could reduce that greatly by simply not using A-list actors (unless they agree to a huge paycut), but they don't want to do that because A-list actors bring more viewers. Meanwhile modern movies usually have lousy scripts. This problem is one that the movie studios have created all by themselves.
Perhaps, but you'll probably wind up with a shitty movie. Books and movies are fundamentally different media, and a story that works well in a book usually, IMO, just won't translate well to a 2-4 hour movie without making major changes to the story and plot.
However, perhaps instead of trying to make movies from books, they should make miniseries (3-10 hour-long episodes).
You're right: the current quality of film/TV scripts is terrible, so this is definitely one place where LLMs could do far, far better than most humans in the profession.
I'm not referring to the technical aspects of filming, I'm referring to the huge differences between them as a storytelling medium. A movie only has 1.5-3 hours of screen time available to tell the complete story (unless you do something like the recent "Dune" movies, or make a trilogy like LotR), but in addition to that, a lot of things that work for books don't work the same way for movies: pacing, etc.
Nah, just generate the A listers, or make better actors that are better looking and have better voices. Generate the directors, generate the whole movie. Put me in the movie, make me better looking. Toss the studios.
More movies means more choice in the market, to the point of confusion among consumers. Further increasing film output may be detrimental to the market; moviegoers may be alienated by unfamiliar choices, and films may find it more difficult to recoup costs.
The reason franchises, especially superheroes, have been thriving in the last two decades is the excessive choice of movies. The general public struggles to pick which one to watch, so they pick franchises with expected plots. This market trend is partly why Disney's Bob Iger bet on franchises, acquiring Marvel and Lucasfilm.
The paradox of choice is best illustrated in Sheena Iyengar's "The Art of Choosing." An experiment showed that when supermarket shoppers were presented with more choices for a category (e.g. jam), they stuck to the choices they were already familiar with.
> The reason franchises, especially superheroes, have been thriving in the last two decades is the excessive choice of movies. The general public struggles to pick which one to watch, so they pick franchises with expected plots. This market trend is partly why Disney's Bob Iger bet on franchises, acquiring Marvel and Lucasfilm.
I don’t think this is true. Consolidation of film properties was largely caused by the death of the tape/dvd market, which represented a significant revenue stream for movies, so the studios focused on properties that would have huge box office numbers. Matt Damon talked about this in an interview recently, saying that so many of his movies, like Good Will Hunting, would not be made today because they tended to have long tail VHS sales.
I think there _is_ a scarcity of quality movies that don't follow predictable / mainstream cliches, because it's too risky to sink a lot of money into something niche. AI will change that. The number one thing it is going to do is reduce the time, effort, and money involved in creating quality films. The result will be more trash for sure. But also more high quality stuff, and a LOT more stuff tailored to narrower interests. And it's probably going to be very cheap, and perhaps free for much of it.
We'll still see mainstream franchises though, because IMHO it serves as a form of social glue, i.e. gives people common things to talk about and experiences to share.
What makes you think that a machine specifically designed to generate outputs based on previously observed patterns will produce "quality movies that don't follow predictable / mainstream cliches"?
Three reasons. First, because it doesn't have to output the most common patterns: it can output the less common niche patterns, apply known patterns to more niche topics, or generate unique / uncommon combinations of patterns. Second, the winning pattern combinations are selected by humans (i.e. people viewing and sharing); each niche can select for the ones that are most interesting. Third, we won't be eliminating humans entirely: rather than big mainstream production studios, this tech will enable tiny studios, or even one-person "shops", with the tooling power of a production studio.
> I think there _is_ a scarcity of quality movies that don't follow predictable / mainstream cliches
Probably, but quality movies aren't the motivation behind cost cutting at Hollywood studios. If indie movie producers want to use new technologies, they should go for it, but they won't have access to background actors who are tied up behind large studio contracts, unless those actors just open source themselves – which seems slightly dystopian.
The studio executives are trying to extinguish quality movies because they only want predictable mainstream cliches that can be churned out inexpensively in enormous volume from AI on a regular schedule.
I'm all for indie producers being able to self-publish their own CG masterpieces (that's already a thing that happens on YouTube), but they'll probably not use the puppeted likenesses of real people that have been grifted from them by egregious contracts from Hollywood studios (as per TFA).
I've read a fair bit of fantasy fiction (think Lord of the Rings derivatives), and after roughly 25-30 books, it just becomes a slog to read through, there are very few original ideas.
Martin Luther also threatened the Catholic Church by translating the Bible from Latin into German. That was the first time people other than priests had wide access to the text directly.
A less dystopian avenue would be for the background actor to license his avatar per use. That is, the actor retains choice, licensing his likeness for one movie only, and only for the movies the actor wants to be in.
I can't imagine a worse situation than when a studio could use likenesses in movies that the actors would be horrified to appear in.
Commercialization of someone else’s likeness sounds like the type of thing that should be heavily regulated, and skewed in favor of the person whose likeness is being exploited.
While I agree, there's another part of me that thinks that's just going to siphon money to the already established and rich celebrities. It's all the non-celebrity people that will be left out, and at the end of the day, that's what matters more. As a result, although I agree w/ you, I actually don't know how concerned I am about it.
Back in the 60s, in my youth, I expected creative works to be immune to AI. Mundane, repetitive things like ads, copywriting, announcing, API programming, call centres, and spam would be replaced by AI first. Heck, doctors would be among the first few to be affected, as no human can store huge chunks of medical text or pattern-match better than machines (just like what happened to postal letter sorting). Instead, creative works now look likely to be sacrificed first, against all my earlier expectations.
The studio talking about AI for actors at this exact moment might have an element of truth, but more likely it's a negotiation tactic to devalue those striking actors.
In reality, AI actors still look like game cutscenes and human acting will continue to be the premium service.
> In reality, AI actors still look like game cutscenes and human acting will continue to be the premium service.
You may not have read far enough into TFA, which specifically notes: The reported proposal hinged on the ability for background actors to be “scanned, get paid for one day’s pay” and for that company to “own that scan of their image, their likeness, and to be able to use it for the rest of eternity in any project they want with no consent and no compensation.”
To continue with the video game analogy, studios are proposing to own your likeness, expressions, mannerisms, gait, and whatever else they can capture/model, for use as background NPCs in whatever projects they like, forever, for free.
These fights are often about what will come in the future. Remember, the initial fight over streaming residuals for writers went on during the last WGA strike in '07. The tech might be there now, but how about in 15 years?
The reported proposal hinged on the ability for background actors to be “scanned, get paid for one day’s pay” and for that company to “own that scan of their image, their likeness, and to be able to use it for the rest of eternity in any project they want with no consent and no compensation.”
…
“The endgame is to allow things to drag on until union members start losing their apartments and losing their houses,” Deadline reported one studio executive saying.
i worked as an extra for almost 10 years. averaged less than $20k/yr, but at least it was enough to scrape by while i got an education. ironically i now make $200k/yr working in ai for a pharma co. the studios want to pay a person only once to use their image forever. late stage capitalism at its finest. i’m sipping the last of anchor steam. what’s capitalism’s endgame? elon or jeff get all the money and head to mars while everyone else starves in streets lined with benioff’s empty houses?
Well don’t be silly, we’ll be post-employment and we’ll just spend life recreating and eating grapes all day.
In all seriousness, your opinion doesn’t seem to be popular here, but I was hoping to see some debate. I personally think you’re closer to the truth than a lot of the rose tinted opinions on how this’ll bring us closer to god.
The endgame - as advocated for by several people on this very page - is that AI takes all the creative and interesting jobs while us displaced humans exist on UBI funded by taxes on excess AI-sourced profits.
Which is fucking bullshit. What the hell are we doing trying to build a future where we transfer all the benefits of a sense of purpose and meaningful contribution to non-sentient AIs? It's insanity.
We wanted flying cars; we're getting Tom Cruise forever.
I'm kidding though - it'll be cool to have entirely personalized movies played by AI lookalikes. It'll probably make the human actor more money too, if they own a corporation that licenses their likeness.
Given there is 27 years between Mission Impossible 1 and 7, I think MI78 is going to be released around 2290, so at least VR Tom Cruise should be pretty good by then
This is the type of thing that requires government action. Without it this story will repeat many times, it’s Hollywood now, tomorrow it’ll be music artists, video game sound artists etc.
Better to pass a law making it illegal, or opt-in at worst.
> This is the type of thing that requires government action. Without it this story will repeat many times, it’s Hollywood now, tomorrow it’ll be music artists, video game sound artists etc.
I don't think this is a thing that government can solve. The technology is worldwide now; passing a law doesn't stop it. If you can hire an amazing acrobat from anywhere in the world and morph any actor from anywhere in the world as input, nothing stops you. Same with music.
I do think that there's some kind of "celebrity brain" in people that does want to be a part of a fan base, whether it's a musician or an actor there's a part of us that likes watching other people, and celebrities are very good at being those "other people" professionally. That's never going away.
Acting is a real skill! It's not standing around being yourself and getting paid for nothing. Maybe being a celebrity is not acting unless you are a celebrity hired to be an actor or an actor hired to be a celebrity being an actor being a celebrity or...
So maybe if there is some upside to all this, it's that actors may come out ahead of celebrities. The manufacturing of a top 10 list of celebrities in a cut-throat competition is probably more efficient for a business than funding 1000 community theaters to generate a bunch of actors, because 10 people are easier to manage than 10000. I'm not optimistic about this strike ending in favor of the celebrities; they're expensive and annoying and possibly unnecessary? That's not a good bargaining position.
But as long as we are all human, art will never die.
This is the type of thing I don’t trust the government to act on. The people that would write the legislation have been bought and sold by the same people trying to make movies from cloned AI actors.
Regulatory capture by the likes of OpenAI et al would be a terrible thing. Imagine, open source AI devs suddenly needing an expensive governmental license to continue developing, it would crush open source.
Personally, I find the prospect of doing 1 day of work, and potentially having an impact on hundreds of various artistic works thrilling. It also seems a little bit of a missed argument to say that these workers are often forced into 18 hour days, as it bolsters the studio's potential arguments.
Maybe it's because of radical self-acceptance, (or low standards according to some social lens), but honestly I can't imagine a single role in any movie I would be offended to have. I consider it an honor to be attached to any piece of art, as it only increases the breadth of my identity's perseverance. If it's a piece of shit film, it is either never seen or hilarious in my mind.
Not to mention that these are background actors, hardly any uses will be problematic anyway.
As for losing the "stepping stone" into the industry, I think that is just a hyperfocused perspective on a broader societal problem transitioning to the postcapitalist AI singularity.
> Maybe it's because of radical self-acceptance, (or low standards according to some social lens), but honestly I can't imagine a single role in any movie I would be offended to have.
This is always relative. There are millions of people who would do anything to live in America and have a job cleaning bathrooms, just as there are many young people that want to write software professionally and would take any software job. But if you’re an experienced dev, you wouldn’t just take any old job, you’d look for something that fits your career goals.
If a studio starts using a scanned image of a background actor in multiple movies, it could become popular like the "Wilhelm Scream" enabling that actor to get frontline jobs.
> Personally, I find the prospect of doing 1 day of work, and potentially having an impact on hundreds of various artistic works thrilling.
At issue is compensation. The studios don't want to share the income they gain from your "impact". They want all the benefits of this new technology to accrue to themselves. Reminds me of self-driving cars coming with EULAs that forbid commercial/taxi use of self-driving.
I also have no problem owning a self driving Tesla with the provision that fully autonomous commercial use shares revenue.
Just like studios paying gfx artists to incorporate my visage, the Tesla engineers are doing all the work. I literally do nothing in both scenarios. Why would I be entitled to compensation?
I imagine that the reason car companies don’t want taxi or Uber drivers to use self-driving mode is liability, not that they want a share of taxi companies' profits. Taxi/Uber drivers still have to pay for fuel/maintenance costs, and currently most self-driving cars still require a driver in the seat (except for companies like Waymo and Cruise, which operate their own fleets).
So even if taxi companies were allowed to use self-driving mode in a car like a Tesla, they’d still need to pay a person to be in the drivers seat.
No, that analogy is flawed. Programming now is more like the studio recording a scene then being able to edit it however they want.
Imagine that you sat down and wrote some code to guide a wheelchair around bumps for one day, then your employer said “thanks, we’re done with you” and re-used your code to guide an autonomous killer drone. You might argue that’s technically possible now, but most developers would go out of their way to avoid it.
Fair point - my hasty response was over the top. I was thinking of someone whose likeness had been scanned being 'made to appear' as something like an SS soldier in a WW2 movie.
Thinking about it a bit more, it's not just that your code can be re-used in ways you haven't anticipated, but that you'll never get to code again - in that one day, all the code you could potentially write had been expressed, and there was no need to ever hire you personally again to code anything.
I wouldn't need any compensation to have my image used artistically. It would cost me literally nothing other than a day of novelty experience doing some poses and emotes for a camera array.
I imagine there are enough "scabs" like me that this grueling 18-hour occupation doesn't need to exist.
The leader of the actor's guild in 1960 who signed off on that strike? Ronald Reagan.